Holey graphs: very large Betti numbers are testable
ABSTRACT. We show that the graph property of having a (very) large $k$-th Betti number $\beta_k$ (over $\mathbb{Z}_2$) for constant $k$ is testable with a constant number of queries in the dense graph model.
More specifically, we consider a clique complex defined by an underlying graph and prove that for any $\varepsilon >0$, there exists $\delta(\varepsilon,k)>0$ such that testing whether $\beta_k \geq (1-\delta) d_k$ for $\delta \leq \delta(\varepsilon,k)$ reduces to tolerantly testing $(k+2)$-clique-freeness, which is known to be testable.
This complements a result by Elek (2010) showing that Betti numbers are testable in the bounded-degree model.
For our result we consider simplicial complexes as combinatorial objects, and combine matroid theory and the graph removal lemma.
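As a concrete illustration of the objects in this abstract (not the testing algorithm itself), the Betti numbers over $\mathbb{Z}_2$ of a small clique complex can be computed directly from boundary-matrix ranks via $\beta_k = \dim\ker\partial_k - \operatorname{rank}\partial_{k+1}$. The sketch below uses hypothetical helper names and brute-force clique enumeration, so it is only suitable for tiny graphs:

```python
from itertools import combinations

def gf2_rank(rows):
    """Rank over GF(2) of a matrix whose rows are int bitmasks."""
    pivots = {}  # leading-bit position -> reduced row
    rank = 0
    for row in rows:
        while row:
            lead = row.bit_length() - 1
            if lead in pivots:
                row ^= pivots[lead]
            else:
                pivots[lead] = row
                rank += 1
                break
    return rank

def betti(vertices, edges, k):
    """k-th Betti number over Z_2 of the clique complex of a graph."""
    adj = {v: set() for v in vertices}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)

    def cliques(size):  # all cliques on `size` vertices, as sorted tuples
        return [c for c in combinations(sorted(vertices), size)
                if all(b in adj[a] for a, b in combinations(c, 2))]

    def boundary_rank(simplices, face_index):
        # each row of the boundary matrix is the Z_2-sum of a simplex's faces
        return gf2_rank([sum(1 << face_index[f]
                             for f in combinations(s, len(s) - 1))
                         for s in simplices])

    k_simplices = cliques(k + 1)  # k-simplices = (k+1)-cliques
    idx_k = {s: i for i, s in enumerate(k_simplices)}
    rank_next = boundary_rank(cliques(k + 2), idx_k)  # rank of d_{k+1}
    rank_k = 0
    if k > 0:
        idx_prev = {s: i for i, s in enumerate(cliques(k))}
        rank_k = boundary_rank(k_simplices, idx_prev)  # rank of d_k
    # beta_k = dim ker d_k - rank d_{k+1}
    return len(k_simplices) - rank_k - rank_next
```

For example, the clique complex of a $4$-cycle has one $1$-dimensional hole ($\beta_1 = 1$), while filling in a triangle kills it: a triangle graph has $\beta_1 = 0$.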
Tolerant Testing and Distance Estimation for Distributions Under Memory Constraints
ABSTRACT. We investigate tolerant testing and distance estimation problems in distribution testing when the amount of memory is limited. In particular, our aim is to provide a good estimate of the distance measure (in total variation distance) using a constant number of memory words. Though the problem is well understood in the standard access model, with an $\Omega(\frac{n}{\log{n}})$ lower bound, it has not been explored under memory constraints. We introduce an algorithm for estimating the distance to uniformity using a constant number of memory words, and extend the approach to estimating the distance to a known distribution. The sample complexity of our algorithm for estimating the distance to uniformity is almost tight when the available memory is constant. Additionally, we present smooth sample-space trade-offs for the problem of tolerant closeness testing across various distance measures.
We propose an algorithm that requires $O(\frac{n}{\epsilon^{6}})$ queries for estimating the distance to uniformity, and use the approach to develop an algorithm that needs $O(\frac{n}{\epsilon^{7}})$ queries for estimating the distance to a known distribution. Both algorithms operate with $O(1)$ memory words. They are consistent with the $S\cdot m=\Omega(\frac{n\log{n}}{\epsilon^4})$ lower bound established by \cite{DGKR19} for $m=O(1)$ words or $m=O(\log{n})$ bits, where $S$ denotes the sample complexity. Additionally, we present an $(\epsilon,c\epsilon)$ tolerant closeness tester in terms of $\ell_2$ distance for all $c\geq 3$, which requires $O(\frac{n^3}{m\epsilon^2})$ samples and $O(m)$ bits of memory for $\log{n}/\epsilon^4 \leq m\leq n/\epsilon^2$. Furthermore, we outline an $O(\frac{n^2\epsilon_1^2}{m\epsilon_2^4})$ trade-off for the $(\epsilon_1,\epsilon_2)$ tolerant closeness testing problem under $\ell_1$ distance, where $\log{n} \leq m\leq n\log{n}$ is the available memory in bits.
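As background for the distance-estimation problems above, the textbook collision-based estimator for the squared $\ell_2$ distance to uniformity (not the memory-constrained algorithm of this paper; it stores all samples and so uses linear memory, and the function name is hypothetical) is:

```python
from itertools import combinations

def l2_dist_to_uniform_sq(samples, n):
    """Unbiased estimate of ||p - U_n||_2^2 from samples of p over [n].
    Uses the identity ||p - U_n||_2^2 = ||p||_2^2 - 1/n, estimating the
    collision probability ||p||_2^2 by the fraction of colliding pairs."""
    m = len(samples)
    colliding = sum(1 for a, b in combinations(samples, 2) if a == b)
    return colliding / (m * (m - 1) / 2) - 1.0 / n
```

For a point-mass distribution every pair collides, giving an estimate of $1 - 1/n$; for samples with no repeats the estimate is $-1/n$ (it is unbiased, so individual estimates can be negative).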
Exact characterizations of non-commutative algebraic complexity without homogeneity
ABSTRACT. Algebraic complexity theory studies the number of arithmetic operations necessary to compute different polynomials. As in the case of Boolean complexity, there are many open questions: indeed, there are several natural polynomials for which we believe, but are unable to prove, that no efficient computation is possible. When contrasted with the difficulty of obtaining lower bounds in general for the main computation model of arithmetic circuits, Nisan's 1991 result is striking: it gives an exact characterization of the complexity of computing any given homogeneous polynomial with a homogeneous algebraic branching program, in the non-commutative setting. As noticed in 2020 by Fijalkow et al., Nisan's theorem can be derived from results by Fliess (1974) in the field of non-commutative formal series. We build on this point of view to extend Nisan's theorem to general (not necessarily homogeneous) polynomials computed by general algebraic branching programs and again obtain an exact characterization. Fijalkow et al. showed how to apply properties of formal tree series to get a similar statement for homogeneous non-associative arithmetic circuits and we also extend this characterization to general non-associative circuits.
ABSTRACT. The A-hierarchy is a parametric analogue of the polynomial hierarchy in the context of parameterised complexity theory.
We give a new characterisation of the A-hierarchy in terms of a generalisation of the SUBSET-SUM problem.
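For reference, the classical SUBSET-SUM problem underlying the generalisation can be decided by the textbook pseudo-polynomial dynamic program over reachable sums (a minimal sketch, unrelated to the parameterised machinery of the paper):

```python
def subset_sum(values, target):
    """Decide whether some subset of `values` sums to `target`,
    by maintaining the set of sums reachable from a prefix of the input."""
    reachable = {0}  # the empty subset sums to 0
    for v in values:
        reachable |= {s + v for s in reachable}
    return target in reachable
```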
Reachability in temporal graphs under perturbation
ABSTRACT. Reachability and other path-based measures on temporal graphs can be used to understand the spread of infection, information, and people in modelled systems. Due to delays and errors in reporting, temporal graphs derived from data are unlikely to perfectly reflect reality, especially with respect to the precise times at which edges appear. To reflect this uncertainty, we consider a model in which some number $\zeta$ of edge appearances may have their labels perturbed by $\pm\delta$ for some~$\delta$. Within this model, we investigate temporal reachability and consider the problem of determining the maximum number of vertices any vertex can reach under these perturbations. We show this problem to be intractable in general but efficiently solvable when $\zeta$ is sufficiently large. We also give algorithms which solve this problem in several restricted settings. We complement this with some contrasting results concerning the complexity of related temporal eccentricity problems under perturbation.
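To make the underlying notion concrete, temporal reachability in one common model (directed edge appearances, walks with strictly increasing time labels; this specific convention is an assumption of the sketch, and the function name is hypothetical) can be computed by a single pass over the appearances in time order:

```python
def temporal_reachable(edges, source):
    """Vertices reachable from `source` along walks whose time labels
    strictly increase; `edges` is a list of directed (u, v, t) triples.
    In this strict model one pass in time order suffices, since any edge
    usable at time t departs from a vertex reached at some earlier time."""
    arrival = {source: float('-inf')}  # earliest known arrival time
    for u, v, t in sorted(edges, key=lambda e: e[2]):
        if u in arrival and arrival[u] < t and t < arrival.get(v, float('inf')):
            arrival[v] = t
    return set(arrival)
```

The sensitivity to label perturbation is easy to see here: with appearances $(0,1,1)$, $(1,2,2)$, $(2,3,1)$, vertex $3$ is unreachable from $0$, but shifting the last label from $1$ to $3$ makes it reachable.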
ABSTRACT. Graph Exploration problems ask a searcher to explore an unknown environment. The environment is modeled as a graph, in which the searcher, beginning at some vertex, needs to visit every vertex. Treasure Hunt problems are a variation of Graph Exploration, in which the searcher needs to find a hidden treasure, which is located at a designated vertex.
Usually these problems are modeled as online problems, and any online algorithm performs poorly because it has too little knowledge about the instance to react adequately to the requests of the adversary. Thus, the impact of a priori knowledge is of interest. One form of a priori knowledge is an unlabeled map, which is an isomorphic copy of the graph. We analyze Graph Exploration and Treasure Hunt problems with an unlabeled map that is provided to the searcher. For this, we formulate decision variants of both problems by interpreting the online problems as a game between the online algorithm (the searcher) and the adversary. The map, however, is not controllable by the adversary. The question is whether the searcher is able to explore the graph completely or find the treasure for all possible decisions of the adversary.
We analyze these games in multiple settings, with and without costs on the edges, on directed and undirected graphs and with different constraints (allowing multiple visits to vertices or edges) on the solution. We prove PSPACE-completeness for most of these games. Additionally, we analyze the complexity of related problems that have additional constraints on the solution.
ABSTRACT. We present several advancements in search-type problems for fleets of mobile agents operating in two dimensions under the wireless model. Potential hidden target locations are equidistant from a central point, forming either a unit-radius disk (infinite possible locations) or regular polygons (finite possible locations) inscribed in a unit-radius disk. Building on and extending the foundational disk evacuation problem, the disk priority evacuation problem with $k$ Servants, and the disk $w$-weighted search problem, we make improvements on several fronts. First we establish new upper and lower bounds for the $n$-gon priority evacuation problem with $1$ Servant for $n \leq 13$, and for $n_k$-gons with $k=2, 3, 4$ Servants, where $n_2 \leq 11$, $n_3 \leq 9$, and $n_4 \leq 10$, offering tight or nearly tight bounds. The only previous results known were a tight upper bound for $k=1$ and $n=6$ and lower bounds for $k=1$ and $n \leq 9$. Second, our work improves the best lower bound known for the disk priority evacuation problem with $k=1$ Servant from $4.46798$ to $4.64666$ and for $k=2$ Servants from $3.6307$. Third, we improve the best lower bounds known for the disk $w$-weighted group search problem, significantly reducing the gap between the best upper and lower bounds for $w$ values where the gap was largest. These improvements are based on nearly tight upper and lower bounds for the $11$-gon and $12$-gon $w$-weighted evacuation problems, while the previous studies were limited only to lower bounds and only to $7$-gons.
ABSTRACT. Identifying and mitigating the spread of fake information is a challenging issue that has become dominant with the rise of social media. We consider a generalization of the Domination problem that can be used to detect a set of individuals who, once immunized, can prevent the spreading of fake narratives. The considered problem, named {\em Distance Vector Domination}, generalizes both distance and multiple domination at the individual (i.e., vertex) level. We study the parameterized complexity of the problem according to several standard and structural parameters. We prove the W[1]-hardness of the problem with respect to neighborhood diversity, even when all the distances are equal to $1$. Then we give fixed-parameter algorithms for some variants of the problem and parameter combinations.
ABSTRACT. We study the classical problem of computing geometric thickness, i.e., finding a straight-line drawing of an input graph and a partition of its edges into as few parts as possible so that each part is crossing-free. Since the problem is NP-hard, we investigate its tractability through the lens of parameterized complexity. As our first set of contributions, we provide two fixed-parameter algorithms which utilize well-studied parameters of the input graph, notably the vertex cover and feedback edge numbers. Since parameterizing by the thickness itself does not yield tractability and the use of other structural parameters remains open due to general challenges identified in previous works, as our second set of contributions we propose a different pathway to tractability for the problem: extension of partial solutions. In particular, we establish a full characterization of the problem's parameterized complexity in the extension setting depending on whether we parameterize by the number of missing vertices, edges, or both.
ABSTRACT. We study the visual complexity of animated transitions between point sets. Although there exist many metrics for point set similarity, these metrics are not adequate for our purpose, as they typically treat each point separately. Instead, we propose to look at translations of entire subsets/groups of points to measure the visual complexity of a transition between two point sets. Specifically, given two labeled point sets $A$ and $B$ in $\mathbb{R}^d$, the goal is to compute the cheapest transformation that maps all points in $A$ to their corresponding point in $B$, where the translation of a group of points counts as a single operation in terms of complexity. In this paper we identify several problem dimensions involving group translations that may be relevant to various applications, and study the algorithmic complexity of the resulting problems. Specifically, we consider different restrictions on the groups that can be translated, and different optimization functions. For most of the resulting problem variants we are able to provide polynomial time algorithms, or establish that they are NP-hard. For the remaining open problems we either provide an approximation algorithm or establish the NP-hardness of a restricted version of the problem. Furthermore, our problem classification can easily be extended with additional problem dimensions giving rise to new problem variants that can be studied in future work.
ABSTRACT. Computing a Euclidean minimum spanning tree of a set of points is a seminal problem in computational geometry and geometric graph theory. We combine it with another classical problem in graph drawing, namely computing a monotone geometric representation of a given graph. More formally, given a finite set~$S$ of points in the plane and a finite set~$\cal D$ of directions, a geometric spanning tree~$T$ with vertex set~$S$ is \emph{${\cal D}$-monotone} if, for every pair $\{u,v\}$ of vertices of~$T$, there exists a direction $d \in \cal D$ for which the unique path from~$u$ to~$v$ in~$T$ is monotone with respect to $d$. We provide a characterization of ${\cal D}$-monotone spanning trees. Based on it, we show that a ${\cal D}$-monotone spanning tree of minimum length can be computed in polynomial time if the number $k=|{\cal D}|$ of directions is fixed, both when (i)~the set ${\cal D}$ of directions is prescribed and when (ii)~the objective is to find a minimum-length ${\cal D}$-monotone spanning tree over all sets ${\cal D}$ of $k$ directions. For $k = 2$, we describe algorithms that are much faster than those for the general case. Furthermore, in contrast to the classical Euclidean minimum spanning tree, whose vertex degree is at most six, we show that for every even integer~$k$, there exists a point set~$S_k$ and a set $\cal D$ of $k$ directions such that any minimum-length $\cal D$-monotone spanning tree of $S_k$ has maximum vertex degree~$2k$.
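As background, the classical Euclidean minimum spanning tree (without the monotonicity constraint studied in this abstract) can be computed by Prim's algorithm on the complete geometric graph; a minimal $O(n^2)$ sketch with a hypothetical function name:

```python
import math

def euclidean_mst(points):
    """Prim's algorithm on the complete Euclidean graph; returns the
    tree edges as pairs of indices into `points`."""
    n = len(points)
    in_tree = [False] * n
    # best[v] = (cost, parent): cheapest known connection of v to the tree
    best = [(math.inf, -1)] * n
    if n:
        best[0] = (0.0, -1)
    tree = []
    for _ in range(n):
        u = min((i for i in range(n) if not in_tree[i]),
                key=lambda i: best[i][0])
        in_tree[u] = True
        if best[u][1] != -1:
            tree.append((best[u][1], u))
        for v in range(n):
            d = math.dist(points[u], points[v])
            if not in_tree[v] and d < best[v][0]:
                best[v] = (d, u)
    return tree
```

On three collinear points $(0,0)$, $(1,0)$, $(2,0)$ this returns the two unit-length edges, consistent with the degree-at-most-six property of Euclidean MSTs mentioned above.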
ABSTRACT. We consider hypergraph visualizations that represent vertices as points in the plane and hyperedges as curves passing through the points of their incident vertices. Specifically, we consider several different variants of this problem by (a) restricting the curves to be lines or line segments, (b) allowing two curves to cross if they do not share an element, or not; and (c) allowing two curves to overlap or not. We show $\exists\mathbb{R}$-hardness for six of the eight resulting decision problem variants and describe polynomial-time algorithms in some restricted settings.