**Collision-Based Testers are Optimal for Uniformity and Closeness** by Ilias Diakonikolas, Themis Gouleakis, John Peebles, and Eric Price

May 6, 2019

We study the fundamental problems of (i) uniformity testing of a discrete distribution, and (ii) closeness testing between two discrete distributions with bounded $\ell_2$-norm. These problems have been extensively studied in distribution testing and sample-optimal estimators are known for them [17, 7, 19, 11]. In this work, we show that the original collision-based testers proposed for these problems [14, 3] are sample-optimal, up to constant factors. Previous analyses showed sample complexity upper bounds for these testers that are optimal as a function of the domain size $n$, but suboptimal by polynomial factors in the error parameter $\epsilon$. Our main contribution is a new tight analysis establishing that these collision-based testers are information-theoretically optimal, up to constant factors, both in the dependence on $n$ and in the dependence on $\epsilon$.
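The collision statistic at the heart of these testers is simple to state: the fraction of colliding sample pairs is an unbiased estimate of $\|p\|_2^2$, which equals $1/n$ exactly when $p$ is uniform. Below is a minimal sketch of such a tester in Python; the acceptance threshold $(1+\epsilon^2/2)/n$ is an illustrative constant, not the paper's tuned choice, and no claim is made about matching its optimal sample complexity.

```python
from collections import Counter

def collision_statistic(samples):
    """Fraction of unordered sample pairs that collide.

    This is an unbiased estimator of ||p||_2^2, the squared l2-norm
    of the underlying distribution.
    """
    m = len(samples)
    counts = Counter(samples)
    collisions = sum(c * (c - 1) // 2 for c in counts.values())
    return collisions / (m * (m - 1) / 2)

def collision_uniformity_test(samples, n, eps):
    """Accept 'uniform' iff the collision rate is close to 1/n.

    The uniform distribution over [n] has ||p||_2^2 = 1/n; distributions
    far from uniform in l2 have noticeably larger collision rates.
    The threshold constant below is an illustrative choice.
    """
    return collision_statistic(samples) <= (1 + eps ** 2 / 2) / n
```

On a balanced sample over a domain of size 4 the statistic sits near $1/4$ and the test accepts; on samples from a point mass the statistic is 1 and the test rejects.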

**Non-commutative computations: lower bounds and polynomial identity testing** by Guillaume Lagarde, Guillaume Malod, and Sylvain Perifel

September 10, 2019

In the setting of non-commutative arithmetic computations, we define a class of circuits that generalize algebraic branching programs (ABPs). This model is called *unambiguous* because it captures the polynomials in which all monomials are computed in a similar way (that is, all the *parse trees* are isomorphic). We show that unambiguous circuits of polynomial size can compute polynomials that require ABPs of exponential size, and that they are incomparable with skew circuits.

Generalizing a result of Nisan on ABPs, we provide an exact characterization of the complexity of any polynomial in our model, and use it to prove exponential lower bounds for explicit polynomials such as the determinant.

Finally, we give a white-box deterministic polynomial-time algorithm for polynomial identity testing (PIT) on unambiguous circuits over $\mathbb{R}$ and $\mathbb{C}$.

**H-wise Independence** by Ishay Haviv and Michael Langberg

November 19, 2019

For a hypergraph $H$ on the vertex set $\{1,\ldots,n\}$, a distribution $D = (D_1,\ldots,D_n)$ over $\{0,1\}^n$ is *$H$-wise independent* if every restriction of $D$ to indices which form an edge in $H$ is uniform. This generalizes the notion of $k$-wise independence, obtained by taking $H$ to be the complete $n$-vertex $k$-uniform hypergraph. This generalization was studied by Schulman (STOC 1992), who presented constructions of $H$-wise independent distributions that are *linear*, i.e., the samples are strings of inner products (over $\mathbb{F}_2$) of a fixed set of vectors with a uniformly chosen random vector. Let $\ell(H)$ denote the minimum possible size of a sample space of a uniform $H$-wise independent distribution. The $\ell$ parameter is well understood for the special case of $k$-wise independence. In this work we study the notion of $H$-wise independence and the $\ell$ parameter for general graphs and hypergraphs. For graphs, we show how the $\ell$ parameter relates to standard graph parameters (e.g., clique number, chromatic number, Lovász theta function, minrank). We derive algorithmic and hardness results for this parameter as well as an explicit construction of graphs $G$ for which $\ell(G)$ is exponentially smaller than the size of the sample space of any linear $G$-wise independent distribution. For hypergraphs, we study the problem of testing whether a given distribution is $H$-wise independent, generalizing results of Alon et al. (STOC 2007).
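The defining condition is easy to check directly when the distribution is given explicitly: for each edge of $H$, marginalize onto those coordinates and compare with the uniform distribution. The sketch below assumes the distribution is a dictionary from $n$-bit tuples to probabilities; it is an exact brute-force checker, not the sublinear testing algorithm the abstract refers to.

```python
from itertools import product

def is_hwise_independent(dist, edges, tol=1e-9):
    """Check H-wise independence of an explicitly given distribution.

    dist:  dict mapping n-bit tuples (0/1 entries) to probabilities.
    edges: iterable of index tuples, one per hyperedge of H.

    Returns True iff the marginal of dist on every edge is uniform
    over {0,1}^|edge|, up to the numerical tolerance tol.
    """
    for edge in edges:
        k = len(edge)
        # Marginalize dist onto the coordinates in this edge.
        marginal = {}
        for x, p in dist.items():
            key = tuple(x[i] for i in edge)
            marginal[key] = marginal.get(key, 0.0) + p
        # Uniformity means every pattern gets mass exactly 2^{-k}.
        for pattern in product((0, 1), repeat=k):
            if abs(marginal.get(pattern, 0.0) - 2.0 ** -k) > tol:
                return False
    return True
```

For example, the classical XOR construction on 3 bits (uniform over the four strings whose coordinates XOR to 0) is pairwise independent, so it passes when the edges are all pairs, but fails on the single edge containing all three coordinates.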
