Chicago Journal of Theoretical Computer Science

Volume 2015

Article 2

Published by the Department of Computer Science, The University of Chicago.


Entropy of weight distributions of small-bias spaces and pseudobinomiality

Louay Bazzi
American University of Beirut
Beirut, Lebanon
louay.bazzi AT aub DOT edu

July 10, 2015

Abstract

A classical bound in information theory asserts that small $L_1$-distance between probability distributions implies small difference in Shannon entropy, but the converse need not be true. We show that if a probability distribution on $\{0,1\}^n$ has small bias, then the converse holds for its weight distribution in the proximity of the binomial distribution. Namely, we argue that if a probability distribution $\mu$ on $\{0,1\}^n$ is $\delta$-biased, then $\| \overline{\mu} - bin_n \|_1^2 \leq (2\ln{2})(n \delta + H(bin_n) - H(\overline{\mu}))$, where $\overline{\mu}$ is the weight distribution of $\mu$ and $bin_n$ is the binomial distribution on $\{0,\ldots, n\}$. The key result behind this bound is a lemma which asserts the non-positivity of all the Fourier coefficients of the log-binomial function $L:\{0,1\}^n \rightarrow {\mathbf R}$ given by $L(x) = \lg bin_n(|x|)$. The original question which motivated the work reported in this paper is the problem of explicitly constructing a small subset of $\{0,1\}^n$ which is $\epsilon$-pseudobinomial in the sense that the weight distribution of each of its restrictions and translations is $\epsilon$-close to the binomial distribution. We study the notion of pseudobinomiality and conclude that, for spaces with $n^{-\Theta(1)}$-small bias, the pseudobinomiality error in the $L_1$-sense is equivalent to that in the entropy-difference-sense, in the $n^{-\Theta(1)}$-error regime. We also study the notion of average-case pseudobinomiality, and show that for spaces with $n^{-\Theta(1)}$-small bias, the average entropy of the weight distribution of a random translation of the space is $n^{-\Theta(1)}$-close to the entropy of the binomial distribution. We discuss resulting questions on the pseudobinomiality of sums of independent small-bias spaces. Using the above results, we show that the following conjectures are equivalent: (1) For all independent $\delta$-biased random vectors $X,Y\in \{0,1\}^n$, the ${\mathbf F}_2$-sum $X+Y$ is $O((n\delta)^{\Theta(1)})$-pseudobinomial; (2) For all independent $\delta$-biased random vectors $X,Y\in \{0,1\}^n$, the entropy of the weight of the sum satisfies $H(|X+Y|)\geq \min\{H(|X|),H(|Y|)\} - O((n\delta)^{\Theta(1)})$.
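To make the main bound concrete, the following Python sketch (an illustration written for this page, not code from the paper) computes the bias $\delta$, the weight distribution $\overline{\mu}$, and the entropies appearing in the inequality $\| \overline{\mu} - bin_n \|_1^2 \leq (2\ln{2})(n \delta + H(bin_n) - H(\overline{\mu}))$ for a toy distribution $\mu$, uniform on an arbitrarily chosen subset of $\{0,1\}^4$. The subset S and all function names are assumptions made for the example; a real application would use an explicit $\epsilon$-biased construction and much larger $n$.

    # Illustrative sketch only: checks the abstract's bound
    #   ||mu_bar - bin_n||_1^2 <= (2 ln 2)(n*delta + H(bin_n) - H(mu_bar))
    # on a toy distribution mu. The set S below is an arbitrary choice,
    # not an epsilon-biased construction from the paper.
    import itertools
    import math

    n = 4
    # mu: uniform distribution on a small subset S of {0,1}^n.
    S = [(0, 0, 0, 0), (1, 1, 0, 0), (0, 1, 1, 0),
         (1, 0, 1, 1), (0, 1, 0, 1), (1, 0, 1, 0)]
    mu = {x: 1.0 / len(S) for x in S}

    def bias(mu, n):
        """delta = max over nonempty T of |E_mu[(-1)^{sum_{i in T} x_i}]|."""
        best = 0.0
        for T in itertools.product([0, 1], repeat=n):
            if not any(T):
                continue
            val = sum(p * (-1) ** sum(t & xi for t, xi in zip(T, x))
                      for x, p in mu.items())
            best = max(best, abs(val))
        return best

    def weight_distribution(mu, n):
        """mu_bar(k) = Pr_{x ~ mu}[|x| = k] for k = 0, ..., n."""
        w = [0.0] * (n + 1)
        for x, p in mu.items():
            w[sum(x)] += p
        return w

    def entropy(p):
        """Shannon entropy in bits."""
        return -sum(q * math.log2(q) for q in p if q > 0)

    bin_n = [math.comb(n, k) / 2 ** n for k in range(n + 1)]  # binomial on {0,...,n}
    mu_bar = weight_distribution(mu, n)
    delta = bias(mu, n)

    lhs = sum(abs(a - b) for a, b in zip(mu_bar, bin_n)) ** 2   # ||mu_bar - bin_n||_1^2
    rhs = 2 * math.log(2) * (n * delta + entropy(bin_n) - entropy(mu_bar))
    print(f"delta = {delta:.3f}, lhs = {lhs:.3f}, rhs = {rhs:.3f}, "
          f"bound holds: {lhs <= rhs}")

The brute-force bias computation enumerates all $2^n - 1$ nonempty parity tests, so it is only feasible for very small $n$; it is included solely to spell out the quantities in the statement above.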


Submitted August 26, 2014; revised February 14, 2015; in final form June 2, 2015; published July 10, 2015.

DOI: 10.4086/cjtcs.2015.002

