The Entropy Per Coordinate of a Random Vector is Highly Constrained Under Convexity Conditions

Sergey Bobkov, Mokshay Madiman
2011 IEEE Transactions on Information Theory  
The entropy per coordinate in a log-concave random vector of any dimension with given density at the mode is shown to have a range of just 1. Uniform distributions on convex bodies are at the lower end of this range, the distribution with i.i.d. exponentially distributed coordinates is at the upper end, and the normal is exactly in the middle. Thus, in terms of the amount of randomness as measured by entropy per coordinate, any log-concave random vector of any dimension contains randomness that differs from that in the normal random variable with the same maximal density value by at most 1/2. As applications, we obtain an information-theoretic formulation of the famous hyperplane conjecture in convex geometry, entropy bounds for certain infinitely divisible distributions, and quantitative estimates for the behavior of the density at the mode on convolution. More generally, one may consider so-called convex or hyperbolic probability measures on Euclidean spaces; we give new constraints on entropy per coordinate for this class of measures, which generalize our results under the log-concavity assumption, expose the extremal role of multivariate Pareto-type distributions, and give some applications.

Index Terms: Convex measures, inequalities, log-concave, maximum entropy, slicing problem.
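The stated range can be checked directly in one dimension using closed-form differential entropies. The sketch below (not from the paper; the function names are mine) computes the quantity h(X) + log f(mode), which the abstract says ranges from 0 (uniform) to 1 (i.i.d. exponential), with the normal at exactly 1/2:

```python
import math

def gap_uniform(width=1.0):
    # Uniform on [0, width]: h = log(width), density at mode = 1/width
    h = math.log(width)
    log_f_mode = -math.log(width)
    return h + log_f_mode  # lower end of the range: 0

def gap_exponential(lam=1.0):
    # Exponential with rate lam: h = 1 - log(lam), density at mode = lam
    h = 1.0 - math.log(lam)
    log_f_mode = math.log(lam)
    return h + log_f_mode  # upper end of the range: 1

def gap_normal(sigma=1.0):
    # N(0, sigma^2): h = (1/2) log(2*pi*e*sigma^2), mode density = 1/(sigma*sqrt(2*pi))
    h = 0.5 * math.log(2.0 * math.pi * math.e * sigma ** 2)
    log_f_mode = -math.log(sigma * math.sqrt(2.0 * math.pi))
    return h + log_f_mode  # exactly in the middle: 1/2

print(gap_uniform(), gap_exponential(), gap_normal())
```

Note that each gap is invariant under the scale parameter (width, rate, sigma), consistent with the claim that only the density value at the mode matters.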
doi:10.1109/tit.2011.2158475