INFORMATION THEORY METHODS IN COMMUNICATION COMPLEXITY
Nikolaos Leonardos
ABSTRACT OF THE DISSERTATION

Information Theory Methods in Communication Complexity

by Nikolaos Leonardos

Dissertation Director: Michael Saks

This dissertation is concerned with the application of notions and methods from the field of information theory to the field of communication complexity. It consists of two main parts.

In the first part of the dissertation, we prove lower bounds on the randomized two-party communication complexity of functions that arise from read-once boolean formulae. A read-once boolean formula is a formula in propositional logic with the property that every variable appears exactly once. Such a formula can be represented by a tree, where the leaves correspond to variables and the internal nodes are labeled by binary connectives. Under certain assumptions, this representation is unique; thus, one can define the depth of a formula as the depth of the tree that represents it. The complexity of the evaluation of general read-once formulae has attracted interest mainly in the decision tree model. In the communication complexity model, many interesting results deal with specific read-once formulae, such as disjointness and tribes. In this dissertation we use information theory methods to prove lower bounds that hold for any read-once formula. Our lower bounds are of the form n(f)/c^{d(f)}, where n(f) is the number of variables and d(f) is the depth of the formula, and they are optimal up to the constant in the base of the denominator.
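The tree representation and the depth measure d(f) can be made concrete with a short sketch. The following Python code is purely illustrative and does not appear in the dissertation; the names Leaf, Node, evaluate, and depth are ours.

    # Illustrative sketch (not from the dissertation): a read-once boolean
    # formula as a tree whose leaves are distinct variables and whose
    # internal nodes are binary connectives.
    from dataclasses import dataclass
    from typing import Union

    @dataclass
    class Leaf:
        name: str  # each variable name appears exactly once in the tree

    @dataclass
    class Node:
        op: str            # a binary connective, e.g. "AND" or "OR"
        left: "Formula"
        right: "Formula"

    Formula = Union[Leaf, Node]

    def evaluate(f: Formula, assignment: dict) -> bool:
        """Evaluate the formula under a truth assignment to its variables."""
        if isinstance(f, Leaf):
            return assignment[f.name]
        l, r = evaluate(f.left, assignment), evaluate(f.right, assignment)
        return (l and r) if f.op == "AND" else (l or r)

    def depth(f: Formula) -> int:
        """Depth of the tree representing the formula (leaves have depth 0)."""
        if isinstance(f, Leaf):
            return 0
        return 1 + max(depth(f.left), depth(f.right))

    # Example: the read-once formula (x1 AND x2) OR x3 has depth 2.
    f = Node("OR", Node("AND", Leaf("x1"), Leaf("x2")), Leaf("x3"))
    assert evaluate(f, {"x1": True, "x2": False, "x3": True}) is True
    assert depth(f) == 2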
In the second part of the dissertation, we explore the applicability of the information-theoretic method in the number-on-the-forehead model. The work of Bar-Yossef, Jayram, Kumar & Sivakumar [BYJKS04] revealed a beautiful connection between Hellinger distance and two-party randomized communication protocols. Inspired by their work and motivated by the open questions in the number-on-the-forehead model, we introduce the notion of Hellinger volume. We show that it lower bounds the information cost of multi-party protocols. We provide a small toolbox that allows one to manipulate several Hellinger volume terms and also to lower bound a Hellinger volume when the distributions involved satisfy certain conditions. In doing so, we prove a new upper bound on the difference between the arithmetic mean and the geometric mean in terms of relative entropy. Finally, we show how to apply the new tools to obtain a lower bound on the informational complexity of the AND_k function.
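For context, recall the classical definition of the (squared) Hellinger distance between two probability distributions P and Q on a finite set Omega; this definition is standard and is not specific to the dissertation:

\[
h^2(P, Q) \;=\; 1 - \sum_{\omega \in \Omega} \sqrt{P(\omega)\,Q(\omega)}
\;=\; \frac{1}{2} \sum_{\omega \in \Omega} \Bigl(\sqrt{P(\omega)} - \sqrt{Q(\omega)}\Bigr)^{2}.
\]

Note that \(\sqrt{P(\omega)\,Q(\omega)}\) is the geometric mean of \(P(\omega)\) and \(Q(\omega)\). Hellinger volume, as introduced in the dissertation, extends this two-distribution quantity to several distributions, a setting in which comparisons between arithmetic and geometric means, such as the bound in terms of relative entropy mentioned above, arise naturally.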
Acknowledgements

First, I would like to thank my advisor, Michael Saks. Mike has been teaching me all these years how to do research and how to think about a particular problem. By observing him teach, I learned to value intuition and to try hard to reveal it when I'm presenting a mathematical proof. I'm glad he was my advisor.