Combinatorial Limitations of Average-Radius List-Decoding
Venkatesan Guruswami, Srivatsan Narayanan
2014
IEEE Transactions on Information Theory
We study certain combinatorial aspects of list-decoding, motivated by the exponential gap between the known upper bound of O(1/γ) and lower bound of Ω_p(log(1/γ)) for the list-size needed to list-decode up to error fraction p with rate γ away from capacity, i.e., rate 1 − h(p) − γ (here p ∈ (0, 1/2) and γ > 0). Our main result is the following: we prove that in any binary code C ⊆ {0,1}ⁿ of rate 1 − h(p) − γ, there must exist a set L ⊂ C of Ω_p(1/√γ) codewords such that the average distance of the points in L from their centroid is at most pn. In other words, there must exist Ω_p(1/√γ) codewords with low "average radius." The standard notion of list-decoding corresponds to working with the maximum distance of a collection of codewords from a center instead of the average distance. The average-radius form is in itself quite natural; for instance, the classical Johnson bound in fact implies average-radius list-decodability.

The remaining results concern the standard notion of list-decoding, and help clarify the current state of affairs regarding combinatorial bounds for list-decoding:

- We give a short, simple proof, over all fixed alphabets, of the above-mentioned Ω_p(log(1/γ)) lower bound. Earlier, this bound followed from a complicated, more general result of Blinovsky.
- We show that one cannot improve the Ω_p(log(1/γ)) lower bound via techniques based on identifying the zero-rate regime for list-decoding of constant-weight codes. On a positive note, our Ω_p(1/√γ) lower bound for average-radius list-decoding circumvents this barrier.
- We exhibit a "reverse connection" between the existence of constant-weight and general codes for list-decoding, showing that the best possible list-size, as a function of the gap γ of the rate to the capacity limit, is the same up to constant factors for both constant-weight codes (whose weight is bounded away from p) and general codes.
- We give simple second-moment-based proofs that w.h.p. a list-size of Ω_p(1/γ) is needed for list-decoding random codes from errors as well as erasures. For random linear codes, the corresponding list-size bounds are Ω_p(1/γ) for errors and exp(Ω_p(1/γ)) for erasures.

Full version: http://eccc.hpi-web.de/report/2012/017/

Research supported in part by NSF grant CCF-0953155 and a Packard Fellowship.
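The quantities in the abstract can be made concrete with a small numerical sketch. The snippet below (an illustration, not from the paper; the sample values p = 0.25 and γ = 0.01 are arbitrary choices) computes the binary entropy h(p), the rate 1 − h(p) − γ sitting γ below list-decoding capacity, and compares the asymptotic shapes of the three list-size bounds discussed above, with constants suppressed.

```python
import math

def h(p: float) -> float:
    """Binary entropy function h(p) = -p*log2(p) - (1-p)*log2(1-p)."""
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Arbitrary sample parameters for illustration (not from the paper).
p, gamma = 0.25, 0.01

# Rate gamma below the list-decoding capacity 1 - h(p).
rate = 1 - h(p) - gamma

# Asymptotic shapes of the list-size bounds as gamma -> 0
# (constants and the dependence on p suppressed):
upper_bound = 1 / gamma                  # O(1/gamma) upper bound
lower_standard = math.log(1 / gamma)     # Omega_p(log(1/gamma)), standard list-decoding
lower_avg_radius = 1 / math.sqrt(gamma)  # Omega_p(1/sqrt(gamma)), average-radius

print(f"rate = {rate:.4f}")
print(f"log(1/gamma) = {lower_standard:.2f}  <  "
      f"1/sqrt(gamma) = {lower_avg_radius:.2f}  <  1/gamma = {upper_bound:.2f}")
```

The printed comparison makes the "exponential gap" visible: for small γ, log(1/γ) grows far more slowly than 1/γ, and the average-radius bound 1/√γ sits strictly between the two.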
doi:10.1109/tit.2014.2343224