Online Vector Balancing and Geometric Discrepancy [article]

Nikhil Bansal, Haotian Jiang, Sahil Singla, Makrand Sinha
2020, arXiv pre-print
We consider an online vector balancing question where T vectors, chosen from an arbitrary distribution over [-1,1]^n, arrive one by one and must each be immediately given a ± sign. The goal is to keep the discrepancy — the maximum coordinate imbalance of the signed sum — as small as possible. A concrete example is the online interval discrepancy problem, where T points are sampled uniformly in [0,1] and must be immediately colored ± so that every sub-interval remains nearly balanced. Random coloring incurs Ω(T^1/2) discrepancy, while the offline bounds are Θ(√(n log (T/n))) for vector balancing and 1 for interval balancing, so a natural question is whether one can (nearly) match the offline bounds in the online setting for these problems. One must exploit the stochasticity of the input, since in the worst case the discrepancy is known to be Ω(T^1/2) for any online algorithm. Bansal and Spencer recently showed an O(√(n) log T) bound when the coordinates are independent. When there are dependencies among the coordinates, the problem becomes much more challenging, as evidenced by recent work of Jiang, Kulkarni, and Singla that gives a non-trivial O(T^(1/log log T)) bound for online interval discrepancy; although this beats random coloring, it is still far from the offline bound.

In this work, we introduce a new framework for online vector balancing when the input distribution has dependencies across coordinates. This lets us obtain a poly(n, log T) bound for online vector balancing under arbitrary input distributions, and a poly(log T) bound for online interval discrepancy. Our framework is powerful enough to capture other well-studied geometric discrepancy problems; e.g., a poly(log^d T) bound for the online d-dimensional Tusnády's problem. A key new technical ingredient is an anti-concentration inequality for sums of pairwise uncorrelated random variables.
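To make the interval discrepancy setup concrete, here is a minimal Python sketch (not from the paper; names are illustrative) of the random-coloring baseline: T uniform points in [0,1] each receive an independent ± sign on arrival, and we then measure the worst imbalance over all sub-intervals. Since the point set is fixed at the end, the worst sub-interval contains a contiguous run of points in sorted order, so the discrepancy equals the range of the prefix sums of the signs taken in sorted-point order.

```python
import random

def interval_discrepancy(points, signs):
    """Max |sum of signs| over the points falling in any sub-interval of [0,1].

    The worst sub-interval captures a contiguous run of points in sorted
    order, so the answer is (max prefix sum) - (min prefix sum) of the signs
    in sorted-point order, with the empty prefix 0 included.
    """
    order = sorted(range(len(points)), key=points.__getitem__)
    prefix, lo, hi = 0, 0, 0
    for i in order:
        prefix += signs[i]
        lo = min(lo, prefix)
        hi = max(hi, prefix)
    return hi - lo

def random_coloring(T, rng):
    """Online baseline: sign each arriving uniform point by a fair coin flip."""
    points = [rng.random() for _ in range(T)]
    signs = [rng.choice((-1, 1)) for _ in range(T)]
    return points, signs

rng = random.Random(0)
points, signs = random_coloring(10_000, rng)
disc = interval_discrepancy(points, signs)
print(disc)
```

For T = 10,000 the random coloring typically lands within a small multiple of √T ≈ 100, far above the O(1) achievable offline — the gap the abstract's Ω(T^1/2) lower bound for random coloring refers to.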
arXiv:1912.03350v2