Bridging between 0/1 and Linear Programming via Random Walks [article]

Joshua Brakensiek, Venkatesan Guruswami
2019 · arXiv preprint
Under the Strong Exponential Time Hypothesis, an integer linear program with n Boolean-valued variables and m equations cannot be solved in c^n time for any constant c < 2. If the domain of the variables is relaxed to [0,1], the associated linear program can of course be solved in polynomial time. In this work, we give a natural algorithmic bridging between these extremes of 0-1 and linear programming. Specifically, for any subset (finite union of intervals) E ⊂ [0,1] containing {0,1}, we give a random-walk based algorithm with runtime O_E((2-measure(E))^n · poly(n,m)) that finds a solution in E^n to any n-variable linear program with m constraints that is feasible over {0,1}^n. Note that as E expands from {0,1} to [0,1], the runtime improves smoothly from 2^n to polynomial. Taking E = [0,1/k) ∪ (1-1/k,1] in our result yields as a corollary a randomized (2-2/k)^n · poly(n) time algorithm for k-SAT. While our approach has some high-level resemblance to Schöning's beautiful algorithm, our general algorithm is based on a more sophisticated random walk that incorporates several new ingredients, such as a multiplicative potential to measure progress, a judicious choice of starting distribution, and a time-varying distribution for the evolution of the random walk that is itself computed via an LP at each step (a solution to which is guaranteed based on the minimax theorem). Plugging the LP algorithm into our earlier polymorphic framework yields fast exponential algorithms for any CSP (like k-SAT, 1-in-3-SAT, NAE k-SAT) that admits so-called 'threshold partial polymorphisms.'
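The runtime tradeoff stated above is easy to check numerically. The sketch below is illustrative (the helper names `measure` and `runtime_base` are not from the paper): it computes the total length of a finite union of disjoint intervals E ⊆ [0,1] and the resulting exponential base 2 - measure(E), confirming that the k-SAT choice E = [0,1/k) ∪ (1-1/k,1] recovers the base 2 - 2/k.

```python
def measure(intervals):
    """Total Lebesgue measure of a finite union of disjoint intervals in [0,1],
    each given as an (a, b) pair with 0 <= a <= b <= 1."""
    return sum(b - a for a, b in intervals)

def runtime_base(intervals):
    """Base of the exponential runtime O_E((2 - measure(E))^n * poly(n, m))."""
    return 2 - measure(intervals)

# E = [0, 1/k) ∪ (1 - 1/k, 1] has measure 2/k, giving base 2 - 2/k for k-SAT.
for k in (3, 4, 5):
    E = [(0.0, 1.0 / k), (1.0 - 1.0 / k, 1.0)]
    print(k, runtime_base(E))  # 4/3 for k=3, 3/2 for k=4, 8/5 for k=5
```

As E grows toward all of [0,1], measure(E) approaches 1 and the base approaches 1, matching the claimed smooth interpolation from 2^n down to polynomial time.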
arXiv:1904.04860v1