Second-order Conditional Gradient Sliding [article]

Alejandro Carderera, Sebastian Pokutta
2023 · arXiv preprint
Constrained second-order convex optimization algorithms are the method of choice when a high-accuracy solution to a problem is needed, due to their local quadratic convergence. These algorithms require the solution of a constrained quadratic subproblem at every iteration. We present the Second-Order Conditional Gradient Sliding (SOCGS) algorithm, which uses a projection-free algorithm to solve the constrained quadratic subproblems inexactly. When the feasible region is a polytope, the algorithm converges quadratically in primal gap after a finite number of linearly convergent iterations. Once in the quadratic regime, the SOCGS algorithm requires 𝒪(log(log(1/ε))) first-order and Hessian oracle calls and 𝒪(log(1/ε) log(log(1/ε))) linear minimization oracle calls to achieve an ε-optimal solution. This algorithm is useful when the feasible region can only be accessed efficiently through a linear optimization oracle, and computing first-order information of the function, although possible, is costly.
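To illustrate the projection-free primitive the abstract refers to, here is a minimal sketch of a plain conditional gradient (Frank-Wolfe) loop minimizing a convex quadratic over the probability simplex using only a linear minimization oracle. This is not the SOCGS algorithm itself; the matrix `Q`, vector `b`, feasible region, and step-size rule are hypothetical stand-ins chosen for a self-contained example.

```python
# Sketch of a projection-free (Frank-Wolfe) solver for a quadratic
# subproblem min 0.5 x'Qx + b'x over the probability simplex.
# Hypothetical data; SOCGS uses such a solver inexactly inside a
# second-order (Newton-type) outer loop.
import numpy as np

def lmo_simplex(grad):
    """Linear minimization oracle over the simplex: argmin_{v in simplex} <grad, v>."""
    v = np.zeros_like(grad)
    v[np.argmin(grad)] = 1.0  # optimum of a linear function is a vertex
    return v

def frank_wolfe_quadratic(Q, b, iters=200):
    n = b.shape[0]
    x = np.ones(n) / n                   # start at the simplex barycenter
    for t in range(iters):
        grad = Q @ x + b                 # gradient of 0.5 x'Qx + b'x
        v = lmo_simplex(grad)            # one LMO call per iteration
        gamma = 2.0 / (t + 2)            # standard open-loop step size
        x = (1 - gamma) * x + gamma * v  # convex combination stays feasible
    return x

Q = np.diag([1.0, 2.0, 3.0])
b = np.array([-1.0, 0.0, 1.0])
x = frank_wolfe_quadratic(Q, b)
print(x.round(3))
```

Each iteration costs exactly one LMO call and no projections, which is why such solvers are attractive when the feasible region (e.g. a polytope) is only accessible through linear optimization.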
arXiv:2002.08907v3