A Fast Anderson-Chebyshev Acceleration for Nonlinear Optimization
[article] arXiv pre-print, 2020
Anderson acceleration (or Anderson mixing) is an efficient acceleration method for fixed-point iterations x_{t+1} = G(x_t); for example, gradient descent can be viewed as iteratively applying the operator G(x) = x - α∇f(x). Anderson acceleration is known to be quite efficient in practice and can be viewed as an extension of Krylov subspace methods to nonlinear problems. In this paper, we show that Anderson acceleration with Chebyshev polynomials can achieve the optimal convergence rate O(√κ ln(1/ε)),
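To make the setup concrete, the following is a minimal sketch of plain type-II Anderson mixing (the standard difference formulation) applied to a fixed-point map G, not the paper's Anderson-Chebyshev variant; the function name, memory size m, and the diagonal test problem are illustrative assumptions.

```python
import numpy as np

def anderson_accelerate(G, x0, m=5, max_iter=100, tol=1e-10):
    """Anderson mixing for the fixed-point iteration x_{t+1} = G(x_t).

    Minimal type-II sketch: at each step, a small least-squares problem
    over the last m residual differences picks an extrapolated iterate.
    This is the generic method, not the paper's Chebyshev variant.
    """
    x = np.asarray(x0, dtype=float)
    X, GX = [x], [G(x)]                      # histories of x_t and G(x_t)
    for _ in range(max_iter):
        # Residuals f_t = G(x_t) - x_t, one per column.
        F = np.column_stack([g - xi for g, xi in zip(GX, X)])
        if np.linalg.norm(F[:, -1]) < tol:
            break
        if F.shape[1] == 1:
            x = GX[-1]                       # plain step to seed the history
        else:
            dF = np.diff(F, axis=1)          # residual differences
            dG = np.column_stack([GX[i + 1] - GX[i] for i in range(len(GX) - 1)])
            # gamma minimizes || f_latest - dF @ gamma ||_2
            gamma, *_ = np.linalg.lstsq(dF, F[:, -1], rcond=None)
            x = GX[-1] - dG @ gamma          # extrapolated iterate
        X.append(x)
        GX.append(G(x))
        if len(X) > m + 1:                   # keep at most m differences
            X.pop(0)
            GX.pop(0)
    return x

# Usage: gradient descent G(x) = x - α A x on f(x) = (1/2) xᵀAx, whose
# fixed point is x* = 0; Anderson mixing converges far faster than the
# plain iteration when the condition number κ is large.
A = np.diag([1.0, 10.0])
G = lambda x: x - 0.1 * (A @ x)
x_star = anderson_accelerate(G, np.array([1.0, 1.0]))
```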
arXiv:1809.02341v4