Riemannian Stochastic Recursive Momentum Method for non-Convex Optimization
2021
Proceedings of the Thirtieth International Joint Conference on Artificial Intelligence
We propose a stochastic recursive momentum method for Riemannian non-convex optimization that achieves near-optimal complexity for finding an ε-approximate solution using single-sample updates. The new algorithm requires only one stochastic gradient evaluation per iteration and does not need to restart with a large-batch gradient, a device commonly used to obtain faster rates. Extensive experimental results demonstrate the superiority of the proposed algorithm, and extensions to nonsmooth and constrained optimization settings are also discussed.
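To make the abstract's description concrete, the following is a minimal illustrative sketch of a STORM-style recursive-momentum update on a simple Riemannian manifold, the unit sphere. It is not the paper's exact algorithm: the choice of manifold, objective (a stochastic leading-eigenvector problem), retraction, projection-based vector transport, step size `eta`, and momentum parameter `a` are all assumptions made for illustration. Note that each iteration uses a single stochastic sample and evaluates the gradient at both the current and previous iterate on that same sample, as recursive-momentum estimators do.

```python
import numpy as np

# Illustrative sketch (NOT the paper's exact method): a recursive-momentum
# estimator on the unit sphere. We maximize x^T A x over unit vectors x,
# i.e. minimize f(x) = -0.5 x^T A x, with one noisy sample per iteration.

rng = np.random.default_rng(0)
n = 20
B = rng.standard_normal((n, n))
A = B @ B.T / n  # symmetric PSD matrix; top eigenvector is the solution

def egrad(x, noise):
    # stochastic Euclidean gradient of f(x) = -0.5 x^T A x
    return -(A + noise) @ x

def rgrad(x, g):
    # Riemannian gradient on the sphere: project g onto the tangent space at x
    return g - (x @ g) * x

def retract(x, v):
    # retraction: take a step in the tangent direction, then renormalize
    y = x + v
    return y / np.linalg.norm(y)

def transport(x_new, v):
    # vector transport by projection onto the tangent space at the new point
    return v - (x_new @ v) * x_new

x = rng.standard_normal(n)
x /= np.linalg.norm(x)
x0 = x.copy()
eta, a = 0.1, 0.5  # step size and momentum parameter (assumed constants)
d, x_prev = None, x.copy()

for t in range(500):
    noise = 0.01 * rng.standard_normal((n, n))  # single stochastic sample
    g = rgrad(x, egrad(x, noise))
    if d is None:
        d = g  # first iteration: plain stochastic gradient, no large batch
    else:
        # recursive momentum: correct the transported previous estimator
        # with the gradient difference evaluated on the SAME sample
        g_prev = transport(x, rgrad(x_prev, egrad(x_prev, noise)))
        d = g + (1 - a) * (transport(x, d) - g_prev)
    x_prev = x.copy()
    x = retract(x, -eta * d)

# x now approximates the leading eigenvector of A (up to sign)
```

The projection retraction and projection transport are standard cheap choices on the sphere; on other manifolds (e.g. Stiefel or Grassmann) one would substitute the corresponding retraction and vector transport.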
doi:10.24963/ijcai.2021/345