A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2022; you can also visit the original URL.
The file type is application/pdf.
Accelerated Proximal Alternating Gradient-Descent-Ascent for Nonconvex Minimax Machine Learning
[article] · 2022 · arXiv pre-print
Alternating gradient-descent-ascent (AltGDA) is an optimization algorithm widely used for model training in machine learning applications that involve solving a nonconvex minimax optimization problem. However, existing studies show that it suffers from high computation complexity in nonconvex minimax optimization. In this paper, we develop a single-loop, fast AltGDA-type algorithm that leverages proximal gradient updates and momentum acceleration to solve…
arXiv:2112.11663v7
fatcat:bgkeeeofijadzajcum44pbadsa
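The abstract only names the algorithm's ingredients. As a rough illustration of the generic template it describes, here is a minimal Python sketch of an alternating proximal gradient-descent-ascent loop with heavy-ball momentum on the descent step. The function names, toy objective, step sizes, momentum rule, and the l1 regularizer (handled via its soft-thresholding prox) are all illustrative assumptions, not the paper's actual algorithm or constants.

import numpy as np

def soft_threshold(v, tau):
    # Proximal operator of tau * ||.||_1 (soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def alt_prox_gda(x, y, grad_x, grad_y, eta_x=0.05, eta_y=0.1,
                 beta=0.5, lam=0.01, iters=500):
    # Alternating proximal gradient-descent-ascent with heavy-ball
    # momentum on the descent (x) step, for
    #     min_x max_y  f(x, y) + lam * ||x||_1,
    # where f is smooth and supplied via its partial gradients.
    vx = np.zeros_like(x)  # momentum buffer for x
    for _ in range(iters):
        # Momentum-accelerated descent step on x, followed by the prox.
        vx = beta * vx - eta_x * grad_x(x, y)
        x = soft_threshold(x + vx, eta_x * lam)
        # Ascent step on y uses the freshly updated x (the "alternating" part).
        y = y + eta_y * grad_y(x, y)
    return x, y

# Toy smooth saddle objective: f(x, y) = 0.2*||x||^2 + x.y - 0.5*||y||^2.
gx = lambda x, y: 0.4 * x + y   # partial gradient of f in x
gy = lambda x, y: x - y         # partial gradient of f in y
x_star, y_star = alt_prox_gda(np.ones(3), np.zeros(3), gx, gy)
print(x_star, y_star)           # both should approach the saddle at the origin

Updating y with the already-updated x is what makes the scheme "alternating" rather than simultaneous GDA; the prox step keeps a nonsmooth regularizer tractable without inner loops, which matches the single-loop structure the abstract advertises.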