We propose a new stochastic variance-reduced damped L-BFGS algorithm, in which we leverage estimates of bounds on the largest and smallest eigenvalues of the Hessian approximation to balance its quality and conditioning. Our algorithm, VARCHEN, draws from previous work that proposed a novel stochastic damped L-BFGS algorithm called SdLBFGS. We establish almost sure convergence to a stationary point and a complexity bound. We empirically demonstrate that VARCHEN is more robust than SdLBFGS-VR and SVRG on a modified DavidNet problem, a highly nonconvex and ill-conditioned problem that arises in the context of deep learning, and that their performance is comparable on a logistic regression problem and a nonconvex support-vector machine problem.

arXiv:2012.05783v1
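To illustrate the kind of damping the abstract refers to, the sketch below implements the classical Powell damping rule that damped (L-)BFGS methods such as SdLBFGS build on: the gradient-difference vector y is blended with Bs so that the curvature condition s^T y_bar >= delta * s^T B s holds, keeping the quasi-Newton update well defined. This is a generic illustration only; VARCHEN's actual rule, which uses eigenvalue-bound estimates, is not reproduced here, and the function name and delta value are assumptions.

```python
import numpy as np

def powell_damped_pair(s, y, Bs, delta=0.2):
    """Powell-damped curvature pair (illustrative sketch, not VARCHEN's rule).

    s  : step x_{k+1} - x_k
    y  : gradient difference g_{k+1} - g_k
    Bs : current Hessian approximation B applied to s

    Returns y_bar = theta * y + (1 - theta) * Bs with theta chosen so that
    s^T y_bar >= delta * s^T B s, which keeps the BFGS update positive
    definite even when raw curvature s^T y is small or negative.
    """
    sBs = s @ Bs
    sy = s @ y
    if sy >= delta * sBs:
        theta = 1.0                                 # enough curvature: keep y
    else:
        theta = (1.0 - delta) * sBs / (sBs - sy)    # damp toward Bs
    return theta * y + (1.0 - theta) * Bs
```

For example, with B = I, s = (1, 0) and y = (-0.5, 0), raw curvature s^T y is negative; the damped pair satisfies s^T y_bar = 0.2 = delta * s^T B s exactly.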
Acknowledgement. We thank Tiphaine Bonniot de Ruisselet from ENSEEIHT Toulouse (France) for performing some preliminary numerical experiments.