3 Hits in 1.1 sec

Stochastic Damped L-BFGS with Controlled Norm of the Hessian Approximation [article]

Sanae Lotfi and Tiphaine Bonniot de Ruisselet and Dominique Orban and Andrea Lodi
2020 arXiv   pre-print
We propose a new stochastic variance-reduced damped L-BFGS algorithm, where we leverage estimates of bounds on the largest and smallest eigenvalues of the Hessian approximation to balance its quality and conditioning. Our algorithm, VARCHEN, draws from previous work that proposed a novel stochastic damped L-BFGS algorithm called SdLBFGS. We establish almost sure convergence to a stationary point and a complexity bound. We empirically demonstrate that VARCHEN is more robust than SdLBFGS-VR and SVRG on a modified DavidNet problem -- a highly nonconvex and ill-conditioned problem that arises in the context of deep learning -- and that their performance is comparable on a logistic regression problem and a nonconvex support-vector machine problem.
arXiv:2012.05783v1 fatcat:nwjtqfjjnbeqjcejbldgaul56i
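
For context on the damping idea this abstract refers to, here is a minimal sketch of a Powell-style damped curvature pair together with the standard L-BFGS two-loop recursion. It is a generic illustration under assumed defaults, not the VARCHEN or SdLBFGS update from the paper; the function names and the `gamma`/`delta` parameters are illustrative choices, not from the source.

```python
import numpy as np

def powell_damped_pair(s, y, gamma=1.0, delta=0.2):
    """Powell-style damping of a curvature pair (s, y).

    Replaces y by a convex combination of y and B0 @ s (with B0 = gamma * I)
    so that s^T y_hat >= delta * s^T B0 s, which keeps the quasi-Newton
    approximation positive definite even when s^T y is small or negative.
    Illustrative sketch only; not the exact rule used in VARCHEN/SdLBFGS.
    """
    Bs = gamma * s
    sBs = s @ Bs
    sy = s @ y
    if sy >= delta * sBs:
        theta = 1.0
    else:
        theta = (1.0 - delta) * sBs / (sBs - sy)
    return theta * y + (1.0 - theta) * Bs


def lbfgs_direction(grad, s_hist, y_hist, gamma=1.0):
    """Two-loop recursion: apply the inverse L-BFGS approximation to grad.

    s_hist, y_hist hold recent (damped) curvature pairs, ordered from
    oldest to newest; the initial inverse approximation is H0 = gamma * I.
    Returns H_k @ grad, i.e. the (un-negated) quasi-Newton step.
    """
    q = grad.copy()
    rhos = [1.0 / (s @ y) for s, y in zip(s_hist, y_hist)]
    alphas = []
    for s, y, rho in zip(reversed(s_hist), reversed(y_hist), reversed(rhos)):
        a = rho * (s @ q)
        alphas.append(a)
        q -= a * y
    r = gamma * q
    for (s, y, rho), a in zip(zip(s_hist, y_hist, rhos), reversed(alphas)):
        b = rho * (y @ r)
        r += (a - b) * s
    return r
```

Damping of this kind only enforces the curvature condition; as the abstract describes it, the paper goes further by monitoring bounds on the largest and smallest eigenvalues of the Hessian approximation to control its conditioning as well.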

Semi-active H_∞ damping optimization by adaptive interpolation [article]

Zoran Tomljanović, Matthias Voigt
2020 arXiv   pre-print
Acknowledgement We thank Tiphaine Bonniot de Ruisselet from ENSEEIHT Toulouse (France) for performing some preliminary numerical experiments.  ... 
arXiv:2002.00617v1 fatcat:6rtowkbwtfcyrle6a3m3d6pfea

Semi‐active ℋ∞ damping optimization by adaptive interpolation [article]

Zoran Tomljanović, Matthias Voigt, Technische Universität Berlin
2020
ACKNOWLEDGEMENTS We thank Tiphaine Bonniot de Ruisselet from ENSEEIHT Toulouse (France) for performing some preliminary numerical experiments.  ... 
doi:10.14279/depositonce-10834 fatcat:35he47xl2nglbpn335k6di56rq