Accelerating Stochastic Gradient Descent using Predictive Variance Reduction

Rie Johnson, Tong Zhang
Neural Information Processing Systems (NIPS), 2013
Stochastic gradient descent is popular for large-scale optimization, but it has slow asymptotic convergence due to the inherent variance of the stochastic gradients. To remedy this problem, we introduce an explicit variance reduction method for stochastic gradient descent which we call stochastic variance reduced gradient (SVRG). For smooth and strongly convex functions, we prove that this method enjoys the same fast convergence rate as stochastic dual coordinate ascent (SDCA) and Stochastic Average Gradient (SAG). However, our analysis is significantly simpler and more intuitive. Moreover, unlike SDCA or SAG, our method does not require the storage of gradients, and is thus more easily applicable to complex problems such as some structured prediction problems and neural network learning.
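As a rough illustration of the variance-reduced update the abstract describes, below is a minimal NumPy sketch of SVRG applied to a least-squares objective. The function name, step size, inner-loop length, and quadratic loss are illustrative assumptions, not the paper's experimental setup; only the update rule (subtracting the stale component gradient and adding the full gradient at a periodic snapshot) follows the method.

```python
import numpy as np

def svrg(X, y, eta=0.01, n_epochs=20, m=None, seed=0):
    """Minimize (1/2n) * ||X w - y||^2 with the SVRG update (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    m = m or 2 * n               # inner-loop length (update frequency)
    w_snap = np.zeros(d)         # snapshot iterate, the paper's w-tilde

    def grad(w, i):
        # Gradient of the i-th component function (1/2)(x_i^T w - y_i)^2
        return (X[i] @ w - y[i]) * X[i]

    for _ in range(n_epochs):
        # Full gradient at the snapshot: mu = (1/n) * sum_i grad_i(w_snap)
        mu = X.T @ (X @ w_snap - y) / n
        w = w_snap.copy()
        for _ in range(m):
            i = rng.integers(n)
            # Variance-reduced stochastic gradient:
            # grad_i(w) - grad_i(w_snap) + mu
            # Its expectation equals the full gradient at w, while its variance
            # shrinks as both w and w_snap approach the optimum.
            w -= eta * (grad(w, i) - grad(w_snap, i) + mu)
        w_snap = w               # take the last inner iterate as the new snapshot
    return w_snap
```

Because the snapshot only requires one full-gradient vector per outer epoch, the per-iteration storage stays O(d) regardless of n, which is the practical contrast with SDCA and SAG noted in the abstract.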