Scalable Computation of Regularized Precision Matrices via Stochastic Optimization

Yves F. Atchadé, Rahul Mazumder, Jie Chen
2015, arXiv pre-print
We consider the problem of computing a positive definite p × p inverse covariance matrix, a.k.a. precision matrix, θ = (θ_ij), that optimizes a regularized Gaussian maximum likelihood problem with the elastic-net regularizer ∑_{i,j=1}^p λ(α|θ_ij| + ½(1−α)θ_ij²), with regularization parameters α ∈ [0,1] and λ > 0. The associated convex semidefinite optimization problem is notoriously difficult to scale to large problems and has received significant attention over the past several years.
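For concreteness, a standard way to write the full objective (the regularized negative Gaussian log-likelihood; the sample covariance matrix S is not named in the abstract and is assumed here) is, in LaTeX:

    \min_{\theta \succ 0} \; -\log\det\theta + \mathrm{tr}(S\theta)
      + \sum_{i,j=1}^{p} \lambda \Big( \alpha\,|\theta_{ij}|
      + \tfrac{1}{2}(1-\alpha)\,\theta_{ij}^{2} \Big)

The first two terms are the negative Gaussian log-likelihood up to constants; the gradient of the smooth part is S − θ^{-1}, which is where the matrix inverse that the proposed method avoids appears.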
We propose a new algorithmic framework based on stochastic proximal optimization (on the primal problem) that can be used to obtain near-optimal solutions with substantial computational savings over deterministic algorithms. A key challenge of our work stems from the fact that the optimization problem being investigated does not satisfy the usual assumptions required by stochastic gradient methods. Our proposal (a) has computational guarantees and (b) scales well to large problems, even when the solution is not very sparse, thereby extending the scope of regularized maximum likelihood estimation to many large-scale problems of contemporary interest. An important aspect of our proposal is to bypass the deterministic computation of a matrix inverse by drawing random samples from a suitable multivariate Gaussian distribution.
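To illustrate the core idea, here is a minimal sketch (not the authors' algorithm: the function names, the dense Cholesky sampler, and the single-step structure are assumptions for exposition) of one stochastic proximal-gradient step in which the exact inverse θ^{-1} in the gradient S − θ^{-1} is replaced by a Monte Carlo estimate built from Gaussian draws:

    import numpy as np
    from scipy.linalg import cholesky, solve_triangular

    def prox_elastic_net(u, step, lam, alpha):
        # Elementwise proximal map of lam*(alpha*|u| + 0.5*(1-alpha)*u^2):
        # soft-thresholding for the l1 part, then shrinkage from the ridge part.
        soft = np.sign(u) * np.maximum(np.abs(u) - step * lam * alpha, 0.0)
        return soft / (1.0 + step * lam * (1.0 - alpha))

    def stochastic_prox_grad_step(theta, S, lam, alpha, step, n_samples, rng):
        # Exact gradient of tr(S theta) - log det(theta) is S - inv(theta).
        # Instead of forming inv(theta), estimate it as (1/N) sum_k x_k x_k^T
        # with x_k ~ N(0, inv(theta)). The draws here use a dense Cholesky
        # factor of theta; at scale one would use a cheaper sampler.
        p = theta.shape[0]
        L = cholesky(theta, lower=True)                    # theta = L L^T
        Z = rng.standard_normal((p, n_samples))
        X = solve_triangular(L, Z, lower=True, trans='T')  # L^T X = Z, so X ~ N(0, inv(theta))
        inv_estimate = (X @ X.T) / n_samples               # unbiased estimate of inv(theta)
        grad = S - inv_estimate                            # stochastic gradient of the smooth part
        return prox_elastic_net(theta - step * grad, step, lam, alpha)

Note that this sketch does not enforce positive definiteness of the iterate, which an actual algorithm must address; it is only meant to show how Gaussian sampling can substitute for the deterministic matrix inversion.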
arXiv:1509.00426v1