On a Randomized Multi-Block ADMM for Solving Selected Machine Learning Problems [article]

Mingxi Zhu, Kresimir Mihic, Yinyu Ye
2020, arXiv preprint
The Alternating Direction Method of Multipliers (ADMM) has recently gained tremendous attention for solving large-scale machine learning and signal processing problems due to its relative simplicity. However, the two-block structure of the classical ADMM still limits the size of the real problems being solved. When one forces a more-than-two-block structure by variable splitting, the convergence speed slows down greatly, as observed in practice. Recently, a randomly assembled cyclic multi-block ADMM (RAC-MBADMM) was developed by the authors for solving general convex and nonconvex quadratic optimization problems, where the number of blocks can exceed two so that each sub-problem has a smaller size and can be solved much more efficiently. In this paper, we apply this method to a few selected machine learning problems related to convex quadratic optimization: Linear Regression, LASSO, Elastic-Net, and SVM. We prove that the algorithm converges linearly in expectation under standard statistical data assumptions. We use our general-purpose solver to conduct multiple numerical tests, solving both synthetic and large-scale benchmark problems. Our results show that RAC-MBADMM can significantly outperform, in both solution time and quality, other optimization algorithms/codes for solving these machine learning problems, and matches the performance of the best tailored methods such as Glmnet or LIBSVM. In certain problem regimes, RAC-MBADMM even achieves performance superior to that of the tailored methods.
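
To make the randomly assembled cyclic update concrete, the following is a minimal sketch of one plausible reading of such a scheme for an equality-constrained convex QP, min 0.5 x'Hx + c'x subject to Ax = b, the problem class to which the regression formulations above reduce. It is not the authors' reference implementation; the function name, block count, penalty parameter beta, and iteration budget are illustrative assumptions.

    # Sketch: randomized multi-block ADMM on a convex QP
    #   min 0.5 x'Hx + c'x  s.t.  Ax = b
    # Illustrative only; parameters and stopping rule are assumptions.
    import numpy as np

    def rac_admm_qp(H, c, A, b, n_blocks=4, beta=1.0, iters=200, seed=0):
        rng = np.random.default_rng(seed)
        n, m = H.shape[0], A.shape[0]
        x = np.zeros(n)
        y = np.zeros(m)                  # multiplier for Ax = b
        blocks = np.array_split(np.arange(n), n_blocks)
        for _ in range(iters):
            # Randomly assemble the cyclic update order (the "RAC" step).
            for i in rng.permutation(n_blocks):
                idx = blocks[i]
                rest = np.setdiff1d(np.arange(n), idx)
                Ai = A[:, idx]
                # First-order condition of the augmented Lagrangian in x_i:
                # (H_ii + beta*Ai'Ai) x_i = Ai'y + beta*Ai'(b - A_rest x_rest)
                #                            - c_i - H_{i,rest} x_rest
                lhs = H[np.ix_(idx, idx)] + beta * Ai.T @ Ai
                rhs = (Ai.T @ y + beta * Ai.T @ (b - A[:, rest] @ x[rest])
                       - c[idx] - H[np.ix_(idx, rest)] @ x[rest])
                x[idx] = np.linalg.solve(lhs, rhs)
            y -= beta * (A @ x - b)      # dual (multiplier) update
        return x, y

The random re-permutation of the block order at every sweep is the essential difference from fixed-order cyclic multi-block ADMM, whose direct extension is known to possibly diverge even on convex problems.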
arXiv:1907.01995v2