Adaptive Second Order Coresets for Data-efficient Machine Learning

Omead Pooladzandi, David Davini, Baharan Mirzasoleiman
Proceedings of the 39th International Conference on Machine Learning (ICML 2022)
Training machine learning models on massive datasets incurs substantial computational costs. To alleviate such costs, there has been a sustained effort to develop data-efficient training methods that carefully select subsets of the training examples that generalize on par with the full training data. However, existing methods are limited in providing theoretical guarantees for the quality of the models trained on the extracted subsets, and may perform poorly in practice. We propose ADACORE, a method that leverages the geometry of the data to extract subsets of the training examples for efficient machine learning. The key idea behind our method is to dynamically approximate the curvature of the loss function via an exponentially-averaged estimate of the Hessian, and to select weighted subsets (coresets) that closely approximate the full gradient preconditioned with the Hessian. We prove rigorous convergence guarantees for various first- and second-order methods applied to the subsets chosen by ADACORE. Our extensive experiments show that ADACORE extracts higher-quality coresets than baselines and speeds up training of convex and non-convex machine learning models, such as logistic regression and neural networks, by over 2.9x relative to training on the full data and by over 4.5x relative to random subsets.
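To make the abstract's key idea concrete, here is a minimal sketch of the two ingredients it describes: an exponentially-averaged curvature estimate, and selection of a weighted coreset whose preconditioned per-example gradients cover those of the full data. This is not the paper's implementation: it assumes a diagonal Hessian proxy and a CRAIG-style greedy facility-location selection; the function names, the `beta` decay, and the toy data are illustrative assumptions.

```python
import numpy as np

def update_hessian_ema(h_ema, h_new, beta=0.9):
    """Exponentially-averaged diagonal curvature estimate
    (an Adam-style second-moment proxy; 'beta' is an assumed decay)."""
    return beta * h_ema + (1.0 - beta) * h_new

def select_coreset(grads, hess_diag, k, eps=1e-8):
    """Pick k examples and weights so that their preconditioned gradients
    cover the full set's (greedy facility-location heuristic)."""
    # Precondition each per-example gradient with the diagonal curvature proxy.
    pre = grads / (hess_diag + eps)                       # (n, d)
    n = pre.shape[0]
    # Pairwise distances between preconditioned gradients.
    dists = np.linalg.norm(pre[:, None, :] - pre[None, :, :], axis=-1)

    selected = []
    best = np.full(n, np.inf)  # distance of each point to its nearest medoid
    for _ in range(k):
        # Coverage cost of the coreset if candidate j were added.
        cover = np.minimum(best[None, :], dists).sum(axis=1)
        cover[selected] = np.inf                          # don't re-pick
        j = int(np.argmin(cover))
        selected.append(j)
        best = np.minimum(best, dists[j])

    # Weight each chosen example by the number of points it covers,
    # so the weighted coreset gradient approximates the full one.
    assign = np.argmin(dists[selected], axis=0)
    weights = np.bincount(assign, minlength=k).astype(float)
    return np.array(selected), weights

# Toy usage with random stand-ins for per-example gradients.
rng = np.random.default_rng(0)
g = rng.normal(size=(200, 10))
h = update_hessian_ema(np.ones(10), g.var(axis=0))
idx, w = select_coreset(g, h, k=10)
```

Under this reading, training then proceeds on only the `k` selected examples, with each example's loss scaled by its weight, and the coreset is re-selected periodically as the curvature estimate evolves.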