Out-of-Core GPU Gradient Boosting

Rong Ou
2020, arXiv preprint
GPU-based algorithms have greatly accelerated many machine learning methods; however, GPU memory is typically smaller than main memory, limiting the size of training data. In this paper, we describe an out-of-core GPU gradient boosting algorithm implemented in the XGBoost library. We show that much larger datasets can fit on a given GPU, without degrading model accuracy or training time. To the best of our knowledge, this is the first out-of-core GPU implementation of gradient boosting. Similar approaches can be applied to other machine learning algorithms.
arXiv:2005.09148v1
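
Because the algorithm is implemented in the XGBoost library, the following is a minimal sketch of how out-of-core GPU training might be invoked from the Python interface. The file path train.libsvm, the cache name dtrain.cache, and the parameter values are illustrative assumptions, not taken from the paper, and the exact external-memory URI syntax varies across XGBoost versions.

    import xgboost as xgb

    # Appending "#dtrain.cache" to the path asks XGBoost to stream the data
    # from disk through an on-disk cache rather than load it all into memory.
    dtrain = xgb.DMatrix("train.libsvm#dtrain.cache")  # hypothetical file

    params = {
        "objective": "binary:logistic",
        "tree_method": "gpu_hist",            # GPU histogram tree construction
        "subsample": 0.5,                     # train each tree on half the rows
        "sampling_method": "gradient_based",  # GPU-only gradient-based sampling
    }

    booster = xgb.train(params, dtrain, num_boost_round=100)
    booster.save_model("model.json")

In this sketch, the external-memory cache handles data that exceeds main memory, while gradient-based subsampling reduces how many rows must reside on the GPU at any one time.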