A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2020; you can also visit the original URL.
The file type is application/pdf.
Fast Matrix Square Roots with Applications to Gaussian Processes and Bayesian Optimization
[article] · 2020 · arXiv pre-print
Matrix square roots and their inverses arise frequently in machine learning, e.g., when sampling from high-dimensional Gaussians 𝒩(0, 𝐊) or whitening a vector 𝐛 against a covariance matrix 𝐊. While existing methods typically require O(N^3) computation, we introduce a highly efficient quadratic-time algorithm for computing 𝐊^1/2𝐛, 𝐊^-1/2𝐛, and their derivatives through matrix-vector multiplications (MVMs). Our method combines Krylov subspace methods with a rational approximation and typically […]
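The abstract only names the ingredients, so the following is a hedged sketch rather than the paper's actual algorithm. It illustrates the rational-approximation idea behind computing 𝐊^1/2𝐛: using the integral identity K^{1/2} = (2/π) ∫₀^∞ K (K + t²I)⁻¹ dt, discretized by Gauss–Legendre quadrature after the substitution t = tan θ. For clarity the shifted systems below are solved densely with `np.linalg.solve`; in the matrix-free setting the paper describes, each shifted solve would instead be a Krylov (e.g., shifted conjugate-gradient) iteration that touches 𝐊 only through MVMs. The function name `sqrt_mv` and all parameters are illustrative, not from the paper.

```python
import numpy as np

def sqrt_mv(K, b, n_quad=40):
    """Approximate K^{1/2} @ b via a rational (quadrature) approximation.

    Uses K^{1/2} b = (2/pi) * K * integral_0^inf (K + t^2 I)^{-1} b dt,
    with t = tan(theta) and Gauss-Legendre nodes on theta in [0, pi/2].
    (Illustrative sketch; not the paper's exact scheme.)
    """
    N = K.shape[0]
    # Gauss-Legendre nodes/weights on [-1, 1], mapped to [0, pi/2]
    x, w = np.polynomial.legendre.leggauss(n_quad)
    theta = (np.pi / 4) * (x + 1)
    w = w * (np.pi / 4)
    acc = np.zeros(N)
    for th, wt in zip(theta, w):
        t2 = np.tan(th) ** 2
        # Shifted solve (K + t^2 I)^{-1} b; the (1 + t2) factor is the
        # Jacobian sec^2(theta) of the substitution t = tan(theta).
        # A matrix-free variant would run shifted CG here instead.
        acc += wt * (1 + t2) * np.linalg.solve(K + t2 * np.eye(N), b)
    return (2 / np.pi) * (K @ acc)

rng = np.random.default_rng(0)
N = 50
A = rng.standard_normal((N, N))
K = A @ A.T / N + 0.5 * np.eye(N)  # SPD, spectrum bounded away from 0
b = rng.standard_normal(N)

approx = sqrt_mv(K, b)
# Dense reference: K^{1/2} b via eigendecomposition
evals, evecs = np.linalg.eigh(K)
exact = evecs @ (np.sqrt(evals) * (evecs.T @ b))
print(np.linalg.norm(approx - exact) / np.linalg.norm(exact))
```

Because the transformed integrand is analytic when 𝐊's spectrum is bounded away from zero, the quadrature error decays geometrically in the number of nodes, which is why a modest number of shifted solves suffices.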
arXiv:2006.11267v2
fatcat:twbpjfca2nc5npww4q442itdoq