A Fast Algorithm for Updating and Downsizing the Dominant Kernel Principal Components

Nicola Mastronardi, Eugene E. Tyrtyshnikov, Paul Van Dooren
2010 SIAM Journal on Matrix Analysis and Applications  
Many important kernel methods in machine learning, such as kernel principal component analysis, feature approximation, denoising, compression, and prediction, require the computation of the dominant set of eigenvectors of the symmetric kernel Gram matrix. Recently, an efficient incremental approach was presented for the fast calculation of the dominant kernel eigenbasis. In this paper we propose faster algorithms for incrementally updating and downsizing the dominant kernel eigenbasis. These methods are well suited for large-scale problems since they are efficient in terms of both complexity and data management.
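For context, the quantity the paper is concerned with can be sketched in a few lines: given a symmetric kernel Gram matrix, the dominant kernel eigenbasis is the set of eigenvectors belonging to the largest eigenvalues. The snippet below computes it by a full eigendecomposition, which is exactly the O(n^3) cost that incremental updating and downsizing schemes aim to avoid; it is an illustration only, not the paper's algorithm, and the RBF kernel, `gamma`, and all function names are assumptions.

```python
import numpy as np

def rbf_gram(X, gamma=1.0):
    # Symmetric kernel Gram matrix K[i, j] = exp(-gamma * ||x_i - x_j||^2).
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * (X @ X.T)
    return np.exp(-gamma * np.maximum(d2, 0.0))

def dominant_eigenbasis(K, k):
    # Full eigendecomposition of the symmetric matrix K (O(n^3) work);
    # keep the k eigenpairs with the largest eigenvalues, in descending order.
    w, V = np.linalg.eigh(K)  # eigenvalues returned in ascending order
    return w[-k:][::-1], V[:, -k:][:, ::-1]

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 3))
K = rbf_gram(X, gamma=0.5)
vals, vecs = dominant_eigenbasis(K, k=5)
```

Appending a new data point enlarges K by one row and column, and recomputing the basis from scratch repeats the full O(n^3) decomposition; the point of the incremental updating and downsizing algorithms is to revise `vals` and `vecs` at a much lower cost.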
doi:10.1137/090774422