A fast algorithm for updating and downsizing the dominant kernel principal components

Abstract
Many important kernel methods in machine learning, such as kernel principal component analysis, feature approximation, denoising, compression, and prediction, require the computation of the dominant set of eigenvectors of the symmetric kernel Gram matrix. Recently, an efficient incremental approach was presented for the fast calculation of the dominant kernel eigenbasis. In this manuscript we propose faster algorithms for incrementally updating and downsizing the dominant kernel eigenbasis. These methods are well suited to large-scale problems, since they are efficient in terms of both complexity and data management.
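As a point of reference for the abstract, the sketch below shows the standard batch computation of the dominant kernel eigenbasis that the paper's incremental methods are designed to accelerate; it is not the authors' updating/downsizing algorithm. The RBF kernel, the parameter gamma, and the helper names rbf_kernel and dominant_kernel_eigenbasis are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    """Gaussian (RBF) kernel Gram matrix for the rows of X (assumed kernel choice)."""
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def dominant_kernel_eigenbasis(K, k):
    """Return the k dominant eigenvalues/eigenvectors of the symmetric
    kernel Gram matrix K, computed in batch (the baseline the paper improves on)."""
    n = K.shape[0]
    # Centre the Gram matrix in feature space, as in standard kernel PCA.
    one = np.ones((n, n)) / n
    Kc = K - one @ K - K @ one + one @ K @ one
    # eigh returns eigenvalues in ascending order; keep the largest k.
    w, V = np.linalg.eigh(Kc)
    idx = np.argsort(w)[::-1][:k]
    return w[idx], V[:, idx]

# Example: 200 points in 2-D, keep the 5 dominant kernel principal components.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 2))
K = rbf_kernel(X, gamma=0.5)
eigvals, eigvecs = dominant_kernel_eigenbasis(K, k=5)
print(eigvals)
```

The batch route costs a full eigendecomposition each time data arrive or are discarded; the paper's contribution is to update and downsize the dominant eigenbasis incrementally instead.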
Year
2010
Publication type
Other Authors
Mastronardi N., Tyrtyshnikov E., Van Dooren P.
Publisher
Society for Industrial and Applied Mathematics
Journal
SIAM Journal on Matrix Analysis and Applications (Print)