A new way of PCA: integrated-squared-error and EM algorithms
2004 IEEE International Conference on Acoustics, Speech, and Signal Processing
We consider the properties of the integrated-squared-error in the framework of a coupled generative model, giving efficient EM algorithms for integrated-squared-error minimization. ...
In this paper we introduce a new alternative error, the so-called integrated-squared-error, whose minimization determines the exact principal axes (without rotational ambiguity) of a set of observed ...
Our proposed EM algorithms, EM-ePCA and its limiting case, show a slightly different convergence behavior in terms of the squared error J_q only, compared to the EM-PCA algorithm. ...
doi:10.1109/icassp.2004.1327226
dblp:conf/icassp/AhnCO04
fatcat:7dvhac6p65c5fdra37r3mwdx7q
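The EM-PCA baseline mentioned in the snippets is Roweis' EM algorithm for PCA. A minimal sketch of its zero-noise iteration (the function name and structure are illustrative, not the paper's ePCA variant):

```python
import numpy as np

def em_pca(Y, q, n_iter=100, seed=0):
    """Roweis-style EM for PCA (zero-noise limit).

    Y: (d, n) data matrix, assumed centered; q: number of components.
    Returns W (d, q) whose column span converges to the principal subspace.
    """
    rng = np.random.default_rng(seed)
    d, n = Y.shape
    W = rng.standard_normal((d, q))
    for _ in range(n_iter):
        # E-step: latent coordinates X = (W^T W)^{-1} W^T Y
        X = np.linalg.solve(W.T @ W, W.T @ Y)
        # M-step: W = Y X^T (X X^T)^{-1}
        W = Y @ X.T @ np.linalg.inv(X @ X.T)
    return W
```

Plain EM-PCA converges to the principal subspace only up to an arbitrary rotation of the basis; removing exactly that ambiguity is what the integrated-squared-error formulation above claims.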
Simple Exponential Family PCA
2013
IEEE Transactions on Neural Networks and Learning Systems
Bayesian principal component analysis (BPCA), a probabilistic reformulation of PCA with Bayesian model selection, is a systematic approach to determining the number of essential principal components (PCs ...
In this paper, we propose simple exponential family PCA (SePCA), a generalised family of probabilistic principal component analysers. ...
Thus, the algorithms of (Csiszár and Tusnády, 1984) and EPCA, which are both deterministic, have a new interpretation as solving for the MAP estimate of a probabilistic model. ...
doi:10.1109/tnnls.2012.2234134
pmid:24808320
fatcat:fpzav35ebzhodlcnaks3j2dzui
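For readers unfamiliar with exponential family PCA, a toy Bernoulli instance sketches the core idea of fitting a low-rank natural-parameter matrix by maximum likelihood. This is an illustrative baseline, not the paper's SePCA; the factorization and gradient-ascent scheme here are assumptions of the sketch:

```python
import numpy as np

def bernoulli_pca(X, q, lr=0.05, n_iter=500, seed=0):
    """Toy exponential family PCA for binary data X (n, d).

    Models natural parameters Theta = A @ B.T with A (n, q), B (d, q)
    and maximizes the Bernoulli log-likelihood by gradient ascent.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    A = 0.01 * rng.standard_normal((n, q))
    B = 0.01 * rng.standard_normal((d, q))
    for _ in range(n_iter):
        P = 1.0 / (1.0 + np.exp(-(A @ B.T)))   # mean parameters sigma(Theta)
        G = X - P                              # d log-likelihood / d Theta
        A, B = A + lr * (G @ B), B + lr * (G.T @ A)
    return A, B
```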
Variational Bayesian functional PCA
2008
Computational Statistics & Data Analysis
Inference, including estimation, error assessment and model choice, particularly the choice of the number of eigenfunctions and their degree of smoothness, is derived from a variational approximation of ...
A Bayesian approach to analyzing the modes of variation in a set of curves is suggested. It is based on a generative model, thus allowing for noisy and sparse observations of curves. ...
Acknowledgement The author would like to thank two anonymous referees for valuable hints and comments which helped to considerably improve an earlier draft of this paper. ...
doi:10.1016/j.csda.2008.09.015
fatcat:q23tn3jlbbeltfuxerrc3qf7eq
Shapley Values of Reconstruction Errors of PCA for Explaining Anomaly Detection
[article]
2020
arXiv
pre-print
We present a method to compute the Shapley values of reconstruction errors of principal component analysis (PCA), which is particularly useful in explaining the results of anomaly detection based on PCA ...
We utilize the probabilistic view of PCA, particularly its conditional distribution, to exactly compute a value function for the Shapley values. ...
the EM algorithm, retraining may become inefficient for a moderate d ...
arXiv:1909.03495v2
fatcat:vufu3e7pvra3tdascuvgofsuwu
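The Shapley computation the abstract describes attributes a reconstruction error to individual features. A generic exact enumerator is a useful reference point; `value(S)` is a placeholder for the paper's conditional-Gaussian value function, which is not reproduced here:

```python
import itertools
from math import factorial
import numpy as np

def shapley_values(value, d):
    """Exact Shapley values for d features and a value function
    value(S) defined on frozensets S of feature indices.

    Exponential in d, so only practical for moderate d.
    """
    phi = np.zeros(d)
    for i in range(d):
        rest = [j for j in range(d) if j != i]
        for r in range(len(rest) + 1):
            for S in itertools.combinations(rest, r):
                # Shapley weight |S|! (d - |S| - 1)! / d!
                w = factorial(len(S)) * factorial(d - len(S) - 1) / factorial(d)
                phi[i] += w * (value(frozenset(S) | {i}) - value(frozenset(S)))
    return phi
```

The paper's point, per the snippet about retraining becoming "inefficient for a moderate d", is that the PPCA conditional distribution gives an exactly computable value function without per-subset retraining.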
A Novel M-Estimator for Robust PCA
[article]
2014
arXiv
pre-print
When replacing the sum of terms in the convex energy function (that we minimize) with the sum of squares of terms, we find that the new minimizer is a scaled version of the inverse sample covariance ...
We compare our method with many other algorithms for robust PCA on synthetic and real data sets and demonstrate state-of-the-art speed and accuracy. ...
GL thanks Emmanuel Candès for inviting him to visit Stanford University in May 2010 and for his constructive criticism on the lack of a theoretically guaranteed algorithm for the ℓ_1 subspace recovery ...
arXiv:1112.4863v4
fatcat:hpee6ljbtvenhgd65ikqormru4
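The convex energy in question is, up to details, F(Q) = Σ_i ‖Q x_i‖ minimized over symmetric Q with unit trace; replacing the norms by their squares yields a scaled inverse sample covariance, as the snippet notes. A hedged IRLS sketch of the minimization (the paper's exact update, normalization, and convergence safeguards may differ):

```python
import numpy as np

def m_estimator_irls(X, n_iter=50, eps=1e-10):
    """IRLS sketch for the convex M-estimator min_{tr(Q)=1} sum_i ||Q x_i||.

    X: (n, d) data. Returns Q; the robust subspace is spanned by the
    eigenvectors of Q with the smallest eigenvalues.
    """
    n, d = X.shape
    Q = np.eye(d) / d
    for _ in range(n_iter):
        # weights 1 / ||Q x_i|| (Q symmetric, so rows of X @ Q are (Q x_i)^T)
        w = 1.0 / np.maximum(np.linalg.norm(X @ Q, axis=1), eps)
        C = (X * w[:, None]).T @ X      # sum_i x_i x_i^T / ||Q x_i||
        Q = np.linalg.inv(C)
        Q /= np.trace(Q)                # renormalize to unit trace
    return Q
```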
Bayesian Variable Selection for Globally Sparse Probabilistic PCA
[article]
2016
arXiv
pre-print
To this end, using Roweis' probabilistic interpretation of PCA and a Gaussian prior on the loading matrix, we provide the first exact computation of the marginal likelihood of a Bayesian PCA model. ...
To avoid the drawbacks of discrete model selection, a simple relaxation of this framework is presented. It allows finding a path of models using a variational expectation-maximization algorithm. ...
., 1) we can prove (6) by invoking the independence between x_v and x_v̄, similarly to the proof of Theorem 1. ...
arXiv:1605.05918v2
fatcat:sckpgvdblnbpviisplroaxgmma
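The starting point is Roweis' / Tipping-Bishop probabilistic PCA, whose marginal over the latent coordinates is Gaussian in closed form. A sketch of that first marginalization (the paper's contribution, the further exact integration over the loading matrix under a Gaussian prior, is not attempted here):

```python
import numpy as np
from scipy.stats import multivariate_normal

def ppca_log_marginal(Y, W, sigma2):
    """Log marginal likelihood of centered data under probabilistic PCA:
    y_n ~ N(0, W W^T + sigma^2 I), latent coordinates integrated out.

    Y: (n, d) rows are observations; W: (d, q) loading matrix.
    """
    d = Y.shape[1]
    C = W @ W.T + sigma2 * np.eye(d)   # marginal covariance
    return multivariate_normal(mean=np.zeros(d), cov=C).logpdf(Y).sum()
```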
Learning Eigenfunctions Links Spectral Embedding and Kernel PCA
2004
Neural Computation
In this paper, we show a direct relation between spectral embedding methods and kernel PCA, and how both are special cases of a more general learning problem, that of learning the principal eigenfunctions ...
Experiments with LLE, Isomap, spectral clustering and MDS show that this out-of-sample embedding formula generalizes well, with a level of error comparable to the effect of small perturbations of the training ...
funding organizations: NSERC, MITACS, IRIS, and the Canada Research Chairs. ...
doi:10.1162/0899766041732396
pmid:15333211
fatcat:uzc4rzlorzfczdclqyeq7hohxy
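The out-of-sample formula referred to is of Nyström type: a new point is embedded through the eigenvectors of the training kernel matrix. A sketch, with the caveat that normalization conventions differ across LLE, Isomap, MDS, and spectral clustering:

```python
import numpy as np

def nystrom_embedding(K_train, k_new, top):
    """Out-of-sample spectral embedding via the Nystrom formula.

    K_train: (n, n) centered kernel matrix on training points.
    k_new:   (n,) centered kernel values K(x, x_i) for a new point x.
    top:     number of embedding dimensions.
    Returns e with e[k] = (1/lambda_k) * sum_i v_k[i] * k_new[i].
    """
    lam, V = np.linalg.eigh(K_train)
    lam, V = lam[::-1][:top], V[:, ::-1][:, :top]   # leading eigenpairs
    return (V.T @ k_new) / lam
```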
Steerable ePCA: Rotationally Invariant Exponential Family PCA
[article]
2019
arXiv
pre-print
The second is steerable PCA, a fast and accurate procedure for including all planar rotations in PCA. ...
Our procedure, steerable ePCA, combines in a novel way two recently introduced innovations. ...
ZZ was partially supported by National Center for Supercomputing Applications Faculty Fellowship and University of Illinois at Urbana-Champaign College of Engineering Strategic Research Initiative. ...
arXiv:1812.08789v3
fatcat:ybqmxeemofbctcenuqajdsvlfq
Multiple Tracking of Moving Objects with Kalman Filtering and PCA-GMM Method
2013
Intelligent Information Management
The combined method, PCA-GMM-KF, attempts to track multiple moving objects, estimating the size and position of the objects along the sequence of their images in dynamic scenes. ...
In this article we propose to combine an integrated method, PCA-GMM, which produces improved segmentation compared to the conventional GMM, with Kalman filtering (KF). ...
Vachon surveyed background modeling using mixtures of Gaussians for object detection [5]. F. Zhu and K. Fujimura demonstrated a face tracking method using a GMM and the EM algorithm [6]. P. ...
doi:10.4236/iim.2013.52006
fatcat:xp6gqw4acbh5hi6dvfgdmjpycm
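The KF component is a standard Kalman filter driven by measurements from the PCA-GMM segmentation. A generic predict/update step (a sketch of the filtering machinery, not the paper's exact state model):

```python
import numpy as np

def kalman_step(x, P, z, F, H, Q, R):
    """One predict/update cycle of a Kalman filter.

    x: state estimate; P: state covariance; z: measurement
    (e.g. centroid and bounding-box size from the segmentation);
    F: state transition; H: measurement model; Q, R: noise covariances.
    """
    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Update
    S = H @ P @ H.T + R                  # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
    x = x + K @ (z - H @ x)
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P
```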
Bayesian variable selection for globally sparse probabilistic PCA
2018
Electronic Journal of Statistics
., 1) we can prove (6) by invoking the independence between x_v and x_v̄, similarly to the proof of Theorem 1. ...
Since zero is a constant, this convergence also happens to be in probability (Van der Vaart, 2000, p. 10). ...
First, as pointed out by Wipf and Nagarajan (2008), convergence of EM algorithms is extremely slow in the case of ARD models. ...
doi:10.1214/18-ejs1450
fatcat:imv2bdsolbfpba3vh2h3yj4oty
On Segmentation of Moving Objects by Integrating PCA Method with the Adaptive Background Model
2012
Journal of Signal and Information Processing
A modified algorithm for the adaptive background model is proposed by linking the Gaussian mixture model with principal component analysis (PCA). ...
We report updates to both the parameters of the modified method and those of the Gaussian mixture model. The obtained results show that the integrated method performs comparatively better. ...
Sribivasa proposed a new image segmentation algorithm using the finite mixture of doubly truncated bivariate Gaussian distributions, integrated with hierarchical clustering [16]. ...
doi:10.4236/jsip.2012.33051
fatcat:fye6h4y6bvclvbfdpgb5axuapy
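The adaptive background model being modified is the classic per-pixel Gaussian mixture of Stauffer and Grimson. A simplified single-pixel update sketch (constants and the replacement rule are illustrative assumptions; the paper's PCA-linked update differs):

```python
import numpy as np

def gmm_pixel_update(x, w, mu, var, alpha=0.01, match_thresh=2.5):
    """Online update of a per-pixel Gaussian mixture background model.

    x: new pixel intensity; w, mu, var: length-K arrays of component
    weights, means, and variances.
    """
    dist = np.abs(x - mu) / np.sqrt(var)
    k = np.argmin(dist)
    w *= (1.0 - alpha)                   # decay all weights
    if dist[k] < match_thresh:           # matched component: adapt it
        w[k] += alpha
        rho = alpha                      # simplified learning rate
        mu[k] += rho * (x - mu[k])
        var[k] += rho * ((x - mu[k]) ** 2 - var[k])
    else:                                # no match: replace weakest component
        j = np.argmin(w)
        mu[j], var[j], w[j] = x, 900.0, alpha   # high initial variance
    w /= w.sum()
    return w, mu, var
```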
A Pseudo-Bayesian Algorithm for Robust PCA
2016
Neural Information Processing Systems
Commonly used in many applications, robust PCA represents an algorithmic attempt to reduce the sensitivity of classical PCA to outliers. ...
The basic idea is to learn a decomposition of some data matrix of interest into low rank and sparse components, the latter representing unwanted outliers. ...
The basic idea is to marginalize out the unknown Z and E and solve max_{Ψ,Γ} ∫ p(Y|Z, E) p(Z|Ψ) p(E|Γ) dZ dE (11) using an EM-like algorithm. ...
dblp:conf/nips/OhMKW16
fatcat:wwy6xivek5gwnahi5g6wqcjdre
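Equation (11) marginalizes the low-rank and sparse components under Gaussian priors and maximizes over the hyperparameters (type-II maximum likelihood). A deliberately simplified, column-wise sketch of that marginal objective; the paper's Ψ and Γ are structured covariances, whereas treating them as a dense matrix plus a diagonal here is an assumption of the sketch:

```python
import numpy as np

def marginal_loglik(Y, Psi, gamma, lam):
    """Type-II log-likelihood for a simplified robust-PCA model: each
    column y = z + e + n with z ~ N(0, Psi), e ~ N(0, diag(gamma)),
    n ~ N(0, lam I). Marginalizing z and e gives
    y ~ N(0, Psi + diag(gamma) + lam I).
    """
    d, n = Y.shape
    Sigma = Psi + np.diag(gamma) + lam * np.eye(d)
    _, logdet = np.linalg.slogdet(Sigma)
    # sum_n y_n^T Sigma^{-1} y_n = tr(Sigma^{-1} Y Y^T)
    quad = np.trace(np.linalg.solve(Sigma, Y @ Y.T))
    return -0.5 * (n * logdet + quad + n * d * np.log(2.0 * np.pi))
```

An EM-like algorithm then alternates between the Gaussian posterior moments of (Z, E) and updates of Ψ and Γ that increase this objective.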
Scalable probabilistic PCA for large-scale genetic variation data
2020
PLoS Genetics
Principal component analysis (PCA) is a key tool for understanding population structure and controlling for population stratification in genome-wide association studies (GWAS). ...
With the advent of large-scale datasets of genetic variation, there is a need for methods that can compute principal components (PCs) with scalable computational and memory requirements. ...
Acknowledgments We would like to thank Bogdan Pasaniuc, members of his lab, and other members of the Sankararaman lab for advice and comments on this project. ...
doi:10.1371/journal.pgen.1008773
pmid:32469896
fatcat:luum2i7e3ze6hl4byjkgyqwrae
An ℓ_p theory of PCA and spectral clustering
[article]
2022
arXiv
pre-print
Principal Component Analysis (PCA) is a powerful tool in statistics and machine learning. ...
In this paper, we first develop an ℓ_p perturbation theory for a hollowed version of PCA in Hilbert spaces which provably improves upon the vanilla PCA in the presence of heteroscedastic noises. ...
KW was supported by a startup fund from Columbia University and the NIH grant 2R01-GM072611-15 when he was a student at Princeton University. ...
arXiv:2006.14062v3
fatcat:4vvd3gtl7vcnrj2x5at7fuxvxe
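"Hollowing" refers to zeroing the diagonal of the Gram matrix before the eigendecomposition, since the diagonal absorbs the heteroscedastic noise energy. A minimal sketch of that preprocessing (the paper works in Hilbert spaces with finer corrections):

```python
import numpy as np

def hollowed_pca(X, k):
    """PCA on the 'hollowed' Gram matrix: zero the diagonal of
    G = X X^T before the eigendecomposition.

    X: (n, d) rows are samples; returns the (n, k) leading eigenvectors.
    """
    G = X @ X.T
    np.fill_diagonal(G, 0.0)             # the "hollowing" step
    lam, V = np.linalg.eigh(G)
    return V[:, ::-1][:, :k]             # top-k eigenvectors
```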
Pseudo-Bayesian Robust PCA: Algorithms and Analyses
[article]
2016
arXiv
pre-print
Commonly used in computer vision and other applications, robust PCA represents an algorithmic attempt to reduce the sensitivity of classical PCA to outliers. ...
The basic idea is to learn a decomposition of some data matrix of interest into low rank and sparse components, the latter representing unwanted outliers. ...
The basic idea is to marginalize out the unknown Z and E and solve max_{Ψ,Γ} ∫ p(Y|Z, E) p(Z|Ψ) p(E|Γ) dZ dE (11) using an EM-like algorithm. ...
arXiv:1512.02188v2
fatcat:i6lf7fg5qvgmpbqztywf7u65jy
Showing results 1–15 of 5,575 results