4,928 Hits in 4.7 sec

Reconstruction of a Generalized Joint Sparsity Model using Principal Component Analysis

Alireza Makhzani, Shahrokh Valaee
2011 2011 4th IEEE International Workshop on Computational Advances in Multi-Sensor Adaptive Processing (CAMSAP)  
In this paper, we define a new Joint Sparsity Model (JSM) and use Principal Component Analysis followed by Minimum Description Length and Compressive Sensing to reconstruct spatially and temporally correlated  ...  We use the fact that the common component generates a common subspace that can be found using the principal component analysis and the minimum description length.  ...  In this work, motivated by the signal models proposed in [5] , we propose a more general model called Generalized Joint Sparsity Model (G-JSM) that can model more practical cases.  ... 
doi:10.1109/camsap.2011.6136001 dblp:conf/camsap/MakhzaniV11 fatcat:vg6rf722xjgcxahqacf6jbwns4
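The abstract above sketches a pipeline: stack the correlated measurements, use PCA to expose the shared subspace generated by the common component, and use a minimum description length criterion to decide how many principal directions belong to it. Below is a minimal numpy sketch of that idea under stated assumptions: the Wax-Kailath-style MDL score and all names are illustrative, not the paper's exact G-JSM formulation, and the compressive sensing reconstruction step is omitted.

import numpy as np

def common_subspace(X):
    """Estimate the dimension and a basis of a shared (common) subspace.

    X : (N, J) array; each column is one sensor's length-N signal.
    Returns (k, U_k), where the columns of U_k span the estimated subspace.
    Simplified PCA + MDL sketch (Wax-Kailath-style score), not the G-JSM
    algorithm itself.
    """
    N, J = X.shape
    C = X.T @ X / N                          # J x J sample covariance across sensors
    lam = np.sort(np.linalg.eigvalsh(C))[::-1]
    lam = np.maximum(lam, 1e-12)
    best_k, best_score = 0, np.inf
    for k in range(J):                       # candidate common-subspace dimensions
        tail = lam[k:]
        geo = np.exp(np.mean(np.log(tail)))  # geometric mean of the remaining eigenvalues
        ari = np.mean(tail)                  # arithmetic mean of the remaining eigenvalues
        score = -N * (J - k) * np.log(geo / ari) + 0.5 * k * (2 * J - k) * np.log(N)
        if score < best_score:
            best_k, best_score = k, score
    U, _, _ = np.linalg.svd(X, full_matrices=False)
    return best_k, U[:, :best_k]             # top-k left singular vectors span the subspace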

Learning Theory and Approximation

Kurt Jetter, Steve Smale, Ding-Xuan Zhou
2012 Oberwolfach Reports  
This workshop, the second one of this type at the MFO, has concentrated on the following recent topics: Learning of manifolds and the geometry of data; sparsity and dimension reduction; error analysis  ...  and algorithmic aspects, including kernel based methods for regression and classification; application of multiscale aspects and of refinement algorithms to learning. Mathematics Subject Classification (2000): 68Q32, 41A35, 41A63, 62Jxx  ...  Lim gave a talk on principal components of cumulants, discussing the geometry underlying cumulants and examining two ways to their principal component analysis, decomposing a homogeneous form into a linear  ... 
doi:10.4171/owr/2012/31 fatcat:6obnt34cizfvrmpsdlr4b4vnva

Learning Theory and Approximation

Kurt Jetter, Steve Smale, Ding-Xuan Zhou
2008 Oberwolfach Reports  
This workshop, the second one of this type at the MFO, has concentrated on the following recent topics: Learning of manifolds and the geometry of data; sparsity and dimension reduction; error analysis  ...  and algorithmic aspects, including kernel based methods for regression and classification; application of multiscale aspects and of refinement algorithms to learning. Mathematics Subject Classification (2000): 68Q32, 41A35, 41A63, 62Jxx  ...  Lim gave a talk on principal components of cumulants, discussing the geometry underlying cumulants and examining two ways to their principal component analysis, decomposing a homogeneous form into a linear  ... 
doi:10.4171/owr/2008/30 fatcat:wxkujx44g5aebor4ojsv4dbjbi

Detecting Burnscar from Hyperspectral Imagery via Sparse Representation with Low-Rank Interference [article]

Minh Dao, Xiang Xiang, Bulent Ayhan, Chiman Kwan, Trac D. Tran
2016 arXiv   pre-print
burnscar area in the low-rank component output of the first step.  ...  In this paper, we propose a burnscar detection model for hyperspectral imaging (HSI) data.  ...  the cloud, and then a spatial-temporal joint-sparsity model is applied to detect the burnscar target, using the low-rank components of the RPCA step as inputs.  ... 
arXiv:1605.00287v2 fatcat:yn3qcv3t3res3jcg5bwpb3agcy

Accelerated MR parameter mapping with low-rank and sparsity constraints

Bo Zhao, Wenmiao Lu, T. Kevin Hitchens, Fan Lam, Chien Ho, Zhi-Pei Liang
2014 Magnetic Resonance in Medicine  
Additionally, the proposed method was compared with two state-of-the-art methods that only use a single low-rank or joint sparsity constraint.  ...  Theory and Methods: A new constrained reconstruction method based on low-rank and sparsity constraints is proposed to accelerate MR parameter mapping.  ...  Zhao would like to thank Bryan Clifford for helping with the skull removal of the human brain data set.  ... 
doi:10.1002/mrm.25421 pmid:25163720 pmcid:PMC4344441 fatcat:crwpd5efojckrfoki5tiuygmui
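The method described above combines a low-rank constraint on the image-series matrix (voxels by parametric frames) with a sparsity constraint. The numpy sketch below shows the generic shape of such a reconstruction, alternating a data-consistency gradient step, a sparsity shrinkage, and a rank truncation. It is a heuristic illustration under stated assumptions: direct undersampling stands in for the k-space encoding operator, the sparsifying transform is the identity, and the interface and parameters are hypothetical, not the authors' algorithm.

import numpy as np

def truncate_rank(X, r):
    """Project X onto matrices of rank at most r (truncated SVD)."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U[:, :r] @ np.diag(s[:r]) @ Vt[:r]

def soft(X, tau):
    """Entrywise soft threshold (prox of tau * l1-norm)."""
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def reconstruct(y, mask, shape, rank=5, lam=0.05, step=1.0, n_iter=200):
    """Recover a voxels-by-frames matrix X from samples y = X[mask] under a
    rank constraint and a sparsity penalty.  Hypothetical interface; the
    sequential projections below are a simplification, not the paper's
    splitting scheme."""
    X = np.zeros(shape)
    for _ in range(n_iter):
        grad = np.zeros(shape)
        grad[mask] = X[mask] - y               # gradient of 0.5 * ||X[mask] - y||^2
        X = soft(X - step * grad, step * lam)  # sparsity shrinkage
        X = truncate_rank(X, rank)             # enforce the low-rank constraint
    return X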

Structured Sparsity through Convex Optimization

Francis Bach, Rodolphe Jenatton, Julien Mairal, Guillaume Obozinski
2012 Statistical Science  
(Amit et al., 2007; Harchaoui et al., 2012) Sparse principal component analysis • Given a matrix M ∈ R^(n×p), the rank of M is the minimum size m over all factorizations M = UV⊤, with U ∈ R^(n×m) and V ∈  ...  are equivalent. Sparse principal component analysis • Given data X = (x_1⊤, . . . , x_n⊤) ∈ R^(p×n), two views of PCA: - Analysis view: find the projection d ∈ R^p of maximum variance (with deflation  ... 
doi:10.1214/12-sts394 fatcat:vo3ema7cczhs7jexykylah5qv4
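The snippet above recalls the "analysis view" of PCA: find the direction of maximum projected variance, then deflate and repeat. The short numpy sketch below implements that plain, non-sparse view as a reference point; the sparse variants discussed in the paper additionally constrain the cardinality of each direction, which this sketch does not do.

import numpy as np

def pca_analysis_view(X, n_components=2):
    """Plain PCA in the analysis view: repeatedly find the unit direction d
    maximizing the variance of the projections d^T x_i, then deflate.

    X : (p, n) data matrix with one observation per column, as in the snippet.
    Returns an (n_components, p) array of directions.
    """
    p, n = X.shape
    R = X - X.mean(axis=1, keepdims=True)  # center the observations
    directions = []
    for _ in range(n_components):
        C = R @ R.T / n                    # p x p covariance of the current residual
        _, V = np.linalg.eigh(C)
        d = V[:, -1]                       # eigenvector of the largest eigenvalue
        directions.append(d)
        R = R - np.outer(d, d) @ R         # deflation: remove the captured direction
    return np.array(directions)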

Eigenvectors from Eigenvalues Sparse Principal Component Analysis (EESPCA) [article]

H. Robert Frost
2021 arXiv   pre-print
We present a novel technique for sparse principal component analysis.  ...  We explore two versions of the EESPCA method: a version that uses a fixed threshold for inducing sparsity and a version that selects the threshold via cross-validation.  ...  Acknowledgments This work was funded by National Institutes of Health grants K01LM012426, R21CA253408, P20GM130454 and P30CA023108.  ... 
arXiv:2006.01924v3 fatcat:qikjgxi4cfesxo3awmcz4l54xy
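As context for the "fixed threshold for inducing sparsity" mentioned above, here is a generic thresholded-loadings construction in numpy: compute the leading eigenvector of the sample covariance, zero out small loadings, and renormalize. This only illustrates the fixed-threshold idea; the function name and cutoff are hypothetical, and it is not the EESPCA estimator itself.

import numpy as np

def thresholded_leading_pc(X, threshold=0.1):
    """Sparse leading principal component via hard-thresholded loadings.

    X : (n, p) data matrix, one observation per row.  'threshold' is the
    fixed cutoff below which loadings are zeroed.
    """
    Xc = X - X.mean(axis=0)
    C = Xc.T @ Xc / (Xc.shape[0] - 1)            # p x p sample covariance
    _, V = np.linalg.eigh(C)
    v = V[:, -1]                                 # dense leading eigenvector
    v_sparse = np.where(np.abs(v) >= threshold, v, 0.0)
    norm = np.linalg.norm(v_sparse)
    return v_sparse / norm if norm > 0 else v    # renormalized sparse loadings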

Split Bregman algorithms for sparse / joint-sparse and low-rank signal recovery: Application in compressive hyperspectral imaging

A. Gogna, A. Shukla, H. K. Agarwal, A. Majumdar
2014 2014 IEEE International Conference on Image Processing (ICIP)  
ℓ2,1-norm (joint sparsity) and nuclear-norm regularized least squares problem.  ...  We show that our proposed techniques significantly outperform previous methods in terms of recovery accuracy.  ...  ; the problem is popularly called Robust Principal Component Analysis (RPCA) [5, 6].  ... 
doi:10.1109/icip.2014.7025260 dblp:conf/icip/GognaSAM14 fatcat:4y6lp35qtrdcpcxw5ycvcffcxq
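A split Bregman (or ADMM) solver for the joint-sparse plus low-rank problem referenced above alternates least-squares updates with two shrinkage sub-problems. The two proximal operators below are the standard closed forms for those sub-problems; the surrounding Bregman iterations, which depend on the measurement operator, are not shown, so this is a sketch of the building blocks rather than the authors' full algorithm.

import numpy as np

def prox_nuclear(X, tau):
    """Prox of tau * nuclear norm: shrink the singular values of X."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def prox_l21(X, tau):
    """Prox of tau * l2,1-norm: shrink each row's l2-norm (joint sparsity)."""
    norms = np.linalg.norm(X, axis=1, keepdims=True)
    scale = np.maximum(1.0 - tau / np.maximum(norms, 1e-12), 0.0)
    return X * scale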

Bayesian principal geodesic analysis for estimating intrinsic diffeomorphic image variability

Miaomiao Zhang, P. Thomas Fletcher
2015 Medical Image Analysis  
A sparsity prior in the model results in automatic selection of the number of relevant dimensions by driving unnecessary principal geodesics to zero.  ...  We develop a latent variable model for principal geodesic analysis (PGA) that provides a probabilistic framework for factor analysis in the space of diffeomorphisms.  ...  Conclusion and Future Work We presented a generative Bayesian model of principal geodesic analysis in diffeomorphic image registration.  ... 
doi:10.1016/j.media.2015.04.009 pmid:26159890 fatcat:wsmpmr7qzjgyricdgxrrwssree

Sparse Bayesian Methods for Low-Rank Matrix Estimation

S. Derin Babacan, Martin Luessi, Rafael Molina, Aggelos K. Katsaggelos
2012 IEEE Transactions on Signal Processing  
In this paper, we present novel recovery algorithms for estimating low-rank matrices in matrix completion and robust principal component analysis based on sparse Bayesian learning (SBL) principles.  ...  Starting from a matrix factorization formulation and enforcing the low-rank constraint in the estimates as a sparsity constraint, we develop an approach that is very effective in determining the correct  ...  Widely used classical methods, such as principal component analysis (PCA), often fail to provide meaningful results in these cases.  ... 
doi:10.1109/tsp.2012.2197748 fatcat:jwymzrutojh7xissaxmrbuokg4
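The abstract above describes enforcing low rank through a matrix factorization in which sparsity, in the sense of entire factor columns being switched off, determines the rank automatically. The sketch below is a deliberately crude, non-Bayesian caricature of that mechanism under stated assumptions: alternating ridge updates of the two factors with ARD-style per-column weights and pruning of collapsed columns. It is meant only to show why column sparsity selects the rank, not to reproduce the SBL inference in the paper.

import numpy as np

def factored_lowrank_fit(Y, max_rank=10, n_iter=50, prune_tol=1e-8):
    """Fit Y ~ A @ B.T with per-column weights gamma and prune dead columns.

    Hypothetical simplification of the column-sparsity mechanism, not the
    paper's algorithm.
    """
    m, n = Y.shape
    rng = np.random.default_rng(0)
    A = rng.standard_normal((m, max_rank))
    B = rng.standard_normal((n, max_rank))
    gamma = np.ones(max_rank)                                  # per-column precisions
    for _ in range(n_iter):
        A = Y @ B @ np.linalg.inv(B.T @ B + np.diag(gamma))    # ridge update of A
        B = Y.T @ A @ np.linalg.inv(A.T @ A + np.diag(gamma))  # ridge update of B
        energy = (A ** 2).sum(axis=0) + (B ** 2).sum(axis=0)
        gamma = (m + n) / np.maximum(energy, 1e-12)            # ARD-style reweighting
        keep = (1.0 / gamma) > prune_tol                       # drop switched-off columns
        A, B, gamma = A[:, keep], B[:, keep], gamma[keep]
    return A, B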

Sparse Bayesian Methods for Low-Rank Matrix Estimation [article]

S. Derin Babacan, Martin Luessi, Rafael Molina, Aggelos K. Katsaggelos
2011 arXiv   pre-print
In this paper, we present novel recovery algorithms for estimating low-rank matrices in matrix completion and robust principal component analysis based on sparse Bayesian learning (SBL) principles.  ...  A number of methods have been developed for this recovery problem. However, a principled method for choosing the unknown target rank is generally not provided.  ...  Widely used classical methods, such as principal component analysis (PCA), often fail to provide meaningful results in these cases.  ... 
arXiv:1102.5288v2 fatcat:qfkyg67ugvfabnibiw55l64qai

Compressed Sensing of Simultaneous Low-Rank and Joint-Sparse Matrices [article]

Mohammad Golbabaee, Pierre Vandergheynst
2012 arXiv   pre-print
(e.g. sensor networks, hyperspectral imaging) and compressive sparse principal component analysis (s-PCA).  ...  We introduce a new model that can efficiently restrict the degrees of freedom of the problem and is generic enough to find a lot of applications, for instance in multichannel signal compressed sensing  ...  a sparse matrix of the corresponding loadings of the principal components.  ... 
arXiv:1211.5058v1 fatcat:rtplplgunrfrva64stevqzp2oq

Information-Theoretic Characterization and Undersampling Ratio Determination for Compressive Radar Imaging in a Simulated Environment

Jingxiong Zhang, Ke Yang, Fengzhu Liu, Ying Zhang
2015 Entropy  
Firstly, the assumption of signal sparsity or compressibility in CS often leads to the use of simplified and non-realistic signal models (such as Bernoulli and spike sequences [31]) for computing and analysis  ...  The novelty of this paper lies in furnishing a general strategy for information-theoretic analysis of scene compressibility, trans-information of radar echo data about the scene and the targets of interest  ...  Author Contributions: Jingxiong Zhang, the principal author, contributed mostly to the literature survey in the fields of information theory, radar and CS.  ... 
doi:10.3390/e17085171 fatcat:d3hb6x23onfahhgh7ur5cuzfhi

Learning Compressive Sensing Models for Big Spatio-Temporal Data [chapter]

Dongeun Lee, Jaesik Choi
2015 Proceedings of the 2015 SIAM International Conference on Data Mining  
because a real-world signal in general cannot be represented with a fixed number of components.  ...  CS reconstructs the compressed signals exactly with overwhelming probability when incoming data can be sparsely represented with a fixed number of components, which is one of the drawbacks of CS frameworks  ...  Model-Based CS is built on a joint sparsity model in order to exploit the joint correlation inherent in the spatio-temporal dimension.  ... 
doi:10.1137/1.9781611974010.75 dblp:conf/sdm/LeeC15 fatcat:jocwnuwppbe5lal7nriv5wc22e

Sparsity control for robust principal component analysis

Gonzalo Mateos, Georgios B. Giannakis
2010 2010 Conference Record of the Forty Fourth Asilomar Conference on Signals, Systems and Computers  
A least-trimmed squares estimator of a low-rank component analysis model is shown to be closely related to the one obtained from an ℓ0-(pseudo)norm-regularized criterion encouraging sparsity in a matrix explicitly  ...  Principal component analysis (PCA) is widely used for high-dimensional data analysis, with well-documented applications in computer vision, preference measurement, and bioinformatics.  ...  One approach to solving this problem is to adopt a low-rank (component analysis) model x_n = m + U s_n + e_n, n = 1, . . . , N, where m ∈ R^p is a location (mean) vector and the matrix U ∈ R^(p×q) has orthonormal  ... 
doi:10.1109/acssc.2010.5757875 fatcat:pmioyzyvdnfcfmpk4o2yotbwlu
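The snippet above states the low-rank component analysis model x_n = m + U s_n + e_n and relates a least-trimmed squares fit to an ℓ0-(pseudo)norm-regularized criterion. The numpy sketch below uses the common convex-relaxation version of that idea, alternating ordinary PCA with column-wise soft-thresholding of the residuals so that a few whole observations are flagged as outliers; it is an illustrative simplification with hypothetical names, not the estimator analyzed in the paper.

import numpy as np

def robust_lowrank_fit(X, q=2, lam=1.0, n_iter=50):
    """Fit x_n = m + U s_n + o_n + e_n with most outlier columns o_n zero.

    X : (p, N) data matrix, one observation per column.
    Returns the location m, the rank-q component L, and the outlier matrix O.
    """
    O = np.zeros_like(X, dtype=float)               # sparse outlier matrix
    for _ in range(n_iter):
        Z = X - O
        m = Z.mean(axis=1, keepdims=True)           # location (mean) estimate
        U, s, Vt = np.linalg.svd(Z - m, full_matrices=False)
        L = U[:, :q] @ np.diag(s[:q]) @ Vt[:q]      # rank-q component U s_n
        R = X - m - L                               # residuals
        norms = np.linalg.norm(R, axis=0, keepdims=True)
        O = R * np.maximum(1.0 - lam / np.maximum(norms, 1e-12), 0.0)
    return m, L, O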
Showing results 1 — 15 out of 4,928 results