234 Hits in 7.4 sec

Iteratively Reweighted Least Squares Algorithms for L1-Norm Principal Component Analysis

Young Woong Park, Diego Klabjan
2016 2016 IEEE 16th International Conference on Data Mining (ICDM)  
For the L1 PCA problem minimizing the fitting error of the reconstructed data, we propose an exact reweighted and an approximate algorithm based on iteratively reweighted least squares.  ...  Principal component analysis (PCA) is often used to reduce the dimension of data by selecting a few orthonormal vectors that explain most of the variance structure of the data.  ...  For P1 with the L1 norm, the first proposed algorithm, the exact reweighted version, is based on iteratively reweighted least squares (IRLS) that gives a weight to each observation.  ... 
doi:10.1109/icdm.2016.0054 dblp:conf/icdm/ParkK16 fatcat:my5g65k5hzd5zgl76lf5j2qhae
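
The reweighting idea mentioned in this abstract can be illustrated with a small NumPy sketch: each observation is weighted by the inverse of its current reconstruction error and a weighted PCA is re-solved. This is a generic IRLS-style robustification under assumed per-observation weights, not the exact or approximate algorithm of Park and Klabjan.

```python
import numpy as np

def irls_style_pca(X, k, n_iter=30, eps=1e-8):
    """IRLS-flavoured robust PCA sketch: down-weight observations with
    large reconstruction error (illustrative only; the paper's exact and
    approximate L1 PCA algorithms differ in how the weights are formed).

    X : (n, d) centred data matrix, k : number of components.
    """
    n, d = X.shape
    w = np.ones(n)
    V = None
    for _ in range(n_iter):
        C = (X * w[:, None]).T @ X / w.sum()   # weighted covariance
        _, vecs = np.linalg.eigh(C)
        V = vecs[:, -k:]                       # top-k eigenvectors
        R = X - (X @ V) @ V.T                  # per-observation residuals
        err = np.linalg.norm(R, axis=1)
        w = 1.0 / np.maximum(err, eps)         # IRLS reweighting
    return V
```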

A note on privacy preserving iteratively reweighted least squares [article]

Mijung Park, Max Welling
2016 arXiv   pre-print
Iteratively reweighted least squares (IRLS) is a widely-used method in machine learning to estimate the parameters in generalised linear models.  ...  In particular, IRLS for L1 minimisation under the linear model provides a closed-form solution in each step, which is a simple multiplication between the inverse of the weighted second moment matrix and  ...  Here we set p = 1 and compute L1 norm constrained least squares.  ... 
arXiv:1605.07511v1 fatcat:bm432opfwbdadc25kwnaqeetse
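
The closed-form step quoted above (inverse of the weighted second moment matrix times a weighted cross-moment) is the standard IRLS update for ℓ1 regression under a linear model. A minimal NumPy sketch of that generic step, without any privacy mechanism and with a small `eps` floor on the residuals added here for numerical stability, is:

```python
import numpy as np

def irls_l1_regression(X, y, n_iter=50, eps=1e-6):
    """Generic IRLS for min_theta ||y - X theta||_1 (no privacy mechanism).

    Each step solves a weighted least-squares problem in closed form:
    theta = (X^T W X)^{-1} X^T W y, with W = diag(1 / max(|residual|, eps)).
    """
    theta = np.linalg.lstsq(X, y, rcond=None)[0]   # least-squares start
    for _ in range(n_iter):
        r = y - X @ theta                          # current residuals
        w = 1.0 / np.maximum(np.abs(r), eps)       # IRLS weights for the L1 loss
        XtWX = X.T @ (w[:, None] * X)              # weighted second moment matrix
        XtWy = X.T @ (w * y)
        theta = np.linalg.solve(XtWX, XtWy)        # closed-form weighted LS step
    return theta
```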

Minimum Error Entropy Algorithms with Sparsity Penalty Constraints

Zongze Wu, Siyuan Peng, Wentao Ma, Badong Chen, Jose Principe
2015 Entropy  
To address this issue, we incorporate in this work an l1-norm or a reweighted l1-norm into the minimum error entropy (MEE) criterion to develop new sparse adaptive filters, which may perform much better  ...  Simulation results confirm the superior performance of the new algorithms.  ...  Principe polished the language and was in charge of technical checking. All authors have read and approved the final manuscript. Conflicts of Interest The authors declare no conflict of interest.  ... 
doi:10.3390/e17053419 fatcat:5lq42sh56rgaphqodhju6ic4hm
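
A reweighted ℓ1 penalty in an adaptive filter typically shows up as a "zero attractor" added to the weight update. The sketch below uses a plain LMS error term as a stand-in for the minimum error entropy gradient, so it illustrates only the sparsity term, not the MEE criterion of the paper; `mu`, `rho` and `eps_w` are assumed hyper-parameters.

```python
import numpy as np

def rza_lms_step(w, x, d, mu=0.01, rho=1e-4, eps_w=10.0):
    """One update of an LMS-style adaptive filter with a reweighted-l1
    zero attractor (illustrative only; the paper couples the sparsity
    penalty with the minimum error entropy criterion, not the LMS error).
    """
    e = d - w @ x                                     # instantaneous error
    attractor = rho * np.sign(w) / (1.0 + eps_w * np.abs(w))
    return w + mu * e * x - attractor                 # gradient step + zero attractor
```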

Recovery of corrupted low-rank matrices via half-quadratic based nonconvex minimization

Ran He, Zhenan Sun, Tieniu Tan, Wei-Shi Zheng
2011 CVPR 2011  
The methods used involve minimizing a combination of nuclear norm and l1 norm.  ...  Recovering arbitrarily corrupted low-rank matrices arises in computer vision applications, including bioinformatic data analysis and visual tracking.  ...  Iteratively reweighted least squares (IRLS) approaches are used to solve the lp-norm problem.  ... 
doi:10.1109/cvpr.2011.5995328 dblp:conf/cvpr/HeSTZ11 fatcat:mrzk3n422reqjgozlujesbcyc4
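
For reference, IRLS handles an ℓp objective by repeatedly solving a weighted least-squares surrogate; a standard smoothed form of the reweighting (with a small ε added for stability, stated here as general background rather than this paper's half-quadratic derivation) is

$$ x^{(t+1)} = \arg\min_x \sum_i w_i^{(t)}\, r_i(x)^2, \qquad w_i^{(t)} = \bigl(r_i(x^{(t)})^2 + \varepsilon\bigr)^{\frac{p}{2}-1}, $$

which majorizes $\sum_i |r_i(x)|^p$ at the current iterate.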

Iteratively Reweighted Sparse Coding for Visual Tracking

Hongli Yan, Qibin Lin, Yufeng Wang
2016 Innovative Computing Information and Control Express Letters, Part B: Applications  
Moreover, we propose an iterative numerical method to minimize the object representation function effectively.  ...  Tracking (DSST) [5], L2-regularized Least Square (L2-RLS) [9], and Least Soft-threshold Squares Tracking (LSST) [10].  ...  The whole iterative method is summarized in Algorithm 1.  ... 
doi:10.24507/icicelb.07.12.2619 fatcat:5zstcanqefgq7cglhmushaalfi

Piece-wise quadratic approximations of arbitrary error functions for fast and robust machine learning

A.N. Gorban, E.M. Mirkes, A. Zinovyev
2016 Neural Networks  
In this paper, we develop a theory and basic universal data approximation algorithms (k-means, principal components, principal manifolds and graphs, regularized and sparse regression), based on piece-wise  ...  The downside of these approaches is an increase in computational cost for optimization.  ...  Acknowledgement This study was supported in part by Big Data Paris Science et Lettre Research University project 'PSL Institute for Data Science'.  ... 
doi:10.1016/j.neunet.2016.08.007 pmid:27639721 fatcat:wsm656r2jzc7bc5kmfalrnzvzu
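
The piece-wise quadratic idea can be illustrated on a single error function: approximate u(x) by quadratics that agree with it at a chosen set of thresholds. The sketch below is a toy construction under assumed thresholds and omits the flat "trimmed" region of the paper's PQSQ potentials.

```python
import numpy as np

def piecewise_quadratic(u, thresholds):
    """Continuous piece-wise quadratic approximation of an even error
    function u(x), matching u at the given thresholds (illustrative
    sketch; the paper's PQSQ potentials also trim beyond the last knot).
    """
    r = np.asarray(thresholds, dtype=float)          # 0 = r_0 < r_1 < ... < r_p
    def q(x):
        x = np.abs(np.asarray(x, dtype=float))
        k = np.clip(np.searchsorted(r, x, side="right") - 1, 0, len(r) - 2)
        b = (u(r[k + 1]) - u(r[k])) / (r[k + 1] ** 2 - r[k] ** 2)  # quadratic slope
        c = u(r[k]) - b * r[k] ** 2                                # offset for continuity
        return b * x ** 2 + c
    return q

# Example: approximate |x| on [0, 3] with four quadratic pieces
approx_abs = piecewise_quadratic(np.abs, [0.0, 0.5, 1.0, 2.0, 3.0])
```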

Lunar magnetic field models from Lunar Prospector and SELENE/Kaguya along‐track magnetic field gradients

D. Ravat, M. E. Purucker, N. Olsen
2020 Journal of Geophysical Research - Planets  
We thank Kimberly Moore for discussions related to elastic net based sparse models.  ...  We thank Ian Garrick-Bethell, an anonymous reviewer, and editors for their meticulous reviews.  ...  We use the scheme of iteratively reweighted least squares to account for non-Gaussian data errors.  ... 
doi:10.1029/2019je006187 fatcat:w4wtpki2grdg3fs5vibxyxaqym

Decomposition into low-rank plus additive matrices for background/foreground separation: A review for a comparative evaluation with a large-scale dataset

Thierry Bouwmans, Andrews Sobral, Sajid Javed, Soon Ki Jung, El-Hadi Zahzah
2017 Computer Science Review  
The most representative problem formulation is the Robust Principal Component Analysis (RPCA) solved via Principal Component Pursuit (PCP) which decomposes a data matrix into a low-rank matrix and a sparse  ...  testing and ranking existing algorithms for background/foreground separation.  ...  thank the following researchers: Zhouchen Lin (Visual Computing Group, Microsoft Research Asia) who has kindly provided the solver LADMAP [192] and the l1-filtering [196], Shiqian Ma (Institute for  ... 
doi:10.1016/j.cosrev.2016.11.001 fatcat:vdh7ic4n6zfkjlccnyiq74z5wu
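
For background, the PCP formulation referred to in this abstract decomposes an observed $m \times n$ matrix $M$ into low-rank and sparse parts by solving the convex program

$$ \min_{L,\,S}\ \|L\|_* + \lambda \|S\|_1 \quad \text{subject to} \quad M = L + S, $$

where $\|L\|_*$ is the nuclear norm and the usual default is $\lambda = 1/\sqrt{\max(m,n)}$ (standard RPCA background, not notation taken from this review).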

Two proposals for robust PCA using semidefinite programming

Michael McCoy, Joel A. Tropp
2011 Electronic Journal of Statistics  
The performance of principal component analysis (PCA) suffers badly in the presence of outliers. This paper proposes two novel approaches for robust PCA based on semidefinite programming.  ...  This paper also presents efficient computational methods for solving these SDPs. Numerical experiments confirm the value of these new techniques.  ...  Acknowledgments The authors would like to thank the anonymous referees for their thoughtful suggestions, as well as Alex Gittens, Richard Chen, and Stephen Becker for valuable discussions regarding this  ... 
doi:10.1214/11-ejs636 fatcat:eovfpae4lndpxe7an3z2w6ihie

Online and Batch Supervised Background Estimation via L1 Regression [article]

Aritra Dutta, Peter Richtarik
2017 arXiv   pre-print
As existing methods for ℓ_1 regression do not scale to high-resolution videos, we propose several simple and scalable methods for solving the problem, including iteratively reweighted least squares, a  ...  We propose a surprisingly simple model for supervised video background estimation. Our model is based on ℓ_1 regression.  ...  Additionally, our online methods are fast as we perform neither conventional nor incremental principal component analysis (PCA).  ... 
arXiv:1712.02249v1 fatcat:gxhhyu4ld5f6jfhszclm6bevvm

Cell Membrane Tracking in Living Brain Tissue Using Differential Interference Contrast Microscopy

John Lee, Ilya Kolb, Craig R. Forest, Christopher J. Rozell
2018 IEEE Transactions on Image Processing  
This simulator allows us to better understand the image statistics (to improve algorithms), as well as quantitatively test cell segmentation and tracking algorithms in scenarios where ground truth data  ...  in the OPL image using ℓ1-norm and total variation (TV) norm (i.e., ‖·‖_TV) regularizers: (2) where β, γ are sparsity and smoothness parameters that control the weight of pixel sparsity and edge-sparsity  ...  by ℓ1 and TV (L1+TV) [20], least-square regularized by ℓ1 and Laplacian Tikhonov (L1+Tik) [20], and least-square regularized by re-weighted ℓ1, weighted Laplacian Tikhonov, and weighted dynamic  ... 
doi:10.1109/tip.2017.2787625 pmid:29346099 pmcid:PMC5839128 fatcat:lfj5xhcvdndrxi5rgreikyqsdi
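
Equation (2) itself is not reproduced in this snippet; a generic least-squares objective with ℓ1 and TV regularizers of the kind described (A, y and x are placeholder symbols here; only β and γ are named in the snippet) has the form

$$ \hat{x} = \arg\min_x\ \tfrac{1}{2}\,\|Ax - y\|_2^2 + \beta\,\|x\|_1 + \gamma\,\|x\|_{\mathrm{TV}}. $$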

Online and Batch Supervised Background Estimation Via L1 Regression

Aritra Dutta, Peter Richtarik
2019 2019 IEEE Winter Conference on Applications of Computer Vision (WACV)  
As existing methods for L1 regression do not scale to high-resolution videos, we propose several simple, fast, and scalable methods including iteratively reweighted least squares, a homotopy method, and  ...  Our model is based on L1 regression.  ...  Figure 3: Comparison via four algorithms: (a) iteratively reweighted least squares (IRLS), (b) homotopy method, (c) stochastic subgradient descent (variant 1), and (d) stochastic subgradient  ... 
doi:10.1109/wacv.2019.00063 dblp:conf/wacv/DuttaR19 fatcat:d3qyxlg64nhzthgqcqfaiv5ys4
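
Of the four solvers listed in the figure caption, stochastic subgradient descent is the simplest to sketch. The snippet below is a bare-bones baseline for min_x ||Ax − b||_1, not the paper's specific variants; the diminishing step-size schedule is an assumption.

```python
import numpy as np

def l1_regression_subgradient(A, b, n_iter=500, step0=1.0, rng=None):
    """Plain stochastic subgradient descent for min_x ||A x - b||_1
    (an illustrative baseline, not the paper's variants 1 or 2).
    """
    rng = np.random.default_rng() if rng is None else rng
    m, n = A.shape
    x = np.zeros(n)
    for t in range(1, n_iter + 1):
        i = rng.integers(m)                        # sample one row
        g = np.sign(A[i] @ x - b[i]) * A[i]        # subgradient of |a_i^T x - b_i|
        x -= (step0 / np.sqrt(t)) * g              # diminishing step size
    return x
```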

Learning Robust Locality Preserving Projection via p-Order Minimization

Hua Wang, Feiping Nie, Heng Huang
2015 Proceedings of the Thirtieth AAAI Conference on Artificial Intelligence and the Twenty-Eighth Innovative Applications of Artificial Intelligence Conference  
As an important theoretical contribution of this work, we systematically derive an efficient iterative algorithm to solve the general p-th order L2-norm minimization problem, which, to the best of our  ...  the p-th order of the L2-norm distances, which can better tolerate large outlying data samples because it suppresses the introduced bias more than the L1-norm or not-squared L2-norm minimizations.  ...  Thus we compare it against the following unsupervised dimensionality reduction methods: (1) principal component analysis (PCA) (Jolliffe 2005), (2) robust principal component analysis (rPCA) (Wright  ... 
doi:10.1609/aaai.v29i1.9632 fatcat:vyqj7uf5fvdtrnupwsc2e3mnby
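
As background to the "p-th order of the L2-norm distances" phrase, a locality-preserving objective of this type replaces squared distances by their p-th power, e.g.

$$ \min_{W}\ \sum_{i,j} s_{ij}\,\bigl\|W^{\top}x_i - W^{\top}x_j\bigr\|_2^{\,p}, \qquad 0 < p \le 2, $$

where the affinities $s_{ij}$ and projection $W$ are generic symbols (not necessarily the paper's notation); smaller p reduces the influence of large, outlying pairwise distances relative to the squared (p = 2) case.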

Iteratively Reweighted Least Squares Algorithm for Sparse Principal Component Analysis with Application to Voting Records

Tomáš Masák
2017 Statistika: Statistics and Economy Journal  
We show that the algorithm basically attempts to find a solution to a penalized least squares problem with a non-convex penalty that resembles the l0-norm more closely.  ...  Principal component analysis (PCA) is a popular dimensionality reduction and data visualization method. Sparse PCA (SPCA) is its extensively studied and NP-hard-to-solve modification.  ...  least squares (IRLS) algorithm.  ... 
doaj:77688b5e412c4fb8a0e4b7f2c82d1563 fatcat:exkxmu7krfahdjrwop3lkxfd7u

Motion Estimation via Robust Decomposition with Constrained Rank [article]

German Ros and Jose Alvarez and Julio Guerrero
2014 arXiv   pre-print
We provide evidence showing that even when it is not possible to recover an uncorrupted low-rank matrix, the resulting information can be exploited for outlier detection.  ...  In this work, we address the problem of outlier detection for robust motion estimation by using modern sparse-low-rank decompositions, i.e., Robust PCA-like methods, to impose global rank constraints.  ...  Another example is the Iteratively Reweighted Least Squares (IRLS) technique [15, 16].  ... 
arXiv:1410.6126v1 fatcat:2kscfff3ifc7jbzdavvqiqrgjm
Showing results 1 — 15 out of 234 results