200 Hits in 7.6 sec

Adaptive Feature Selection: Computationally Efficient Online Sparse Linear Regression under RIP [article]

Satyen Kale, Zohar Karnin, Tengyuan Liang, Dávid Pál
2017 arXiv   pre-print
Online sparse linear regression is an online problem where an algorithm repeatedly chooses a subset of coordinates to observe in an adversarially chosen feature vector, makes a real-valued prediction,  ...  The goal is to design an online learning algorithm with sublinear regret to the best sparse linear predictor in hindsight.  ...  Conclusions and Future Work In this paper, we gave computationally efficient algorithms for the online sparse linear regression problem under the assumption that the design matrices of the feature vectors  ... 
arXiv:1706.04690v1 fatcat:ugrzmhbqpjhcvi6agga2mrx76a

Hyperspectral Band Selection by Multitask Sparsity Pursuit

Yuan Yuan, Guokang Zhu, Qi Wang
2015 IEEE Transactions on Geoscience and Remote Sensing  
This paper focuses on groupwise band selection and proposes a new framework, including the following contributions: 1) a smart yet intrinsic descriptor for efficient band representation; 2) an evolutionary  ...  Index Terms-Band selection, compressive sensing (CS), hyperspectral image, immune clonal strategy (ICS), machine learning, multitask learning (MTL). 0196-2892  ...  In the future, we will develop computationally more efficient implementations by resorting to parallel computer architectures.  ... 
doi:10.1109/tgrs.2014.2326655 fatcat:vitxys6yh5fyxjjmvfera4dgk4

2020 Index IEEE Transactions on Signal Processing Vol. 68

2020 IEEE Transactions on Signal Processing  
TSP 2020 2784-2798.  ...  Linear discriminant analysis: Subspace Learning and Feature Selection via Orthogonal Mapping. Mandanas, F.D., +, TSP 2020 1034-1047.  ...  Unraveling the Veil of Subspace RIP Through Near-Isometry on Subspaces.  ... 
doi:10.1109/tsp.2021.3055469 fatcat:6uswtuxm5ba6zahdwh5atxhcsy

Table of Contents

2020 IEEE Transactions on Signal Processing  
Poor 1792  ...  Robust and Computationally Efficient Digital IIR Filter Synthesis and Stability Analysis Under Finite Precision Implementations ... Wu 1021  ...  Subspace Learning and Feature Selection via Orthogonal Mapping ... F. D. Mandanas and C. L.  ... 
doi:10.1109/tsp.2020.3042287 fatcat:nh7viihaozhd7li3txtadnx5ui

RIPML: A Restricted Isometry Property based Approach to Multilabel Learning [article]

Akshay Soni, Yashar Mehdad
2017 arXiv   pre-print
The multilabel learning problem with a large number of labels, features, and data points has generated tremendous interest recently.  ...  By virtue of the Restricted Isometry Property (RIP), satisfied by many random ensembles, we propose a novel procedure for multilabel learning known as RIPML.  ...  Since we operate under the assumption that s ≪ L, our dimensionality reduction procedure is efficient and fast.  ... 
arXiv:1702.05181v1 fatcat:tgplkckhgvazzpwlf6hbg4xqni
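The snippet's two claims, that RIP holds for many random ensembles and that sparsity s much smaller than the label count L makes the reduction cheap, can be illustrated with a toy sketch. This is not the authors' RIPML implementation; the dimensions, seed, and scaling below are illustrative assumptions. A Gaussian random projection approximately preserves the norm of a sparse label vector:

```python
import math
import random

random.seed(0)

L, m, s = 1000, 100, 5  # labels, reduced dimension, sparsity (illustrative)

# Gaussian matrix scaled by 1/sqrt(m): such ensembles satisfy RIP with high
# probability, so norms of s-sparse vectors are nearly preserved.
phi = [[random.gauss(0, 1) / math.sqrt(m) for _ in range(L)] for _ in range(m)]

# An s-sparse binary label vector, as in multilabel problems.
y = [0.0] * L
for j in random.sample(range(L), s):
    y[j] = 1.0

# Project the L-dimensional label vector down to m dimensions.
z = [sum(phi[i][j] * y[j] for j in range(L)) for i in range(m)]

ratio = math.sqrt(sum(v * v for v in z)) / math.sqrt(sum(v * v for v in y))
print("norm ratio after projection:", round(ratio, 3))  # close to 1
```

Learning can then proceed in the m-dimensional space; the near-isometry on sparse vectors is what keeps the compressed geometry faithful to the original label space.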

Intelligent Control of a Sensor-Actuator System via Kernelized Least-Squares Policy Iteration

Bo Liu, Sanfeng Chen, Shuai Li, Yongsheng Liang
2012 Sensors  
KLSPI introduces the kernel trick into the LSPI framework for reinforcement learning, often achieving faster convergence and providing automatic feature selection via various kernel sparsification approaches  ...  We first show how Random Projections constitute an efficient sparsification technique and how our method often converges faster than regular LSPI, at lower computational cost.  ...  In [12], ALD is a feature selection method that can be considered an online approximation of PCA at much reduced computational cost.  ... 
doi:10.3390/s120302632 pmid:22736969 pmcid:PMC3376585 fatcat:hbehozop5rbdhkpw2lkoaisbtq

An iterative hard thresholding estimator for low rank matrix recovery with explicit limiting distribution

Alexandra Carpentier, Arlene Kim
2018 Statistica sinica  
We propose a new estimator for the low-rank matrix, based on the iterative hard thresholding method, that is computationally efficient and simple.  ...  For instance, there has been substantial work under the sparsity assumption, including sparse linear regression, sparse covariance matrix estimation, and sparse inverse covariance matrix estimation.  ...  As a complement in the online supplementary material (hereafter the online supplement), we adapt our method to the setting of sparse linear regression and provide an estimator that has an explicit  ... 
doi:10.5705/ss.202016.0103 fatcat:uhiw4kyrrfe2tb7wdny5wcovhe
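The abstract mentions adapting the method to sparse linear regression in the online supplement. The following is a minimal pure-Python sketch of the generic iterative hard thresholding recursion x ← H_s(x + Aᵀ(y − Ax)) in that vector setting; it is not the paper's low-rank estimator (which thresholds singular values rather than coordinates), and the sizes, seed, and iteration count are illustrative assumptions:

```python
import math
import random

random.seed(1)

n, p, s = 200, 20, 3  # samples, dimension, sparsity (illustrative)

# Design with i.i.d. N(0, 1/n) entries: a near-isometry on sparse vectors.
A = [[random.gauss(0, 1) / math.sqrt(n) for _ in range(p)] for _ in range(n)]

# s-sparse ground truth and noiseless observations y = A x*.
x_true = [0.0] * p
for j in random.sample(range(p), s):
    x_true[j] = random.choice([-2.0, 2.0])
y = [sum(A[i][j] * x_true[j] for j in range(p)) for i in range(n)]

def hard_threshold(x, s):
    # Keep the s largest-magnitude entries, zero out the rest.
    keep = set(sorted(range(len(x)), key=lambda j: -abs(x[j]))[:s])
    return [x[j] if j in keep else 0.0 for j in range(len(x))]

x = [0.0] * p
for _ in range(100):
    r = [y[i] - sum(A[i][j] * x[j] for j in range(p)) for i in range(n)]  # residual
    g = [sum(A[i][j] * r[i] for i in range(n)) for j in range(p)]         # A^T r
    x = hard_threshold([x[j] + g[j] for j in range(p)], s)  # gradient step + projection

err = max(abs(x[j] - x_true[j]) for j in range(p))
print("max coefficient error:", err)
```

Under a restricted-isometry condition on A, each iteration contracts the error on sparse supports, which is why the noiseless recovery above succeeds to numerical precision.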

Regret Lower Bound and Optimal Algorithm for High-Dimensional Contextual Linear Bandit [article]

Ke Li, Yun Yang, Naveen N. Narisetty
2021 arXiv   pre-print
Second, we propose a simple and computationally efficient algorithm inspired by the general Upper Confidence Bound (UCB) strategy that achieves a regret upper bound matching the lower bound.  ...  In this paper, we consider the multi-armed bandit problem with high-dimensional features.  ...  to the ℓ∞-norm (the dual norm of the ℓ1-norm) of the corresponding feature, which is easy to implement and computationally efficient.  ... 
arXiv:2109.11612v1 fatcat:llvxblqnmbesjkyk3tjxmmlukm

Compressive sensing: From theory to applications, a survey

Saad Qaisar, Rana Muhammad Bilal, Wafa Iqbal, Muqaddas Naureen, Sungyoung Lee
2013 Journal of Communications and Networks  
Compressive sensing (CS) is a novel sampling paradigm that samples signals in a much more efficient way than the established Nyquist sampling theorem.  ...  Basis Pursuit [30], Basis Pursuit De-Noising (BPDN) [30], modified BPDN [31], Least Absolute Shrinkage and Selection Operator (LASSO) [32], and Least Angle Regression (LARS) [33] are some examples  ...  After recasting the non-separable nuclear norm into a form amenable to online optimization, a real-time algorithm for dynamic anomalography is developed and its convergence established under simplifying  ... 
doi:10.1109/jcn.2013.000083 fatcat:mlxbvwfxivbubdbtzkqxgx5c3y
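Among the recovery algorithms this survey lists, LASSO/BPDN can be solved by iterative soft thresholding (ISTA), alternating a gradient step on the least-squares term with the ℓ1 proximal operator. The sketch below is a pure-Python illustration under made-up sizes and hyper-parameters, not code from the survey:

```python
import math
import random

random.seed(2)

n, p = 100, 15                      # samples and dimension (illustrative)
lam, step, iters = 0.05, 0.5, 400   # illustrative hyper-parameters

A = [[random.gauss(0, 1) / math.sqrt(n) for _ in range(p)] for _ in range(n)]
x_true = [3.0, -3.0] + [0.0] * (p - 2)  # 2-sparse ground truth
y = [sum(A[i][j] * x_true[j] for j in range(p)) for i in range(n)]

def soft(v, t):
    # Soft thresholding: the proximal operator of t * ||.||_1.
    return math.copysign(max(abs(v) - t, 0.0), v)

x = [0.0] * p
for _ in range(iters):
    # Gradient step on the least-squares term, then the l1 prox.
    r = [y[i] - sum(A[i][j] * x[j] for j in range(p)) for i in range(n)]
    g = [sum(A[i][j] * r[i] for i in range(n)) for j in range(p)]
    x = [soft(x[j] + step * g[j], step * lam) for j in range(p)]

support = [j for j in range(p) if abs(x[j]) > 1e-8]
print("support:", support)
```

Because soft thresholding sets small coordinates exactly to zero, the iterate is sparse at every step; shrinking `lam` trades a smaller coefficient bias for a weaker sparsity-promoting effect.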

Sparsity-Aware Learning and Compressed Sensing: An Overview [article]

Sergios Theodoridis, Yannis Kopsinis, Konstantinos Slavakis
2012 arXiv   pre-print
This paper is based on a chapter of a new book on Machine Learning, by the first and third author, which is currently under preparation.  ...  Both batch processing and online processing techniques are considered. A case study in the context of time-frequency analysis of signals is also presented.  ...  Online Time-Adaptive Sparsity-Promoting Algorithms In this section, online (time-recursive) schemes for sparsity-aware learning are presented.  ... 
arXiv:1211.5231v1 fatcat:aeog5xwn4vghbbxjoklnanzl6y

Statistical Estimation: From Denoising to Sparse Regression and Hidden Cliques [article]

Eric W. Tramel and Santhosh Kumar and Andrei Giurgiu and Andrea Montanari
2014 arXiv   pre-print
Andrea Montanari on the topic of statistical estimation for linear models. The first two lectures cover the principles of signal recovery from linear measurements in terms of minimax risk.  ...  We will then cover more recent research, and discuss sparse linear regression in Section 4.3.1, and its analysis for random designs in Section 5.2.  ...  If we select all coefficients corresponding to all positions up to a certain maximum frequency, we will not exploit the spatial adaptivity property of the wavelet basis. (2) Any linear estimation procedure  ... 
arXiv:1409.5557v1 fatcat:6zf736mjrjfankystsaku7tpqi

Statistical estimation: from denoising to sparse regression and hidden cliques [chapter]

Andrea Montanari
2015 Statistical Physics, Optimization, Inference, and Message-Passing Algorithms  
Andrea Montanari on the topic of statistical estimation for linear models. The first two lectures cover the principles of signal recovery from linear measurements in terms of minimax risk.  ...  We will then cover more recent research, and discuss sparse linear regression in Section 4.3.1, and its analysis for random designs in Section 5.2.  ...  If we select all coefficients corresponding to all positions up to a certain maximum frequency, we will not exploit the spatial adaptivity property of the wavelet basis. (2) Any linear estimation procedure  ... 
doi:10.1093/acprof:oso/9780198743736.003.0005 fatcat:4imw7drb3zbjff43aaw2b6lkei

Online Orthogonal Matching Pursuit [article]

El Mehdi Saad, Gilles Blanchard, Sylvain Arlot
2021 arXiv   pre-print
Greedy algorithms for feature selection are widely used for recovering sparse high-dimensional vectors in linear models.  ...  We present a novel online algorithm: Online Orthogonal Matching Pursuit (OOMP) for online support recovery in the random design setting of sparse linear regression.  ...  The straightforward formulation of sparse regression using an ℓ0 pseudo-norm constraint is computationally intractable.  ... 
arXiv:2011.11117v2 fatcat:3dj3nk4svbbohkjsi6fh4rt4kq
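For contrast with the online variant above, classic batch Orthogonal Matching Pursuit greedily adds the column most correlated with the current residual and refits by least squares on the selected support. This pure-Python sketch is not the authors' OOMP; the problem sizes, seed, and helper names are illustrative assumptions for the noiseless case:

```python
import math
import random

random.seed(3)

n, p, s = 80, 12, 3  # samples, dimension, sparsity (illustrative)

A = [[random.gauss(0, 1) for _ in range(p)] for _ in range(n)]
for j in range(p):  # normalize columns, as OMP's correlation step assumes
    norm = math.sqrt(sum(A[i][j] ** 2 for i in range(n)))
    for i in range(n):
        A[i][j] /= norm

x_true = {1: 2.0, 4: -1.5, 9: 1.0}  # sparse ground truth
y = [sum(A[i][j] * v for j, v in x_true.items()) for i in range(n)]

def least_squares(A, y, S):
    # Solve min ||A_S c - y||_2 via the normal equations (Gaussian elimination).
    k, n = len(S), len(A)
    G = [[sum(A[i][S[a]] * A[i][S[b]] for i in range(n)) for b in range(k)]
         for a in range(k)]
    rhs = [sum(A[i][S[a]] * y[i] for i in range(n)) for a in range(k)]
    for a in range(k):                   # forward elimination
        for b in range(a + 1, k):
            f = G[b][a] / G[a][a]
            for c in range(a, k):
                G[b][c] -= f * G[a][c]
            rhs[b] -= f * rhs[a]
    coef = [0.0] * k
    for a in range(k - 1, -1, -1):       # back substitution
        coef[a] = (rhs[a] - sum(G[a][b] * coef[b]
                                for b in range(a + 1, k))) / G[a][a]
    return coef

def omp(A, y, s):
    n, p = len(A), len(A[0])
    support, residual = [], y[:]
    for _ in range(s):
        # Greedy step: the column most correlated with the current residual.
        j = max(range(p),
                key=lambda j: abs(sum(A[i][j] * residual[i] for i in range(n))))
        support.append(j)
        coef = least_squares(A, y, support)
        residual = [y[i] - sum(A[i][support[k]] * coef[k]
                               for k in range(len(support))) for i in range(n)]
    return dict(zip(support, coef))

x_hat = omp(A, y, s)
print("recovered support:", sorted(x_hat))
```

Because the refit makes the residual orthogonal to every selected column, no column is picked twice; the online setting studied in the paper additionally has to cope with data arriving one observation at a time.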

ASC

Amos Waterland, Elaine Angelino, Ryan P. Adams, Jonathan Appavoo, Margo Seltzer
2014 Proceedings of the 19th international conference on Architectural support for programming languages and operating systems - ASPLOS '14  
Linear regression is most useful when our system needs to predict integer-valued features such as loop induction variables, while logistic regression is more general and attempts to predict any bit whatsoever  ...  Predictors at the feature level share state between related bits to make this efficient.  ... 
doi:10.1145/2541940.2541985 dblp:conf/asplos/WaterlandAAAS14 fatcat:hv5jp5ep25ebnmptpmcy6ltmim

Sparse Representation for Wireless Communications: A Compressive Sensing Approach

Zhijin Qin, Jiancun Fan, Yuanwei Liu, Yue Gao, Geoffrey Ye Li
2018 IEEE Signal Processing Magazine  
Sparse representation can efficiently model signals in different applications to facilitate processing.  ...  With the help of the sparsity property, CS is able to enhance the spectrum efficiency and energy efficiency of fifth generation (5G) and Internet of Things (IoT) networks.  ...  It has been pointed out that verifying both the RIP condition and the incoherence property is computationally complicated, but they can be achieved with high probability simply by selecting Φ as a random  ... 
doi:10.1109/msp.2018.2789521 fatcat:wamvxn7kebavtjroau4ksfbwea