16,108 Hits in 4.9 sec

Online Linear Optimization with Sparsity Constraints

Jun-Kun Wang, Chi-Jen Lu, Shou-De Lin
2019 International Conference on Algorithmic Learning Theory  
We study the problem of online linear optimization with sparsity constraints in the semi-bandit setting.  ...  It can be seen as a marriage between two well-known problems: the online linear optimization problem and the combinatorial bandit problem.  ...  More precisely, we consider the general problem of online sparse linear optimization, in which the feasible set consists of w ∈ R^d satisfying a sparsity constraint ‖w‖₀ ≤ k as well as an L_b-norm constraint  ... 
dblp:conf/alt/WangLL19 fatcat:dqeamw42mfbzjijp2y5o6vjrrq
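
For the b = 2 case, projecting onto such a feasible set has a simple closed form: keep the k largest-magnitude coordinates, then shrink onto the norm ball. A minimal numpy sketch (the function name and radius parameter are illustrative, not from the paper):

    import numpy as np

    def project_sparse_l2(w, k, radius=1.0):
        # Projection onto {w : ||w||_0 <= k, ||w||_2 <= radius}:
        # keep the k largest-magnitude entries, then scale onto the L2 ball.
        v = np.zeros_like(w)
        top = np.argsort(np.abs(w))[-k:]
        v[top] = w[top]
        norm = np.linalg.norm(v)
        if norm > radius:
            v *= radius / norm
        return v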

Sparsity-Constrained Transportation Problem [article]

Annie I. Chen, Stephen C. Graves
2014 arXiv   pre-print
We study the solution of a large-scale transportation problem with an additional constraint on the sparsity of inbound flows.  ...  We propose a computationally efficient algorithm that solves this sparsity-constrained optimization problem while bypassing complexities of the conventional integer programming approach.  ...  Specifically, the algorithm starts with the relaxed linear program of (P) without the sparsity constraint (1).  ... 
arXiv:1402.2309v1 fatcat:gezzbqtzzncitphlbhijsyqj6e
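
The relaxation the snippet refers to is the classical transportation LP. A toy instance with scipy, where all data values are made up for illustration:

    import numpy as np
    from scipy.optimize import linprog

    # Toy instance: 2 supply nodes, 3 demand nodes.
    cost = np.array([[4.0, 6.0, 9.0],
                     [5.0, 3.0, 8.0]])
    supply = np.array([30.0, 40.0])
    demand = np.array([20.0, 25.0, 25.0])

    m, n = cost.shape
    A_eq, b_eq = [], []
    for i in range(m):                       # ship out all supply at node i
        row = np.zeros(m * n); row[i*n:(i+1)*n] = 1.0
        A_eq.append(row); b_eq.append(supply[i])
    for j in range(n):                       # meet all demand at node j
        row = np.zeros(m * n); row[j::n] = 1.0
        A_eq.append(row); b_eq.append(demand[j])

    res = linprog(cost.ravel(), A_eq=np.array(A_eq), b_eq=b_eq, bounds=(0, None))
    flows = res.x.reshape(m, n)              # relaxed flows, sparsity handled later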

Fast greedy algorithms for dictionary selection with generalized sparsity constraints [article]

Kaito Fujii, Tasuku Soma
2018 arXiv   pre-print
Not only does our algorithm work much faster than the known methods, but it can also handle more complex sparsity constraints, such as average sparsity.  ...  Using numerical experiments, we show that our algorithm outperforms the known methods for dictionary selection, achieving competitive performance with dictionary learning algorithms in a smaller running  ...  For a related problem in sparse optimization, namely online linear regression, Kale et al.  ... 
arXiv:1809.02314v1 fatcat:b3s4nmrk45aklnnl76i5rfsvjq
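
A greedy selection loop of the kind described can be sketched in a few lines; this version scores candidate atoms by least-squares residual and, for brevity, omits the per-sample sparsity constraints that the paper's generalized framework actually handles:

    import numpy as np

    def greedy_select(C, Y, k):
        # Greedily pick k atoms (columns of C, d x n) that best reconstruct
        # the data Y (d x m) in the least-squares sense.
        selected = []
        for _ in range(k):
            best, best_err = None, np.inf
            for j in range(C.shape[1]):
                if j in selected:
                    continue
                D = C[:, selected + [j]]
                coef, *_ = np.linalg.lstsq(D, Y, rcond=None)
                err = np.linalg.norm(Y - D @ coef)   # Frobenius residual
                if err < best_err:
                    best, best_err = j, err
            selected.append(best)
        return selected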

Provably optimal sparse solutions to overdetermined linear systems with non-negativity constraints in a least-squares sense by implicit enumeration

Fatih S. Aktaş, Ömer Ekmekcioglu, Mustafa Ç. Pinar
2021 Optimization and Engineering  
The objective of the present paper is to report on an efficient and modular implicit enumeration algorithm to find provably optimal solutions to the NP-hard problem of sparsity-constrained non-negative  ...  Most of the previous research efforts aimed at approximating the sparsity-constrained linear least-squares problem, and/or finding local solutions by means of descent algorithms.  ...  Comparison with Nonconvex Gauge Function Approach: As with the sparsity constraint, a constraint based on the convex Gauge function is defined  ... 
doi:10.1007/s11081-021-09676-2 fatcat:dkhqute2mvcflhprvhmx3eluke
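
Implicit enumeration searches over candidate supports. A brute-force version of the same idea, using scipy's NNLS solver per support (the paper prunes this search tree rather than enumerating exhaustively, so this sketch is only viable for small instances):

    import itertools
    import numpy as np
    from scipy.optimize import nnls

    def sparse_nnls(A, b, k):
        # min ||A[:, S] x - b||_2 over all supports |S| = k with x >= 0.
        best = (None, None, np.inf)
        for S in itertools.combinations(range(A.shape[1]), k):
            x, res = nnls(A[:, list(S)], b)
            if res < best[2]:
                best = (S, x, res)
        return best            # (support, coefficients, residual norm)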

Non-convexly constrained image reconstruction from nonlinear tomographic X-ray measurements

T. Blumensath, R. Boardman
2015 Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences  
our conjugate gradient algorithm that uses the linear model with two different constraints, wavelet sparsity and wavelet tree sparsity.  ...  While these non-convex constraints do not guarantee globally optimal solutions, we could show that when we initialized our method with a good linear reconstruction, the non-convexly constrained nonlinear  ... 
doi:10.1098/rsta.2014.0393 pmid:25939619 pmcid:PMC4424487 fatcat:xq6n6srm5zg7zkvrxlbbacqt5m
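
A sparsity constraint of this kind is typically enforced by iterative hard thresholding. A linear-model sketch (the paper combines such steps with conjugate gradients and a nonlinear X-ray model, and the thresholding may act on wavelet rather than pixel coefficients):

    import numpy as np

    def iht(A, y, k, step=0.5, iters=100):
        # Iterative hard thresholding for y ≈ A x with ||x||_0 <= k.
        # step should satisfy step * ||A||_2^2 < 1 for a stable iteration.
        x = np.zeros(A.shape[1])
        for _ in range(iters):
            x = x + step * A.T @ (y - A @ x)    # gradient step on 0.5||y - Ax||^2
            small = np.argsort(np.abs(x))[:-k]  # all but the k largest magnitudes
            x[small] = 0.0
        return x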

Minisymposium — Recent progress in regularization theory

A. Neubauer
2009 Journal of Inverse and Ill-Posed Problems  
Ronny Ramlau, Regularization of inverse problems with sparsity constraints.  ...  We had the following seven contributions in that field (in alphabetical order): Kristian Bredies, An iterative thresholding-like algorithm for inverse problems with sparsity constraints in Banach space  ... 
doi:10.1515/jiip.2009.001 fatcat:auobejnmczbqndt2a3qte43cie
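
The iterative thresholding algorithms mentioned here follow the ISTA template: a gradient step on the data-fidelity term followed by soft thresholding. A generic sketch, not tied to any of the listed contributions:

    import numpy as np

    def ista(A, y, lam, step, iters=200):
        # Solves min_x 0.5 ||A x - y||^2 + lam ||x||_1 by proximal gradient.
        # step should be below 1 / ||A||_2^2.
        x = np.zeros(A.shape[1])
        for _ in range(iters):
            x = x - step * A.T @ (A @ x - y)                          # gradient step
            x = np.sign(x) * np.maximum(np.abs(x) - step * lam, 0.0)  # shrinkage
        return x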

Think out of the "Box": Generically-Constrained Asynchronous Composite Optimization and Hedging

Pooria Joulani, András György, Csaba Szepesvári
2019 Neural Information Processing Systems  
Hedge in online learning), and, to our knowledge, the first asynchronous algorithm enjoying linear speed-ups under sparsity with non-SGD-style updates.  ...  ASYNCADA is, to our knowledge, the first asynchronous stochastic optimization algorithm with finite-time data-dependent convergence guarantees for generic convex constraints.  ...  We further use this framework to derive the first asynchronous online and stochastic optimization algorithm with non-box constraints that uses non-Euclidean regularizers.  ... 
dblp:conf/nips/JoulaniGS19 fatcat:vkaxfi5fzfcjtbnjwsb7xibmpq
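
Hedge, which the paper recovers as a special case of its framework, is multiplicative weights on the probability simplex; a minimal sketch:

    import numpy as np

    def hedge(loss_stream, d, eta=0.1):
        # Multiplicative-weights (Hedge) over d experts; losses in [0, 1]^d.
        w = np.ones(d) / d
        for loss in loss_stream:
            w *= np.exp(-eta * np.asarray(loss))
            w /= w.sum()            # renormalize onto the probability simplex
        return w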

Exploiting Model Sparsity in Adaptive MPC: A Compressed Sensing Viewpoint [article]

Monimoy Bujarbaruah, Charlott Vallon
2019 arXiv   pre-print
The efficacy of the developed algorithm is highlighted with a thorough numerical example, where we demonstrate performance gain over the counterpart algorithm of [2], which does not utilize the sparsity  ...  Using tools from distributionally robust optimization, we reformulate the probabilistic output constraints as tractable convex second-order cone constraints, which enables us to pose our MPC design task  ...  Notice that joint (linear) chance constraints (3b) can be reformulated into a set of individual (linear) chance constraints using Bonferroni's inequality, and can therefore also be addressed by our proposed  ... 
arXiv:1912.04408v1 fatcat:dgqwf34gabh57m5snrdevksohq
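
A standard moment-based distributionally robust reformulation turns an individual linear chance constraint P(μᵀx ≤ b) ≥ 1 − ε into the second-order cone constraint μᵀx + κ‖Σ^{1/2}x‖₂ ≤ b with κ = sqrt((1 − ε)/ε). A cvxpy sketch under assumed first and second moments (the data and objective are placeholders, not the paper's MPC problem):

    import numpy as np
    import cvxpy as cp

    d, eps = 3, 0.05
    mu = np.array([1.0, 0.5, -0.2])      # assumed mean of the uncertainty
    Sigma_half = np.eye(d)               # assumed covariance square root
    kappa = np.sqrt((1 - eps) / eps)     # moment-based DR scaling factor

    x = cp.Variable(d)
    chance = [mu @ x + kappa * cp.norm(Sigma_half @ x, 2) <= 1.0]
    cp.Problem(cp.Minimize(cp.sum_squares(x - 1)), chance).solve()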

Block-sparse beamforming for spatially extended sources in a Bayesian formulation

Angeliki Xenaki, Efren Fernandez-Grande, Peter Gerstoft
2016 Journal of the Acoustical Society of America  
Optimization methods solve the DOA estimation problem as a least-squares parameter estimation problem and use constraints to regularize it.  ...  Single-snapshot DOA estimation: We consider the simple one-dimensional (1D) problem with a uniform linear array (ULA) of sensors and the sources residing in the plane of the array.  ... 
doi:10.1121/1.4962325 pmid:27914408 fatcat:jhgrkcvc35gbhfzfyrcklqqxya
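
Grid-based sparse DOA estimation of this flavor is often posed as an ℓ1-regularized least-squares problem over a steering-matrix dictionary. A single-snapshot cvxpy sketch for a ULA (the paper's block-sparse Bayesian formulation is more elaborate; the array geometry and regularization weight here are illustrative):

    import numpy as np
    import cvxpy as cp

    M, N = 8, 91                             # sensors, angular grid points
    theta = np.deg2rad(np.linspace(-90, 90, N))
    # ULA steering matrix, half-wavelength element spacing
    A = np.exp(2j * np.pi * 0.5 * np.outer(np.arange(M), np.sin(theta)))

    rng = np.random.default_rng(0)
    y = A[:, 30] + 0.05 * (rng.standard_normal(M) + 1j * rng.standard_normal(M))

    x = cp.Variable(N, complex=True)
    cp.Problem(cp.Minimize(cp.norm(A @ x - y, 2) + 0.5 * cp.norm(x, 1))).solve()
    doa_spectrum = np.abs(x.value)           # peaks indicate source directions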

SOML: Sparse Online Metric Learning with Application to Image Retrieval

Xingyu Gao, Steven C.H. Hoi, Yongdong Zhang, Ji Wan, Jintao Li
2014 PROCEEDINGS OF THE THIRTIETH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE AND THE TWENTY-EIGHTH INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE  
In this paper, we propose a novel Sparse Online Metric Learning (SOML) scheme for learning sparse distance functions from large-scale high-dimensional data and explore its application to image retrieval.  ...  with sparsity is potentially more effective than the existing online metric learning without exploiting sparsity.  ...  Thus, we consider the problem of online metric learning with a linear similarity function S defined as S_M(x_i, x_j) ≡ x_i^T M x_j (4), where M ∈ R^{m×m}.  ... 
doi:10.1609/aaai.v28i1.8911 fatcat:a523gi4iubejdopb7maf4mgfza
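
Eq. (4) is a bilinear similarity; sparsity in such schemes comes from shrinking small weights during the online updates. A minimal sketch of both pieces (the truncation operator here is a generic soft threshold, not necessarily the paper's exact rule):

    import numpy as np

    def similarity(M, xi, xj):
        # Bilinear similarity S_M(x_i, x_j) = x_i^T M x_j from Eq. (4).
        return xi @ M @ xj

    def soft_truncate(w, lam):
        # Shrink small entries to exact zeros to keep the learned model sparse.
        return np.sign(w) * np.maximum(np.abs(w) - lam, 0.0)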

Sparse LMS via online linearized Bregman iteration

Tao Hu, Dmitri B. Chklovskii
2014 2014 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)  
We demonstrate that OLBI is bias-free and compare its operation with existing sparse LMS algorithms by rederiving them in the online convex optimization framework.  ...  Our algorithm, called online linearized Bregman iteration (OLBI), is derived from minimizing the cumulative prediction error squared along with an ℓ1-ℓ2 norm regularizer.  ...  In many cases such a prediction can be computed as a linear combination of the input signal vector with the weight vector.  ... 
doi:10.1109/icassp.2014.6855000 dblp:conf/icassp/HuC14 fatcat:u35ndg33sraehhlafta7wsrj2i
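
In the linearized-Bregman style, the filter accumulates error-driven updates in an auxiliary variable and exposes only its soft-thresholded image as the weight vector, which is what produces exact zeros. A sketch in that spirit (step sizes and scaling are generic, not the paper's exact constants):

    import numpy as np

    def sparse_lms(stream, dim, mu=0.01, lam=1.0):
        # Linearized-Bregman-style sparse LMS: accumulate in u,
        # expose the soft-thresholded weights w.
        u = np.zeros(dim)
        w = np.zeros(dim)
        for x, d in stream:                 # input vector x, desired output d
            e = d - w @ x                   # prediction error
            u += mu * e * x                 # error-driven accumulation
            w = np.sign(u) * np.maximum(np.abs(u) - lam, 0.0)
        return w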

Generalized thresholding sparsity-aware algorithm for low complexity online learning

Yannis Kopsinis, Konstantinos Slavakis, Sergios Theodoridis, Steve McLaughlin
2012 2012 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)  
In this paper, a novel scheme for online, sparsity-aware learning is presented.  ...  The complexity of the algorithm exhibits a linear dependence on the number of free parameters.  ...  In fact, to the best of our knowledge, there is no adaptive algorithm of linear computational complexity capable of dealing with such constraints.  ... 
doi:10.1109/icassp.2012.6288615 dblp:conf/icassp/KopsinisSTM12 fatcat:af3zb7khkjgsjiprda2k2owkhi
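
One member of the generalized-thresholding family keeps the k largest coefficients untouched and soft-thresholds the rest. An illustrative sketch (not the paper's exact operator):

    import numpy as np

    def generalized_threshold(w, k, lam):
        # Soft-threshold everywhere, then restore the k largest-magnitude
        # entries. argsort is O(d log d); np.argpartition makes it O(d).
        out = np.sign(w) * np.maximum(np.abs(w) - lam, 0.0)
        top = np.argsort(np.abs(w))[-k:]
        out[top] = w[top]
        return out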

A Batchwise Monotone Algorithm for Dictionary Learning [article]

Huan Wang, John Wright, Daniel Spielman
2015 arXiv   pre-print
Unlike the state-of-the-art dictionary learning algorithms which impose sparsity constraints on a sample-by-sample basis, we instead treat the samples as a batch, and impose the sparsity constraint on  ...  Experiments on both natural image patches and UCI data sets show that the proposed algorithm produces a better approximation at the same sparsity levels compared to the state-of-the-art algorithms.  ...  The advantage of column-wise sparsity constraints is that it leads naturally to fast online algorithms. Whenever a new sample arrives, one simply adds a sparsity constraint on the incoming column of X.  ... 
arXiv:1502.00064v1 fatcat:4zsgjsv5q5hy5pwzqgh3dkun2a
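
Sparse-coding the incoming column can be done with orthogonal matching pursuit, for example. A compact numpy sketch (assumes k ≥ 1 and a column-normalized dictionary D; this is a generic solver, not the paper's batchwise algorithm):

    import numpy as np

    def omp(D, y, k):
        # Orthogonal matching pursuit: code y against dictionary D
        # with at most k nonzero coefficients.
        r, S = y.astype(float), []
        for _ in range(k):
            S.append(int(np.argmax(np.abs(D.T @ r))))   # most correlated atom
            x_S, *_ = np.linalg.lstsq(D[:, S], y, rcond=None)
            r = y - D[:, S] @ x_S                       # update the residual
        return S, x_S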

Online robust image alignment via iterative convex optimization

Yi Wu, Bin Shen, Haibin Ling
2012 2012 IEEE Conference on Computer Vision and Pattern Recognition  
The efficacy of the proposed online robust alignment algorithm is verified with extensive experiments on image set alignment and visual tracking, in reference with state-of-the-art methods.  ...  While inheriting the benefits of sparsity, our method enjoys great time efficiency and is therefore capable of dealing with large image sets and real-time tasks such as visual tracking.  ...  Iterative Convex Optimization: The optimization problem (1) is non-convex in that the constraint I ∘ τ = Ax + e is nonlinear in τ ∈ G.  ... 
doi:10.1109/cvpr.2012.6247878 dblp:conf/cvpr/WuSL12 fatcat:u5qttkzd5jgvfhmvyy7dmo5chy
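
Schemes of this kind typically linearize the warp around the current estimate and solve the resulting convex problem at each iteration; schematically (J denotes the Jacobian of I ∘ τ with respect to τ, a sketch of the standard linearization rather than a quote from the paper):

    I ∘ (τ + Δτ) ≈ I ∘ τ + J Δτ
    min_{x, e, Δτ}  ||e||_1   subject to   I ∘ τ + J Δτ = A x + e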

Weight, Block or Unit? Exploring Sparsity Tradeoffs for Speech Enhancement on Tiny Neural Accelerators [article]

Marko Stamenovic, Nils L. Westhausen, Li-Chia Yang, Carl Jensen, Alex Pawlicki
2021 arXiv   pre-print
Our method supports all three structures above and jointly learns integer-quantized weights along with sparsity.  ...  We explore network sparsification strategies with the aim of compressing neural speech enhancement (SE) down to an optimal configuration for a new generation of low-power microcontroller-based neural accelerators  ...  All combinations that meet the target sparsity constraint are evaluated on the test set, and the optimal sparsity is chosen based on a heuristic Q combining STOI [35], PESQ [1], and SI-SDR [23].  ... 
arXiv:2111.02351v2 fatcat:3sfua4wchbbfxkdxklwu4akzfi
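
For the unstructured ("weight") case, meeting a target sparsity can be as simple as global magnitude pruning; block and unit sparsity instead zero whole groups. A minimal sketch (the function name and global-threshold choice are illustrative, not the paper's training-time method):

    import numpy as np

    def prune_to_sparsity(W, target):
        # Zero out the smallest-magnitude fraction `target` of all
        # weights (assumes 0 <= target < 1).
        flat = np.sort(np.abs(W).ravel())
        thresh = flat[int(target * flat.size)]
        return W * (np.abs(W) >= thresh)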
Showing results 1–15 of 16,108