
A Simple Method for Convex Optimization in the Oracle Model [article]

Daniel Dadush, Christopher Hojny, Sophie Huiberts, Stefan Weltge
2021 arXiv   pre-print
We give a simple and natural method for computing approximately optimal solutions for minimizing a convex function f over a convex set K given by a separation oracle.  ...  Our method utilizes the Frank–Wolfe algorithm over the cone of valid inequalities of K and subgradients of f.  ...  Acknowledgments We would like to thank Robert Luce and Sebastian Pokutta for their very valuable feedback on our work.  ... 
arXiv:2011.08557v2 fatcat:kmcsb2za4rdhbf3jpqct6bivbu
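The separation-oracle access model used in this entry can be illustrated with a toy oracle for a Euclidean ball; the function name, radius, and return convention below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def ball_separation_oracle(x, radius=1.0):
    """Separation oracle for the Euclidean ball K = {y : ||y||_2 <= radius}.

    Returns None if x lies in K; otherwise returns (a, b) describing a
    hyperplane a.y <= b that is valid for all of K but violated by x.
    """
    norm = np.linalg.norm(x)
    if norm <= radius:
        return None               # x is feasible, nothing to separate
    a = x / norm                  # outward unit normal at the projection of x
    b = radius                    # a.y <= radius holds for every y in K
    return a, b

# The oracle certifies infeasibility of a point outside the ball:
a, b = ball_separation_oracle(np.array([2.0, 0.0]))
```

Methods like the one in this entry only interact with K through such calls, so any convex body with an efficient separation routine can be plugged in.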

Playing Non-linear Games with Linear Oracles

Dan Garber, Elad Hazan
2013 2013 IEEE 54th Annual Symposium on Foundations of Computer Science  
Linear optimization over matroid polytopes, matching polytopes and path polytopes are examples of problems for which we have efficient combinatorial algorithms, but whose non-linear convex counterpart is  ...  Linear optimization is often algorithmically simpler than non-linear convex optimization.  ...  The conditional gradient method and local linear oracles The conditional gradient method is a simple algorithm for minimizing a smooth convex function f over a convex set P, which in this work we assume  ... 
doi:10.1109/focs.2013.52 dblp:conf/focs/GarberH13 fatcat:kdhsloznnfdrbkuvgjih5ljgyy
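As a concrete sketch of the conditional gradient (Frank-Wolfe) template this entry describes: each iteration calls a linear optimization oracle over the feasible set and steps toward its answer. Over the probability simplex the oracle simply returns the vertex at the smallest gradient coordinate. The quadratic objective and the step-size rule 2/(t+2) below are the textbook choices, not this paper's algorithm.

```python
import numpy as np

def frank_wolfe_simplex(grad, x0, steps=200):
    """Minimize a smooth convex f over the probability simplex.

    grad: callable returning the gradient of f at x.
    The linear oracle argmin_{s in simplex} <grad, s> is attained at a
    vertex: the standard basis vector of the smallest gradient entry.
    """
    x = x0.copy()
    for t in range(steps):
        g = grad(x)
        s = np.zeros_like(x)
        s[np.argmin(g)] = 1.0           # single linear optimization step
        gamma = 2.0 / (t + 2.0)         # classical step-size rule
        x = (1 - gamma) * x + gamma * s # stay inside the simplex
    return x

# Example: minimize ||x - c||^2 with c already in the simplex, so x* = c.
c = np.array([0.2, 0.3, 0.5])
x = frank_wolfe_simplex(lambda x: 2 * (x - c), np.ones(3) / 3)
```

Note that the iterate is always a convex combination of at most t+1 vertices, which is the source of the sparsity properties discussed in several of the entries below.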

A Linearly Convergent Conditional Gradient Algorithm with Applications to Online and Stochastic Optimization [article]

Dan Garber, Elad Hazan
2015 arXiv   pre-print
Our main result is a novel conditional gradient algorithm for smooth and strongly convex optimization over polyhedral sets that performs only a single linear optimization step over the domain on each iteration  ...  Linear optimization over matroid polytopes, matching polytopes and path polytopes are examples of problems for which we have simple and efficient combinatorial algorithms, but whose non-linear convex counterpart  ...  The authors would like to thank Arkadi Nemirovski for numerous helpful comments on an earlier draft of this paper.  ... 
arXiv:1301.4666v6 fatcat:hlldiddq25cdreskycvqe736ae

Revisiting Frank-Wolfe for Polytopes: Strict Complementarity and Sparsity [article]

Dan Garber
2021 arXiv   pre-print
In recent years it was proved that simple modifications of the classical Frank-Wolfe algorithm (aka conditional gradient algorithm) for smooth convex minimization over convex and compact polytopes, converge  ...  In this paper we first demonstrate that already for very simple problems and even when the optimal solution lies on a low-dimensional face of the polytope, such dependence on the dimension cannot be avoided  ...  For optimization over convex and compact polytopes, in his classical book [29], Wolfe himself suggested a simple variant of the method that not only adds new vertices to the solution using the linear  ... 
arXiv:2006.00558v4 fatcat:acbtzmmgtrbibiz5cndwnvxzny

Faster Rates for the Frank-Wolfe Method over Strongly-Convex Sets [article]

Dan Garber, Elad Hazan
2015 arXiv   pre-print
In this paper we consider the special case of optimization over strongly convex sets, for which we prove that the vanilla FW method converges at a rate of 1/t^2.  ...  It is an active line of research to derive faster linear optimization-based algorithms for various settings of convex optimization.  ...  linear optimization steps over the feasible set.  ... 
arXiv:1406.1305v2 fatcat:nitpfdmbnrde3pes63mwnycqae
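The Euclidean ball is the simplest strongly convex set in the sense of this entry, and its linear optimization oracle is available in closed form, which is what keeps the vanilla FW step cheap there. A minimal sketch (function name and radius parameter are illustrative):

```python
import numpy as np

def lmo_euclidean_ball(c, radius=1.0):
    """Linear minimization oracle for the ball {s : ||s||_2 <= radius}:
    argmin_{||s|| <= radius} <c, s> = -radius * c / ||c||_2."""
    n = np.linalg.norm(c)
    if n == 0.0:
        return np.zeros_like(c)   # every feasible point is a minimizer
    return -radius * c / n
```

The same closed form covers any Euclidean ball after translation, and analogous one-line oracles exist for ellipsoids and l_p balls.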

Convex Optimization without Projection Steps [article]

Martin Jaggi
2011 arXiv   pre-print
For the general problem of minimizing a convex function over a compact convex domain, we will investigate a simple iterative approximation algorithm based on the method by Frank & Wolfe (1956), that does  ...  The method allows us to understand the sparsity of approximate solutions for any l1-regularized convex optimization problem (and for optimization over the simplex), expressed as a function of the approximation  ...  Credit for the important geometric interpretation of the duality gap over  ... 
arXiv:1108.1170v6 fatcat:w7wiokyl2fdureamsv25oneywu
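The duality gap this entry refers to has a one-line form: g(x) = <x - s, grad f(x)>, where s is the linear oracle's answer at x. It upper-bounds f(x) - f(x*) by convexity and comes for free inside each Frank-Wolfe iteration. A sketch specialized to the probability simplex (names are illustrative):

```python
import numpy as np

def fw_duality_gap(x, grad_x):
    """Frank-Wolfe duality gap over the probability simplex:
    g(x) = <x - s, grad f(x)>, with s = argmin_{simplex} <grad f(x), s>.
    By convexity, f(x) - f(x*) <= g(x), so g(x) is a stopping criterion."""
    s = np.zeros_like(x)
    s[np.argmin(grad_x)] = 1.0    # oracle answer: best vertex
    return float(np.dot(x - s, grad_x))
```

Since the gap vanishes exactly at optimality, it is the standard certificate for terminating projection-free methods.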

Approximate Convex Optimization by Online Game Playing [article]

Elad Hazan
2006 arXiv   pre-print
Recently, Bienstock and Iyengar, following Nesterov, gave an algorithm for fractional packing linear programs which runs in 1/ϵ iterations.  ...  The latter algorithm requires solving a convex quadratic program at every iteration - an optimization subroutine which dominates the theoretical running time.  ...  Otherwise, the oracle reduces to optimization of a convex non-linear function over a convex set.  ... 
arXiv:cs/0610119v1 fatcat:ojwgeyc2bza77fsnudtjwt7ywi

The Convex Optimization Approach to Regret Minimization [chapter]

2011 Optimization for Machine Learning  
In this survey we describe two general methods for deriving algorithms and analyzing them, with a "convex optimization flavor".  ...  Recently the design of algorithms for regret minimization in a wide array of settings has been influenced by tools from convex optimization.  ...  For the setting of online linear optimization, we also prove that the two templates are equivalent.  ... 
doi:10.7551/mitpress/8996.003.0012 fatcat:5k3d2hskjze4xmtm3rqt6egk7i

Page 6995 of Mathematical Reviews Vol. , Issue 91M [page]

1991 Mathematical Reviews  
Summary: "A new interior point method for minimizing a convex quadratic function over a polytope is developed. We show that our method requires O(n^{3.5}L) arithmetic operations.  ...  a sequence of nested convex sets that shrink towards the set of optimal solution(s).  ... 

Lift-and-project cuts for convex mixed integer nonlinear programs

Mustafa R. Kılınç, Jeff Linderoth, James Luedtke
2017 Mathematical Programming Computation  
Using this procedure, we are able to approximately optimize over the rank one lift-and-project closure for a variety of convex MINLP instances.  ...  We demonstrate that, as with the approach in [58], this simple approach may fail to find a violated lift-and-project cut when one exists. We therefore propose an iterative method that solves a sequence  ...  In the normal setting, we set = 1 for the simple and iterative versions of our algorithm and they are denoted as SIMPLE and ITERATIVE, respectively.  ... 
doi:10.1007/s12532-017-0118-1 fatcat:56kgxtboqnbjhi4buzh2tkcmv4

Robust Adaptive Beamforming for General-Rank Signal Model With Positive Semi-Definite Constraint via POTDC

Arash Khabbazibasmenj, Sergiy A. Vorobyov
2013 IEEE Transactions on Signal Processing  
Then, the optimal value function is replaced with another equivalent one, for which the corresponding optimization problem is convex.  ...  The new RAB method shows superior performance compared to the other state-of-the-art general-rank RAB methods.  ...  It is interesting to mention that this iterative linear approximation can also be interpreted in terms of the DC-iteration approach over the single non-convex term.  ... 
doi:10.1109/tsp.2013.2281301 fatcat:ax5egx2q2vh5powzxwyqez3pay

Stochastic subGradient Methods with Linear Convergence for Polyhedral Convex Optimization [article]

Tianbao Yang, Qihang Lin
2016 arXiv   pre-print
In this paper, we show that simple Stochastic subGradient Descent methods with multiple Restarting, named RSGD, can achieve a linear convergence rate for a class of non-smooth and non-strongly convex optimization  ...  To the best of our knowledge, this is the first result on the linear convergence rate of stochastic subgradient methods for non-smooth and non-strongly convex optimization problems.  ...  Acknowledgements We thank James Renegar for pointing out the connection to his work and for his valuable comments on the difference between the two works.  ... 
arXiv:1510.01444v5 fatcat:3u3w4374e5cqhgjfa6aan6oyue
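The restarting scheme behind RSGD can be sketched generically: run subgradient descent for a fixed number of inner steps with a constant step size, restart from the averaged iterate, and halve the step size. The skeleton below (deterministic subgradient on the sharp objective ||x||_1, illustrative constants) shows only the control flow, not this paper's parameter choices.

```python
import numpy as np

def restarted_subgradient(subgrad, x0, eta0=1.0, inner=100, rounds=6):
    """Restarted subgradient method: each round runs `inner` steps with a
    fixed step size, then restarts from the averaged iterate with the step
    size halved. The geometric step-size decay across rounds is what gives
    linear convergence on sharp (e.g. polyhedral) objectives."""
    x, eta = np.asarray(x0, dtype=float), eta0
    for _ in range(rounds):
        avg = np.zeros_like(x)
        for _ in range(inner):
            x = x - eta * subgrad(x)   # constant-step subgradient move
            avg += x
        x = avg / inner                # restart from the averaged iterate
        eta /= 2.0                     # halve the step size each round
    return x

# f(x) = ||x||_1 is sharp around its minimizer 0; a subgradient is sign(x).
x = restarted_subgradient(np.sign, np.array([5.0, -3.0]))
```

With a constant step size the iterates only reach an eta-neighborhood of the minimizer; halving eta each round shrinks that neighborhood geometrically, which is the intuition behind the linear rate.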

Unified Framework to Regularized Covariance Estimation in Scaled Gaussian Models

Ami Wiesel
2012 IEEE Transactions on Signal Processing  
Using a simple change of variables, we transform the problem into a convex optimization that can be efficiently solved.  ...  We propose a unified framework for regularizing this estimate in order to improve its finite sample performance. Our approach is based on the discovery of hidden convexity within the ML objective.  ...  Hero, III, for numerous discussions on the topic which led to this work. In particular, Y. Chen provided the majorization-minimization interpretation of Tyler's fixed point iteration.  ... 
doi:10.1109/tsp.2011.2170685 fatcat:dsegehcqlvaqdm7tovbcbfjnzu

Page 5500 of Mathematical Reviews Vol. , Issue 94i [page]

1994 Mathematical Reviews  
For solving the strictly convex quadratic programming problem, the algorithm of Hildreth is a well-known iterative method.  ...  Toshihide (J-K YOT; Kyoto) A successive over-relaxation method for quadratic programming problems with interval constraints.  ... 

Page 6957 of Mathematical Reviews Vol. , Issue 2002I [page]

2002 Mathematical Reviews  
Convergence in the consistent case is proven and an application to optimization over linear inequalities is given.”  ...  T. (1-TMPL-C; Philadelphia, PA) Averaging strings of sequential iterations for convex feasibility problems.  ... 