Information-theoretic lower bounds on the oracle complexity of stochastic convex optimization
[article]
2011
arXiv
pre-print
In this paper, we study the complexity of stochastic convex optimization in an oracle model of computation. ...
Relative to the large literature on upper bounds on complexity of convex optimization, lesser attention has been paid to the fundamental hardness of these problems. ...
We also thank the anonymous reviewers for helpful suggestions and corrections to our results, and for pointing out the optimality of our bounds in the primal-dual norm setting. ...
arXiv:1009.0571v3
fatcat:jdm7cmzcefexffqb6si2d5ulqu
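For orientation, the quantity that this line of work bounds is the minimax oracle complexity. A standard formalization (notation mine, not necessarily the paper's):

    \epsilon^*(T) \;=\; \inf_{A} \, \sup_{f \in \mathcal{F}} \; \mathbb{E}\big[\, f(x_T) - \min_{x \in S} f(x) \,\big],

where the infimum ranges over algorithms A issuing T oracle queries, the supremum over a class \mathcal{F} of convex functions on S, and x_T is the algorithm's output. The paper's results lower-bound how fast \epsilon^*(T) can decay with T under various oracle and norm assumptions.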
Information-Theoretic Lower Bounds on the Oracle Complexity of Stochastic Convex Optimization
2012
IEEE Transactions on Information Theory
In this paper, we study the complexity of stochastic convex optimization in an oracle model of computation. ...
Relative to the large literature on upper bounds on complexity of convex optimization, lesser attention has been paid to the fundamental hardness of these problems. ...
In addition, MJW received funding from the Air Force Office of Scientific Research (AFOSR-09NL184). We also thank the anonymous reviewers for their helpful suggestions and corrections to our results. ...
doi:10.1109/tit.2011.2182178
fatcat:756z6hmwbjbfhneaxvmvkzowha
Statistical Query Algorithms for Mean Vector Estimation and Stochastic Convex Optimization
2017
Proceedings of the Twenty-Eighth Annual ACM-SIAM Symposium on Discrete Algorithms
We study the complexity of stochastic convex optimization given only statistical query (SQ) access to the objective function. ...
For many cases of interest we derive nearly matching upper and lower bounds on the estimation (sample) complexity including linear optimization in the most general setting. ...
On the other hand, lower bounds for stochastic oracles (e.g., [1]) have a very different nature, and it is impossible to obtain superpolynomial lower bounds on the number of oracle calls (such as those we ...
doi:10.1137/1.9781611974782.82
dblp:conf/soda/FeldmanGV17
fatcat:y5uaxnh6sbfvnnhosnkx2upyc4
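A minimal simulation of the SQ access model from the entry above, assuming the usual formulation in which the oracle answers any bounded query to within a tolerance tau (the function name and the uniform-noise slack are illustrative, not the paper's formal definition):

    import numpy as np

    def sq_oracle(query, data, tau, rng=np.random.default_rng(0)):
        # Statistical query oracle sketch: for a query q : Z -> [0, 1],
        # return some value within tolerance tau of E[q(z)].  Here the
        # expectation is approximated by an empirical mean and the slack
        # is simulated with uniform noise; a truly adversarial oracle may
        # return any value in the tau-interval around the expectation.
        mean = np.mean([query(z) for z in data])
        return mean + rng.uniform(-tau, tau)

Algorithms in this model touch the objective only through such queries, which is what makes the superpolynomial query lower bounds mentioned above possible for SQ oracles but not for general stochastic oracles.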
The Complexity of Making the Gradient Small in Stochastic Convex Optimization
[article]
2019
arXiv
pre-print
We give nearly matching upper and lower bounds on the oracle complexity of finding ϵ-stationary points (‖∇F(x)‖ ≤ ϵ) in stochastic convex optimization. ...
This allows us to decompose the complexity of finding near-stationary points into optimization complexity and sample complexity, and reveals some surprising differences between the complexity of stochastic ...
Part of this work was completed while DF was at Cornell University and supported by the Facebook Ph.D. fellowship. OS is partially supported by a European Research Council (ERC) grant. ...
arXiv:1902.04686v2
fatcat:ddl3n2zfcfardg2yjethtyozni
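The abstract above concerns the first of two distinct accuracy criteria for a smooth convex F:

    \|\nabla F(\hat{x})\| \le \epsilon \qquad \text{versus} \qquad F(\hat{x}) - \min_{x} F(x) \le \epsilon,

that is, an approximate stationary point rather than an approximate minimizer; the decomposition mentioned in the abstract splits the cost of the first goal into an optimization term and a sample-complexity term.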
Exploring the intersection of active learning and stochastic convex optimization
2013
2013 IEEE Global Conference on Signal and Information Processing
First order stochastic convex optimization is an extremely well-studied area with a rich history of over a century of optimization research. ...
Over the last year, we have uncovered concrete theoretical and algorithmic connections between these two fields, due to their inherently sequential nature and decision-making based on feedback of earlier ...
FIRST-ORDER STOCHASTIC CONVEX OPTIMIZATION Consider an unknown function f on a bounded set S ⊂ R^d that is k-uniformly convex (k-UC) and L-Lipschitz, with minimizer x* = arg min_{x∈S} f(x), i.e. for constants ...
doi:10.1109/globalsip.2013.6737091
dblp:conf/globalsip/RamdasS13
fatcat:vqt2b75si5aifirdimk7xoropy
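Spelled out, the two regularity conditions named in this entry take the standard forms (the modulus μ is my notation; the paper may normalize differently): for all x, y ∈ S,

    f(y) \ge f(x) + \langle \nabla f(x),\, y - x \rangle + \tfrac{\mu}{2}\,\|y - x\|^{k} \quad (k\text{-uniform convexity, } k \ge 2),
    |f(x) - f(y)| \le L\,\|x - y\| \quad (L\text{-Lipschitz continuity}),

with k = 2 recovering ordinary strong convexity.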
Information complexity of black-box convex optimization: A new look via feedback information theory
2009
2009 47th Annual Allerton Conference on Communication, Control, and Computing (Allerton)
As a bonus, we give a particularly simple derivation of the minimax lower bound for a certain active learning problem on the unit interval. ...
LOWER BOUNDS FOR ARBITRARY ALGORITHMS We now describe our information-theoretic method for determining lower bounds on the information complexity of convex programming. ...
doi:10.1109/allerton.2009.5394945
fatcat:7h6t7spiwfd3zbwdcqw4tpc2ty
Stochastic Continuous Greedy ++: When Upper and Lower Bounds Match
2019
Neural Information Processing Systems
We further provide an information-theoretic lower bound to showcase the necessity of O(1/ϵ²) stochastic oracle queries in order to achieve [(1 − 1/e)OPT − ϵ] for monotone and DR-submodular functions. ...
In this paper, we develop Stochastic Continuous Greedy++ (SCG++), the first efficient variant of a conditional gradient method for maximizing a continuous submodular function subject to a convex constraint ...
Acknowledgment: The work of H. Hassani ...
dblp:conf/nips/KarbasiHMS19
fatcat:3dpda5yquve5foospuwhd7cdp4
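For context, SCG++ refines the continuous greedy template sketched below; this is the plain averaged-gradient stochastic variant, not SCG++ itself, whose variance-reduced gradient estimator is the paper's contribution (function names and the averaging weight are illustrative):

    import numpy as np

    def stochastic_continuous_greedy(stoch_grad, lmo, d, T):
        # stoch_grad(x): unbiased stochastic gradient of the monotone
        #   DR-submodular objective F at x.
        # lmo(g): linear maximization oracle, argmax_{v in C} <v, g>
        #   over the convex constraint set C.
        x = np.zeros(d)
        g = np.zeros(d)
        for t in range(1, T + 1):
            rho = 2.0 / (t + 3) ** (2.0 / 3.0)       # one common averaging schedule
            g = (1 - rho) * g + rho * stoch_grad(x)  # averaged gradient estimate
            v = lmo(g)                               # best direction inside C
            x = x + v / T                            # continuous greedy step size 1/T
        return x

After T steps x is the average of T points of C, hence feasible for convex C, and the running average of gradient estimates tames the stochastic noise.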
Reproducibility in Optimization: Theoretical Framework and Limits
[article]
2022
arXiv
pre-print
We then analyze several convex optimization settings of interest such as smooth, non-smooth, and strongly-convex objective functions and establish tight bounds on the limits of reproducibility in each ...
We define a quantitative measure of reproducibility of optimization procedures in the face of noisy or error-prone operations such as inexact or stochastic gradient computations or inexact initialization ...
B Information-theoretic lower bounds B.1 Information-theoretic lower bound for stochastic inexact gradient model We state and prove the information-theoretic lower bound for smooth costs. Theorem 6. ...
arXiv:2202.04598v2
fatcat:rdlgwqq5jvbj5kdv7tvtmsxx7y
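One plausible instantiation of the paper's quantitative measure (an empirical proxy I am assuming, not the paper's exact definition) is the expected squared distance between the outputs of two independent runs of the same procedure on the same problem:

    import numpy as np

    def sgd_with_inexact_grads(grad, x0, steps, lr, sigma, rng):
        # SGD in which every gradient is corrupted by Gaussian noise of
        # scale sigma, modeling an inexact/stochastic gradient oracle.
        x = np.array(x0, dtype=float)
        for _ in range(steps):
            x = x - lr * (grad(x) + sigma * rng.standard_normal(x.shape))
        return x

    def reproducibility_gap(grad, x0, steps, lr, sigma, trials=20, seed=0):
        # Average ||x1 - x2||^2 over independent pairs of runs: zero for a
        # perfectly reproducible procedure, large when oracle noise steers
        # the two runs to different outputs.
        rng = np.random.default_rng(seed)
        gaps = []
        for _ in range(trials):
            r1 = np.random.default_rng(rng.integers(2**32))
            r2 = np.random.default_rng(rng.integers(2**32))
            x1 = sgd_with_inexact_grads(grad, x0, steps, lr, sigma, r1)
            x2 = sgd_with_inexact_grads(grad, x0, steps, lr, sigma, r2)
            gaps.append(float(np.sum((x1 - x2) ** 2)))
        return float(np.mean(gaps))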
Zeroth-order (Non)-Convex Stochastic Optimization via Conditional Gradient and Gradient Updates
2018
Neural Information Processing Systems
In this paper, we propose and analyze zeroth-order stochastic approximation algorithms for nonconvex and convex optimization. ...
Next, we propose a truncated stochastic gradient algorithm with zeroth-order information, whose rate depends only poly-logarithmically on the dimensionality. ...
Note that the linear dependence of our complexity bounds on d is unimprovable due to the lower bounds for zeroth-order algorithms applied to convex optimization problems [7]. ...
dblp:conf/nips/Balasubramanian18
fatcat:3kf3clihxbdsfiihxsyvrtdvu4
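The workhorse of zeroth-order stochastic approximation is a finite-difference gradient surrogate; the sketch below is the textbook two-point Gaussian-smoothing estimator (the paper's estimators may differ in detail):

    import numpy as np

    def zo_gradient_estimate(f, x, mu, rng=np.random.default_rng(0)):
        # Two-point estimator g = (f(x + mu*u) - f(x)) / mu * u with
        # u ~ N(0, I): an unbiased gradient estimate of the smoothed
        # function f_mu(x) = E_u[f(x + mu*u)], built from function values
        # of f alone.  Its variance grows with the dimension d, which is
        # the source of the linear d-dependence discussed above.
        u = rng.standard_normal(x.shape)
        return (f(x + mu * u) - f(x)) / mu * u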
Linear Convergence with Condition Number Independent Access of Full Gradients
2013
Neural Information Processing Systems
For smooth and strongly convex optimization, the optimal iteration complexity of gradient-based algorithms is O(√κ log(1/ϵ)), where κ is the condition number. ...
In this paper, we propose to remove the dependence on the condition number by allowing the algorithm to access stochastic gradients of the objective function. ...
When the norm of the data is bounded, the smoothness parameter L can be treated as a constant. The strong convexity parameter λ is lower bounded by τ.
dblp:conf/nips/0005MJ13
fatcat:ich5elgrwfhtbiwomavs7pheei
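To unpack the rate in this entry (standard facts for L-smooth, λ-strongly convex objectives):

    \kappa = L/\lambda, \qquad \text{gradient descent: } O(\kappa \log(1/\epsilon)), \qquad \text{accelerated (optimal): } O(\sqrt{\kappa}\,\log(1/\epsilon)),

so the paper's point is that supplementing full gradients with stochastic ones can make the number of full-gradient accesses independent of κ.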
Information-Based Complexity, Feedback and Dynamics in Convex Programming
2011
IEEE Transactions on Information Theory
Index Terms-Convex optimization, Fano's inequality, feedback information theory, hypothesis testing with controlled observations, information-based complexity, information-theoretic converse, minimax lower ...
This, in turn, puts limits on the speed of optimization under specific assumptions on the oracle and the type of feedback. ...
In particular, we would like to thank one reviewer for suggesting the definition of a strong infinite-step algorithm. ...
doi:10.1109/tit.2011.2154375
fatcat:5bopafwkgfg3bme52eyxsh22ja
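The Fano-type converse behind these bounds reduces optimization to hypothesis testing among M well-separated problem instances; in its basic form (a standard statement, not quoted from the paper):

    \mathbb{P}(\hat{V} \neq V) \;\ge\; 1 - \frac{I(V; Y) + \log 2}{\log M},

where V is a uniformly random instance index, Y is the transcript of oracle answers the algorithm observes, and I(V; Y) is their mutual information: if each oracle answer carries little information about V, many queries are needed before the testing error, and hence the optimization error, can be small.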
Information-based complexity, feedback and dynamics in convex programming
[article]
2011
arXiv
pre-print
We study the intrinsic limitations of sequential convex optimization through the lens of feedback information theory. ...
This, in turn, puts limits on the speed of optimization under specific assumptions on the oracle and the type of feedback. ...
In particular, we would like to thank one reviewer for suggesting the definition of a strong infinite-step algorithm. ...
arXiv:1010.2285v3
fatcat:ts5v3wpzqzerthvimioppdd3li
The Min-Max Complexity of Distributed Stochastic Convex Optimization with Intermittent Communication
[article]
2021
arXiv
pre-print
We resolve the min-max complexity of distributed stochastic convex optimization (up to a log factor) in the intermittent communication setting, where M machines work in parallel over the course of R rounds ...
We present a novel lower bound with a matching upper bound that establishes an optimal algorithm. ...
Minimax bounds on stochastic batched convex optimization. ...
arXiv:2102.01583v2
fatcat:yypfhqxtzjdvxn5kua3qtofbqu
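The round structure of the intermittent communication setting is easiest to see in a local-SGD-style sketch (local SGD is one algorithm that fits the setting above, not necessarily the optimal one the paper identifies):

    import numpy as np

    def local_sgd(stoch_grad, x0, M, R, K, lr):
        # M machines each take K local stochastic gradient steps per round
        # (stoch_grad is assumed to draw a fresh sample on each call), then
        # average their iterates once per round, for R rounds: M*K*R oracle
        # queries in total but only R communication steps.
        x = np.array(x0, dtype=float)
        for _ in range(R):
            local_iterates = []
            for _ in range(M):
                xm = x.copy()
                for _ in range(K):
                    xm = xm - lr * stoch_grad(xm)
                local_iterates.append(xm)
            x = np.mean(local_iterates, axis=0)
        return x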
Efficient Smooth Non-Convex Stochastic Compositional Optimization via Stochastic Recursive Gradient Descent
2019
Neural Information Processing Systems
Such a complexity is known to be the best among existing IFO complexity results for non-convex stochastic compositional optimization. ...
The objective function is the composition of two expectations of stochastic functions, and is more challenging to optimize than vanilla stochastic optimization problems. ...
Future directions include handling the non-smooth case and the theory of lower bounds for stochastic compositional optimization. ...
dblp:conf/nips/YuanLLLH19
fatcat:l3ld7pyycbdjnmdca7jvjxl2qq
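The problem class in this entry is two-level compositional stochastic optimization (a standard formulation; symbol names are mine):

    \min_{x} \; F(x) = f\big(g(x)\big), \qquad f(y) = \mathbb{E}_{v}[\, f_v(y) \,], \quad g(x) = \mathbb{E}_{w}[\, g_w(x) \,].

It is harder than vanilla stochastic optimization because single samples of g_w inside the nonlinear f do not yield an unbiased estimate of \nabla F(x) = \nabla g(x)^{\top} \nabla f(g(x)); IFO complexity counts evaluations of the component functions and their derivatives.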
The Minimax Complexity of Distributed Optimization
[article]
2021
arXiv
pre-print
In this thesis, I study the minimax oracle complexity of distributed stochastic optimization. ...
First, I present the "graph oracle model", an extension of the classic oracle complexity framework that can be applied to study distributed optimization algorithms. ...
Before we proceed, we also provide a simple lower bound on the "statistical term," which corresponds to the information-theoretic difficulty of optimizing on the basis of only |V| samples. ...
arXiv:2109.00534v1
fatcat:ibkwtyfd3bawzftakx7ebpwod4
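The "statistical term" in this entry is the sample-limited part of the rate; for L-Lipschitz convex objectives over a domain of radius B, a classical information-theoretic bound (stated informally here) says that any method seeing only n = |V| samples suffers, in the worst case,

    \mathbb{E}[\, F(\hat{x}) - \min_{x} F(x) \,] \;\gtrsim\; \frac{L B}{\sqrt{n}},

no matter how the samples are spread across machines or how much computation is spent.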