
Oracle complexity of second-order methods for smooth convex optimization

Yossi Arjevani, Ohad Shamir, Ron Shiff
2018 Mathematical Programming
Finite-sum optimization problems are ubiquitous in machine learning, and are commonly solved using first-order methods which rely on gradient computations.  ...  Although computing and manipulating Hessians is prohibitive for high-dimensional problems in general, the Hessians of individual functions in finite-sum problems can often be efficiently computed, e.g.  ...  Strongly Convex and Smooth Optimization with a Second-Order Oracle Before presenting our main results for finite-sum optimization problems, we consider the simpler problem of minimizing a single strongly-convex  ... 
doi:10.1007/s10107-018-1293-1 fatcat:wnp3xcugwbgepkpxrsp3yha2zu
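
For orientation, the setting studied in this entry and the next can be written out explicitly. This is the standard finite-sum formulation with a component-wise second-order oracle, stated here for context rather than quoted from either paper:

```latex
% Finite-sum problem: n components, each L-smooth, average mu-strongly convex.
\min_{x \in \mathbb{R}^d} \; F(x) = \frac{1}{n} \sum_{i=1}^{n} f_i(x),
\qquad
\mathcal{O}_2(i, x) = \bigl( f_i(x), \; \nabla f_i(x), \; \nabla^2 f_i(x) \bigr).
```

Oracle complexity then counts the calls to $\mathcal{O}_2$ needed to reach a point $x$ with $F(x) - \min F \le \epsilon$.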

Oracle Complexity of Second-Order Methods for Finite-Sum Problems [article]

Yossi Arjevani, Ohad Shamir
2017 arXiv   pre-print
Finite-sum optimization problems are ubiquitous in machine learning, and are commonly solved using first-order methods which rely on gradient computations.  ...  Although computing and manipulating Hessians is prohibitive for high-dimensional problems in general, the Hessians of individual functions in finite-sum problems can often be efficiently computed, e.g.  ...  Second-Order Oracle Complexity Bounds for Finite-Sum Problems We now turn to study finite-sum problems of the form given in Eq. (1), and provide lower bounds on the number of oracle calls required to solve  ... 
arXiv:1611.04982v2 fatcat:vvcg3i7burfn5bksiyo42pltbq

The Power of First-Order Smooth Optimization for Black-Box Non-Smooth Problems [article]

Alexander Gasnikov, Anton Novitskii, Vasilii Novitskii, Farshed Abdukhakimov, Dmitry Kamzolov, Aleksandr Beznosikov, Martin Takáč, Pavel Dvurechensky, Bin Gu
2022 arXiv   pre-print
zeroth-order algorithms for non-smooth convex optimization problems.  ...  Gradient-free/zeroth-order methods for black-box convex optimization have been extensively studied in the last decade, with the main focus on oracle call complexity.  ...
arXiv:2201.12289v3 fatcat:zs3goba4qnfenkre5vyawyug6i
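
The estimators such zeroth-order methods feed into a first-order template are typically randomized finite differences. A minimal sketch, assuming a two-point estimator with smoothing radius tau (names and constants are illustrative, not from the paper):

```python
import numpy as np

def two_point_grad_estimate(f, x, tau=1e-4, rng=np.random.default_rng(0)):
    """Randomized two-point gradient estimator for a black-box f.

    Draws a uniform direction e on the unit sphere and returns
    d * (f(x + tau*e) - f(x - tau*e)) / (2*tau) * e, an unbiased
    gradient estimate of a smoothed version of f.
    """
    e = rng.normal(size=x.shape)
    e /= np.linalg.norm(e)  # uniform direction on the sphere
    fd = (f(x + tau * e) - f(x - tau * e)) / (2 * tau)
    return x.size * fd * e

# Example: for f(x) = 0.5*||x||^2 the estimates average to the gradient x.
f = lambda x: 0.5 * np.dot(x, x)
g = two_point_grad_estimate(f, np.array([1.0, -2.0, 0.5]))
```

Each estimate costs two function evaluations, which is why oracle-call complexity is the natural yardstick in this literature.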

A Lower Bound for the Optimization of Finite Sums [article]

Alekh Agarwal, Leon Bottou
2015 arXiv   pre-print
This paper presents a lower bound for optimizing a finite sum of n functions, where each function is L-smooth and the sum is μ-strongly convex.  ...  When the functions involved in this sum are not arbitrary but based on i.i.d. random data, we further contrast these complexity results with those for optimal first-order methods to directly optimize  ...
arXiv:1410.0723v4 fatcat:352cw5wfknc53asldkjztrw6za
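
For context, the now-standard tight answer in this regime (each f_i is L-smooth, the average is μ-strongly convex, κ = L/μ) is, for incremental first-order methods and up to constants,

```latex
\Omega\Bigl( n + \sqrt{n \kappa}\, \log \frac{1}{\epsilon} \Bigr),
\qquad \kappa = L / \mu,
```

component-gradient evaluations; this is stated from general knowledge of the follow-up literature, so consult the paper for its exact constants and assumptions.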

Level-Set Methods for Finite-Sum Constrained Convex Optimization

Qihang Lin, Runchao Ma, Tianbao Yang
2018 International Conference on Machine Learning  
We consider constrained optimization where the objective function and the constraints are defined as sums of finitely many loss functions.  ...  The special divergence ensures that the proximal mapping in each iteration can be solved in closed form. The total complexity of both level-set methods using the proposed oracle is analyzed.  ...
dblp:conf/icml/LinMY18 fatcat:ujvhazge35dzfcqlobc5uactte
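
The level-set template behind such methods can be summarized generically; a sketch in standard notation (the paper's contribution specializes each function to a finite sum and supplies efficient oracles for the subproblems):

```latex
% Constrained problem and its level-set (root-finding) reformulation.
\min_{x \in X} f_0(x) \quad \text{s.t.} \quad f_i(x) \le 0, \; i = 1, \dots, m,
\qquad
H(r) = \min_{x \in X} \max\bigl\{ f_0(x) - r, \, f_1(x), \dots, f_m(x) \bigr\}.
```

The optimal value $r^\star$ is a root of the monotone function $H$, so the constrained problem reduces to a one-dimensional search over $r$ whose inner min-max subproblems are solved approximately.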

Structured Logconcave Sampling with a Restricted Gaussian Oracle [article]

Yin Tat Lee, Ruoqi Shen, Kevin Tian
2021 arXiv   pre-print
For composite densities exp(-f(x) - g(x)), where f has condition number κ and convex (but possibly non-smooth) g admits an RGO, we obtain a mixing time of O(κd log³(κd/ε)), matching the state-of-the-art  ...  we also show a zeroth-order algorithm attains the same query complexity.  ...
arXiv:2010.03106v4 fatcat:qgtfuq5jxjbdrjtb2iez4cmu34
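
As used in this line of work, the restricted Gaussian oracle (RGO) for g is, roughly, a sampling analogue of the proximal operator (notation mine):

```latex
% RGO: given a center y and step size eta, return one sample from
\pi_y(x) \;\propto\; \exp\Bigl( -g(x) - \frac{1}{2\eta} \|x - y\|^2 \Bigr).
```

Composite samplers alternate Gaussian-style moves driven by the smooth f with RGO calls that handle the possibly non-smooth g.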

Oracle Complexity Separation in Convex Optimization [article]

Anastasiya Ivanova, Evgeniya Vorontsova, Dmitry Pasechnyuk, Alexander Gasnikov, Pavel Dvurechensky, Darina Dvinskikh, Alexander Tyurin
2022 arXiv   pre-print
In the strongly convex case these functions also have different condition numbers, which eventually define the iteration complexity of first-order methods and the number of oracle calls required to achieve  ...  Our general framework also covers the setting of strongly convex objectives, the setting when g is given by a coordinate-derivative oracle, and the setting when g has a finite-sum structure and is available  ...  Even in the first-order oracle model, the evaluation of the first-order oracle for one term in the sum can be more expensive than for another.  ...
arXiv:2002.02706v4 fatcat:wjl4kv6jlfdd3jgx4x6nkoehey
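
The separation goal in the simplest smooth convex case can be stated schematically; assuming f and g have gradient Lipschitz constants L_f and L_g, the aim is

```latex
\min_x \; f(x) + g(x):
\qquad
O\bigl(\sqrt{L_f/\epsilon}\bigr) \text{ calls to } \nabla f
\quad\text{and}\quad
O\bigl(\sqrt{L_g/\epsilon}\bigr) \text{ calls to } \nabla g,
```

instead of $O(\sqrt{(L_f + L_g)/\epsilon})$ calls to each, so the cheaper oracle absorbs most of the work. This is a schematic reading of the abstract, not the paper's exact theorem.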

First-order methods of smooth convex optimization with inexact oracle

Olivier Devolder, François Glineur, Yurii Nesterov
2013 Mathematical Programming
Finally, we show that the notion of inexact oracle allows the application of first-order methods of smooth convex optimization to solve non-smooth or weakly smooth convex problems.  ...  We introduce the notion of inexact first-order oracle and analyze the behaviour of several first-order methods of smooth convex optimization used with such an oracle.  ...  Keywords Smooth convex optimization, first-order methods, inexact oracle, gradient methods, fast gradient methods, complexity bounds.  ... 
doi:10.1007/s10107-013-0677-5 fatcat:f2ecrevzuza6tb6igw6k4ywje4
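
The paper's central definition is short enough to restate; as I recall it, a pair $(f_{\delta,L}(x), g_{\delta,L}(x))$ is a first-order $(\delta, L)$-oracle for a convex $f$ at $x$ if for all $y$

```latex
0 \;\le\; f(y) - f_{\delta,L}(x) - \langle g_{\delta,L}(x), \, y - x \rangle
  \;\le\; \frac{L}{2} \|y - x\|^2 + \delta .
```

An exact gradient is the case $\delta = 0$; non-smooth and weakly smooth problems admit $(\delta, L)$-oracles with $\delta > 0$, which is what lets smooth first-order methods run on them at the cost of an additive error accumulation.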

Tight Complexity Bounds for Optimizing Composite Objectives [article]

Blake Woodworth, Nathan Srebro
2019 arXiv   pre-print
We provide tight upper and lower bounds on the complexity of minimizing the average of m convex functions using gradient and prox oracles of the component functions.  ...  For non-smooth functions, having access to prox oracles reduces the complexity, and we present optimal methods based on smoothing that improve over methods using just gradient accesses.  ...  This technique therefore reduces the task of optimizing an instance of an L-Lipschitz finite sum to that of optimizing an (L²/ε)-smooth finite sum.  ...
arXiv:1605.08003v3 fatcat:w7uhhca5qfaxld4gg25nsthezy
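
The $(L^2/\epsilon)$-smooth surrogate in the last fragment comes from smoothing; the standard construction with prox access is the Moreau envelope (stated from general knowledge, the paper's variant may differ in details):

```latex
M_\lambda f(x) = \min_u \Bigl\{ f(u) + \frac{1}{2\lambda} \|u - x\|^2 \Bigr\},
\qquad
0 \;\le\; f(x) - M_\lambda f(x) \;\le\; \frac{\lambda L^2}{2},
```

where $M_\lambda f$ is $(1/\lambda)$-smooth for convex $f$ and its gradient is computable from the prox of $f$. Choosing $\lambda = \epsilon / L^2$ yields an $(L^2/\epsilon)$-smooth approximation of an $L$-Lipschitz $f$ to accuracy $\epsilon/2$.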

Recent Theoretical Advances in Non-Convex Optimization [article]

Marina Danilova, Pavel Dvurechensky, Alexander Gasnikov, Eduard Gorbunov, Sergey Guminov, Dmitry Kamzolov, Innokentiy Shibaev
2021 arXiv   pre-print
overview of recent theoretical results on global performance guarantees of optimization algorithms for non-convex optimization.  ...  Then we consider higher-order and zeroth-order/derivative-free methods and their convergence rates for non-convex optimization problems.  ...
arXiv:2012.06188v3 fatcat:6cwwns3pnba5zbodlhddof6xai

Small errors in random zeroth-order optimization are imaginary [article]

Wouter Jongeneel, Man-Chung Yue, Daniel Kuhn
2022 arXiv   pre-print
Most zeroth-order optimization algorithms mimic a first-order algorithm but replace the gradient of the objective function with some noisy gradient estimator that can be computed from a small number of  ...  imaginary parts of the order δ.  ...
arXiv:2103.05478v3 fatcat:p7gwroixp5ch3jf66f4vfsgpye
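
The 'imaginary' in the title refers to complex-step ideas. For intuition, a minimal sketch of classical deterministic complex-step differentiation (this is background, not the paper's randomized estimator, which perturbs along random complex directions):

```python
import numpy as np

def complex_step_grad(f, x, delta=1e-20):
    """Complex-step derivative: df/dx_k ~ Im(f(x + i*delta*e_k)) / delta.

    No subtractive cancellation occurs, so delta can be tiny; f must be
    real-analytic and implemented with complex-safe operations.
    """
    g = np.empty_like(x)
    for k in range(x.size):
        xc = x.astype(complex)
        xc[k] += 1j * delta
        g[k] = f(xc).imag / delta
    return g

# Example: gradient of exp(x0)*sin(x1), matching the analytic answer.
f = lambda x: np.exp(x[0]) * np.sin(x[1])
print(complex_step_grad(f, np.array([0.3, 1.2])))
```

The errors introduced this way sit in imaginary parts of order δ, which is the effect the title alludes to.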

Accelerated Convex Optimization with Stochastic Gradients: Generalizing the Strong-Growth Condition [article]

Víctor Valls, Shiqiang Wang, Yuang Jiang, Leandros Tassiulas
2022 arXiv   pre-print
for finite-sum problems such as SAGA).  ...  The new condition has the strong-growth condition by Schmidt & Roux as a special case, and it also allows us to (i) model problems with constraints and (ii) design new types of oracles (e.g., oracles  ...
arXiv:2207.11833v1 fatcat:r4agh3vxufbv7brl6tvitqgwcq
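
For reference, the strong-growth condition of Schmidt & Roux being generalized bounds the stochastic gradient's second moment by the full gradient (standard statement from the literature):

```latex
\mathbb{E}_{\xi} \bigl[ \| \nabla f(x; \xi) \|^2 \bigr]
\;\le\; \rho \, \| \nabla F(x) \|^2
\qquad \text{for all } x, \quad F(x) = \mathbb{E}_{\xi}[ f(x; \xi) ].
```

It forces the noise to vanish at stationary points, which is what makes accelerated rates with plain stochastic gradients possible; the paper's condition relaxes this to cover constraints and oracle designs such as SAGA-style ones.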

Decentralized and Parallel Primal and Dual Accelerated Methods for Stochastic Convex Programming Problems [article]

Darina Dvinskikh, Alexander Gasnikov
2021 arXiv   pre-print
However, for all classes of the objective, optimality in terms of the number of oracle calls per node holds only up to a logarithmic factor and up to the notion of smoothness.  ...  Both for primal and dual oracles, the proposed methods are optimal in terms of the number of communication steps.  ...  The results below generalize [25] to a proper accelerated method (STM). Table 2 gives the optimal number of stochastic (unbiased) first-order oracle calls in the μ-strongly convex and L-smooth, the L-smooth, and the μ-strongly convex settings.  ...
arXiv:1904.09015v17 fatcat:7j5ueplfsbcshfv75kd7nxndne

Stochastic Conditional Gradient++ [article]

Hamed Hassani, Amin Karbasi, Aryan Mokhtari, Zebang Shen
2020 arXiv   pre-print
We develop Stochastic Frank-Wolfe++ (SFW++), an efficient variant of the conditional gradient method for minimizing a smooth non-convex function subject to a convex body constraint.  ...  In this paper, we consider general non-oblivious stochastic optimization, where the underlying stochasticity may change during the optimization procedure and depends on the point at which the function  ...  However, in these results, the convergence rate explicitly depends on the higher-order smoothness parameter. In the constrained case, whether higher-order smoothness helps remains unknown.  ...
arXiv:1902.06992v4 fatcat:b3ojgufr2jbzfhjvlroycl3v44
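
As background for the conditional-gradient oracle model, a deterministic Frank-Wolfe sketch over the probability simplex (illustrative only; SFW++ handles stochastic non-oblivious gradients and general convex bodies):

```python
import numpy as np

def frank_wolfe_simplex(grad, x0, steps=100):
    """Frank-Wolfe over the probability simplex.

    Each iteration calls a linear minimization oracle instead of a
    projection: over the simplex, argmin_v <g, v> is the vertex e_i
    with i = argmin_i g_i.
    """
    x = x0.copy()
    for t in range(steps):
        g = grad(x)
        v = np.zeros_like(x)
        v[np.argmin(g)] = 1.0    # LMO: best simplex vertex
        gamma = 2.0 / (t + 2.0)  # standard open-loop step size
        x = (1 - gamma) * x + gamma * v
    return x

# Example: project c onto the simplex by minimizing ||x - c||^2.
c = np.array([0.2, 0.5, 0.3])
x = frank_wolfe_simplex(lambda x: 2 * (x - c), np.ones(3) / 3)
```

The iterate stays a convex combination of vertices, so feasibility is maintained without projections, which is the method's appeal for structured constraint sets.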

On variance reduction for stochastic smooth convex optimization with multiplicative noise [article]

Alejandro Jofré, Philip Thompson
2017 arXiv   pre-print
The objective is the sum of a smooth convex function and a convex regularizer. Typically, an oracle with an upper bound σ² on its variance (OUBV) is assumed.  ...  For the smooth convex class, we use an accelerated SA method à la FISTA which achieves, given tolerance ε>0, the optimal iteration complexity of O(ε^{-1/2}) with a near-optimal oracle complexity of O(ε^-  ...  If one oracle call per iteration is assumed, such a scheme obtains optimal iteration and oracle complexities of O(ε^{-2}) for the smooth convex class and O(ε^{-1}) for the smooth strongly convex class.  ...
arXiv:1705.02969v4 fatcat:5bgc2xtccbdcdk5iodrhgcdlme
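
Spelled out, the setting and the OUBV assumption the paper starts from are (notation mine; OUBV is just the formal version of "an upper bound σ² on the variance"):

```latex
\min_x \; F(x) = f(x) + h(x),
\qquad
\mathbb{E}_{\xi} \bigl[ \| G(x; \xi) - \nabla f(x) \|^2 \bigr] \;\le\; \sigma^2,
```

with $f$ smooth convex, $h$ a convex regularizer, and $G(x;\xi)$ the oracle's stochastic gradient; the paper contrasts this uniform bound with a multiplicative, point-dependent one.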