5,281 Hits in 4.0 sec

Sub-sampled Cubic Regularization for Non-convex Optimization [article]

Jonas Moritz Kohler, Aurelien Lucchi
2017 arXiv   pre-print
To the best of our knowledge this is the first work that gives global convergence guarantees for a sub-sampled variant of cubic regularization on non-convex functions.  ...  We consider the minimization of non-convex functions that typically arise in machine learning. Specifically, we focus our attention on a variant of trust region methods known as cubic regularization.  ...  $\ge \log((2d)/\delta) \;\Leftrightarrow\; \ge 4\sqrt{2}\,\kappa_f \sqrt{\big(\log((2d)/\delta) + 1/4\big)/|S_g|}$ (46). Conversely, the probability of a deviation of $< 4\sqrt{2}\,\kappa_f \sqrt{\big(\log((2d)/\delta) + 1/4\big)/|S_g|}$  ... 
arXiv:1705.05933v3 fatcat:zx2rrkfgsfhfhd2vhjipljquvi
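For context on the method this entry summarizes: cubic regularization replaces the trust-region constraint with a cubic penalty on the step, and the sub-sampled variant builds the model from mini-batch derivatives. A minimal statement of the per-iteration subproblem, with sample sets $S_g$, $S_H$ and penalty weight $\sigma_k$ used as generic symbols rather than the paper's exact notation:

```latex
% Sub-sampled cubic model at iterate x_k (notation illustrative):
% g_k = mini-batch gradient over S_g,  B_k = mini-batch Hessian over S_H.
s_k = \arg\min_{s \in \mathbb{R}^d}\; g_k^\top s
      + \tfrac{1}{2}\, s^\top B_k\, s
      + \tfrac{\sigma_k}{3}\, \lVert s \rVert^3,
\qquad x_{k+1} = x_k + s_k .
```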

Sub-sampled Cubic Regularization for Non-convex Optimization

Jonas Kohler, Aurelien Lucchi
2017
To the best of our knowledge this is the first work that gives global convergence guarantees for a sub-sampled variant of cubic regularization on non-convex functions.  ...  We consider the minimization of non-convex functions that typically arise in machine learning. Specifically, we focus our attention on a variant of trust region methods known as cubic regularization.  ...  Cubic regularization and trust region methods. Trust region methods are among the most effective algorithmic frameworks to avoid pitfalls such as local saddle points in non-convex optimization.  ... 
doi:10.3929/ethz-b-000223160 fatcat:7pbtbwxtn5ds5hgxe7uaxzeoyi
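Since this entry frames cubic regularization as a trust-region variant, a toy sketch of a single step may help. The cubic model is minimized here by plain gradient descent on $s$, purely for illustration; practical implementations (including this paper's) use Krylov/Lanczos subproblem solvers, and all names below are hypothetical:

```python
import numpy as np

def cubic_step(g, B, sigma, lr=0.01, iters=500):
    """Approximately minimize the cubic model
        m(s) = g^T s + 0.5 * s^T B s + (sigma/3) * ||s||^3
    by gradient descent on s (illustrative subproblem solver only).
    """
    s = np.zeros_like(g)
    for _ in range(iters):
        grad_m = g + B @ s + sigma * np.linalg.norm(s) * s  # gradient of m(s)
        s -= lr * grad_m
    return s

# Toy usage with an indefinite Hessian (one negative eigenvalue),
# where the cubic term keeps the subproblem bounded below:
g = np.array([1.0, -2.0])
B = np.array([[2.0, 0.0], [0.0, -1.0]])
print(cubic_step(g, B, sigma=1.0))
```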

Accelerating Adaptive Cubic Regularization of Newton's Method via Random Sampling [article]

Xi Chen, Bo Jiang, Tianyi Lin, Shuzhong Zhang
2022 arXiv   pre-print
In particular, we propose to compute an approximated Hessian matrix by either uniformly or non-uniformly sub-sampling the components of the objective.  ...  As well known, the crux in cubic regularization is its utilization of the Hessian information, which may be computationally expensive for large-scale problems.  ...  In terms of cubic regularized Newton's method for non-convex optimization, the adaptive regularization algorithms with inexact evaluation for both function and derivatives are considered in Bellavia et  ... 
arXiv:1802.05426v3 fatcat:yugw4akmwnbrtolwpcbtotvqmm
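The uniform/non-uniform sub-sampling mentioned in this abstract can be sketched in a few lines. This is a generic importance-sampled Hessian estimator for a finite-sum objective $f = \frac{1}{n}\sum_i f_i$, not the paper's actual code; `hess_i`, `probs`, and the signature are assumptions for illustration:

```python
import numpy as np

def subsampled_hessian(hess_i, n, x, batch, probs=None, rng=None):
    """Unbiased estimate of (1/n) * sum_i hess_i(i, x) from `batch`
    randomly drawn components.

    probs=None  -> uniform sampling.
    probs given -> component i is drawn w.p. probs[i] and reweighted
                   by 1/(n*probs[i]) (importance sampling), which
                   keeps the estimator unbiased.
    """
    rng = np.random.default_rng() if rng is None else rng
    idx = rng.choice(n, size=batch, p=probs)
    H = np.zeros((len(x), len(x)))
    for i in idx:
        w = 1.0 if probs is None else 1.0 / (n * probs[i])
        H += w * hess_i(i, x)
    return H / batch

# Toy usage: quadratic components f_i(x) = 0.5 * a_i * ||x||^2.
a = np.linspace(0.5, 1.5, 100)
H = subsampled_hessian(lambda i, x: a[i] * np.eye(2), 100,
                       np.zeros(2), batch=20)
print(H)  # close to mean(a) * I
```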

SingCubic: Cyclic Incremental Newton-type Gradient Descent with Cubic Regularization for Non-Convex Optimization [article]

Ziqiang Shi
2020 arXiv   pre-print
In this work we propose a cyclic incremental Newton-type gradient descent with cubic regularization (SingCubic) method for optimizing non-convex functions.  ...  The results and techniques can serve as a starting point for research on incremental Newton-type gradient descent methods that employ cubic regularization.  ...  Conclusions: This paper introduces a novel cyclic incremental Newton-type gradient descent with cubic regularization method called SingCubic for minimizing non-convex finite sums.  ... 
arXiv:2002.06848v1 fatcat:dspflouj35ct5lx4p2lvbygd4e

Stochastic Variance-Reduced Cubic Regularization for Nonconvex Optimization [article]

Zhe Wang, Yi Zhou, Yingbin Liang, Guanghui Lan
2018 arXiv   pre-print
Cubic regularization (CR) is an optimization method with emerging popularity due to its capability to escape saddle points and converge to second-order stationary solutions for nonconvex optimization.  ...  In this paper, we propose a stochastic variance-reduced cubic-regularization (SVRC) method under random sampling, and study its convergence guarantee as well as sample complexity.  ...  Sub-sampled cubic regularization for non-convex optimization. In Proc. 34th International Conference on Machine Learning (ICML), volume 70, pages 1895-1904. Lan, G. (2012).  ... 
arXiv:1802.07372v2 fatcat:ftxphrtjirdoflxqfnr5v2q45a
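"Variance-reduced" here refers to SVRG-style control variates built around a periodically refreshed snapshot point $\tilde{x}$; a generic form of such estimators (illustrative, not necessarily the exact estimators analyzed in this paper) is:

```latex
% Mini-batches S, S'; snapshot \tilde{x} with full derivatives cached:
v_k = \nabla f_S(x_k) - \nabla f_S(\tilde{x}) + \nabla f(\tilde{x}),
\qquad
U_k = \nabla^2 f_{S'}(x_k) - \nabla^2 f_{S'}(\tilde{x}) + \nabla^2 f(\tilde{x}),
```

with $v_k$ and $U_k$ then taking the place of the gradient and Hessian in the cubic subproblem.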

Cubic Regularization with Momentum for Nonconvex Optimization [article]

Zhe Wang, Yi Zhou, Yingbin Liang, Guanghui Lan
2019 arXiv   pre-print
Our numerical experiments on various nonconvex optimization problems demonstrate that the momentum scheme can substantially facilitate the convergence of cubic regularization, and perform even better than  ...  However, such a successful acceleration technique has not yet been proposed for second-order algorithms in nonconvex optimization. In this paper, we apply the momentum scheme to cubic regularized (CR) Newton's  ...  Sub-sampled cubic regularization for non-convex optimization. In Proc. 34th International Conference on Machine Learning (ICML), volume 70, pages 1895-1904. Li, D.  ... 
arXiv:1810.03763v2 fatcat:cyw6ycinu5bblnxel4ytqxc4om
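One natural way to graft a momentum (heavy-ball/extrapolation) step onto cubic regularization, shown only as a schematic consistent with the abstract rather than the paper's exact update rule:

```latex
y_k = x_k + \beta_k\,(x_k - x_{k-1}),
\qquad
x_{k+1} = y_k + \arg\min_{s}\; \nabla f(y_k)^\top s
          + \tfrac{1}{2}\, s^\top \nabla^2 f(y_k)\, s
          + \tfrac{\sigma}{3}\,\lVert s \rVert^3 .
```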

Newton-Type Methods for Non-Convex Optimization Under Inexact Hessian Information [article]

Peng Xu, Fred Roosta, Michael W. Mahoney
2019 arXiv   pre-print
We consider variants of trust-region and cubic regularization methods for non-convex optimization, in which the Hessian matrix is approximated.  ...  iteration complexity for the corresponding sub-sampled trust-region and cubic regularization methods.  ...  We then give optimal iteration complexities for Algorithms 1 and 2 for optimization of non-convex finite-sum problems where the Hessian is approximated by means of appropriate sub-sampling (Theorems 4,  ... 
arXiv:1708.07164v4 fatcat:bu7qgd44yzexpkjtwo4u5v237u
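Analyses of this type typically do not need the exact Hessian, only an approximation $B_k$ that tracks it to within a tolerance tied to the target accuracy; a commonly used sufficient condition (notation generic, not quoted from the paper) is:

```latex
\lVert B_k - \nabla^2 f(x_k) \rVert \;\le\; \epsilon
\quad \text{for all iterations } k,
```

which sub-sampling can guarantee with high probability once the sample size is large enough.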

Second-Order Optimization for Non-Convex Machine Learning: An Empirical Study [article]

Peng Xu, Farbod Roosta-Khorasani, Michael W. Mahoney
2018 arXiv   pre-print
In this paper, we report detailed empirical evaluations of a class of Newton-type methods, namely sub-sampled variants of trust region (TR) and adaptive regularization with cubics (ARC) algorithms, for non-convex ML problems.  ...  Gould, and Coralia Cartis for kindly helping us with the code for adaptive cubic regularization as well as setting up the GALAHAD package. We also greatly appreciate Dr.  ... 
arXiv:1708.07827v2 fatcat:mrug5vs5x5ac7e7bbq7fthtexe

Finding Local Minimax Points via (Stochastic) Cubic-Regularized GDA: Global Convergence and Complexity [article]

Ziyi Chen, Qunwei Li, Yi Zhou
2022 arXiv   pre-print
Moreover, we propose a stochastic variant of Cubic-GDA for large-scale minimax optimization, and characterize its sample complexity under stochastic sub-sampling.  ...  Then, inspired by the classic cubic regularization algorithm, we propose Cubic-GDA--a cubic-regularized GDA algorithm for finding local minimax points, and provide a comprehensive convergence analysis  ...  Sub-Sampled Stochastic Cubic-GDA In this section, we apply stochastic sub-sampling to further improve the performance and complexity of Cubic-GDA in large-scale nonconvex minimax optimization with big  ... 
arXiv:2110.07098v3 fatcat:qhmfrqh5ijgu3bpoh4kqeom53i
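Reading the abstract, Cubic-GDA appears to pair a cubic-regularized update in the minimization variable with a gradient-ascent update in the maximization variable. The following is a schematic reconstruction from that description, not the paper's verified algorithm:

```latex
x_{k+1} = x_k + \arg\min_{s}\; \nabla_x f(x_k, y_k)^\top s
          + \tfrac{1}{2}\, s^\top H_k\, s + \tfrac{\sigma}{3}\,\lVert s \rVert^3,
\qquad
y_{k+1} = y_k + \eta\, \nabla_y f(x_k, y_k),
```

where $H_k$ is a (possibly sub-sampled) second-order term in $x$.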

3D Sparse SAR Image Reconstruction Based on Cauchy Penalty and Convex Optimization

Yangyang Wang, Zhiming He, Fan Yang, Qiangqiang Zeng, Xu Zhan
2022 Remote Sensing  
At the same time, the objective function maintains convexity via the convex non-convex (CNC) strategy.  ...  The Cauchy penalty is a non-convex penalty function, which can estimate the target intensity more accurately than the L1 penalty.  ...  Acknowledgments: The authors would like to thank all reviewers and editors for their comments on this paper. Conflicts of Interest: The authors declare no conflict of interest.  ... 
doi:10.3390/rs14102308 fatcat:7wrgknxbvraqzh3zpk5yfywiaa
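For reference, one standard form of the Cauchy penalty next to the L1 penalty it is compared against; $\gamma$ is a scale parameter and the exact normalization varies across papers, so treat this as a generic form rather than the one used in this article:

```python
import numpy as np

def cauchy_penalty(x, gamma=1.0):
    """Cauchy penalty: non-convex and heavy-tailed, so it penalizes
    large coefficients far less than L1, reducing amplitude bias."""
    return np.sum(np.log(1.0 + (x / gamma) ** 2))

def l1_penalty(x):
    """Convex L1 baseline the Cauchy penalty is compared against."""
    return np.sum(np.abs(x))
```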

On Noisy Negative Curvature Descent: Competing with Gradient Descent for Faster Non-convex Optimization [article]

Mingrui Liu, Tianbao Yang
2017 arXiv   pre-print
Moreover, we develop a stochastic algorithm for a finite or infinite sum non-convex optimization problem.  ...  In this paper, we propose to further reduce the number of Hessian-vector products for faster non-convex optimization.  ...  Moreover, we also develop a stochastic algorithm for a stochastic non-convex optimization problem, which only involves computing a sub-sampled gradient and a noisy negative curvature of a sub-sampled Hessian  ... 
arXiv:1709.08571v2 fatcat:ndzw4nwsibaapkenhcdp7ozktm
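The negative-curvature ingredient can be computed with only Hessian-vector products: power iteration on a shifted Hessian pulls out an approximate most-negative-curvature direction. A toy version under assumed names (`hvp`, `shift`), not the paper's algorithm:

```python
import numpy as np

def negative_curvature_direction(hvp, d, shift, iters=50, rng=None):
    """Approximate the eigenvector of H with the most negative
    eigenvalue via power iteration on (shift*I - H).

    hvp(v) -> H @ v  (Hessian-vector product, no explicit Hessian).
    shift  -> any upper bound on ||H||_2; the top eigenvector of
              shift*I - H is then the bottom eigenvector of H.
    """
    rng = np.random.default_rng() if rng is None else rng
    v = rng.standard_normal(d)
    v /= np.linalg.norm(v)
    for _ in range(iters):
        v = shift * v - hvp(v)   # apply (shift*I - H) to v
        v /= np.linalg.norm(v)
    curvature = v @ hvp(v)       # Rayleigh quotient v^T H v
    return v, curvature

# Toy usage on an explicit indefinite matrix:
H = np.diag([2.0, 0.5, -1.0])
v, lam = negative_curvature_direction(lambda u: H @ u, d=3, shift=3.0)
print(lam)  # ~ -1.0
```

If the returned curvature is negative, stepping along $\pm v$ decreases the function locally even where the gradient is small.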

Inexact Non-Convex Newton-Type Methods [article]

Zhewei Yao, Peng Xu, Farbod Roosta-Khorasani, Michael W. Mahoney
2018 arXiv   pre-print
For solving large-scale non-convex problems, we propose inexact variants of trust region and adaptive cubic regularization methods, which, to increase efficiency, incorporate various approximations.  ...  In the context of finite-sum problems, we then explore randomized sub-sampling methods as ways to construct the gradient and Hessian approximations and examine the empirical performance of our algorithms  ...  gratefully acknowledges the support of DARPA, the Australian Research Council through a Discovery Early Career Researcher Award (DE180100923) and the Australian Research Council Centre of Excellence for  ... 
arXiv:1802.06925v1 fatcat:4usxu5gdmbg2dcpagkc4dqjwji

Convergence of Cubic Regularization for Nonconvex Optimization under KL Property [article]

Yi Zhou and Zhe Wang and Yingbin Liang
2018 arXiv   pre-print
Cubic-regularized Newton's method (CR) is a popular algorithm that is guaranteed to produce a second-order stationary solution for solving nonconvex optimization problems.  ...  Specifically, we characterize the asymptotic convergence rate of various types of optimality measures for CR, including the function value gap, variable distance gap, gradient norm, and least eigenvalue of the  ...  Sub-sampled cubic regularization for non-convex optimization. In Proc. 34th International Conference on Machine Learning (ICML), volume 70, pages 1895-1904.  ... 
arXiv:1808.07382v1 fatcat:7eutnjrplfckzbt5na2satbbbe
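For readers unfamiliar with it, the Kurdyka-Łojasiewicz (KL) property invoked here states that $f$ can be desingularized near a critical point $x^*$; with the common choice of desingularizer $\varphi(s) = c\,s^{1-\theta}$, $\theta \in [0, 1)$, it reads:

```latex
\varphi'\!\big(f(x) - f(x^*)\big)\,\lVert \nabla f(x) \rVert \;\ge\; 1
\qquad \text{for all } x \text{ near } x^* \text{ with } f(x^*) < f(x) < f(x^*) + \epsilon,
```

and the exponent $\theta$ is what determines the different convergence-rate regimes the abstract refers to.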

Regularized Interpolation for Noisy Images

S. Ramani, P. Thevenaz, M. Unser
2010 IEEE Transactions on Medical Imaging  
We give algorithms to recover the continuously-defined model from noisy samples and also provide a data-driven scheme to determine the optimal amount of regularization.  ...  We validate our method with numerical examples where we demonstrate its superiority over an exact fit as well as the benefit of TV-like non-quadratic regularization over Tikhonov-like quadratic regularization  ...  We thank the reviewers for pointing out enhancements on theoretical and practical aspects which have improved the quality and the presentation of the paper.  ... 
doi:10.1109/tmi.2009.2038576 pmid:20129854 fatcat:n2mup6tg6bdpra5cyg4x5nzqji
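The generic objective behind this family of methods is a penalized least-squares fit of a continuously defined model $s$ to the noisy samples $y_i$; the notation below is generic, with the paper's contribution being the data-driven choice of $\lambda$:

```latex
\hat{s} = \arg\min_{s}\; \sum_{i} \big( y_i - s(x_i) \big)^2 + \lambda\, R(s),
```

where $R$ is Tikhonov-like (quadratic) or TV-like (non-quadratic), matching the comparison described in the snippet.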

Inexact Proximal Cubic Regularized Newton Methods for Convex Optimization [article]

Chaobing Song, Ji Liu, Yong Jiang
2019 arXiv   pre-print
In this paper, we use Proximal Cubic regularized Newton Methods (PCNM) to optimize the sum of a smooth convex function and a non-smooth convex function, where we use an inexact gradient and Hessian, and an inexact subsolver for the cubic regularized second-order subproblem.  ...  Sub-sampled cubic regularization for non-convex optimization. arXiv preprint arXiv:1705.05933, 2017. [Nes98] Yurii Nesterov. Introductory lectures on convex programming volume i: Basic course.  ... 
arXiv:1902.02388v2 fatcat:bvgp77kydnhufhrkfortlaspxa
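The proximal variant simply adds the non-smooth convex term $h$ to the cubic model, so each (possibly inexactly solved) subproblem has the form below; the constant convention ($M/6$ vs. $\sigma/3$) varies across papers, and $g_k$, $H_k$ stand for the inexact gradient and Hessian:

```latex
x_{k+1} \approx \arg\min_{y}\; g_k^\top (y - x_k)
  + \tfrac{1}{2}\,(y - x_k)^\top H_k\,(y - x_k)
  + \tfrac{M}{6}\,\lVert y - x_k \rVert^3 + h(y).
```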
Showing results 1 — 15 out of 5,281 results