Fast large-scale optimization by unifying stochastic gradient and quasi-Newton methods
[article]
2014
arXiv
pre-print
We present an algorithm for minimizing a sum of functions that combines the computational efficiency of stochastic gradient descent (SGD) with the second order curvature information leveraged by quasi-Newton ...
, low dimensional subspace. ...
Sum-of-Functions-Optimizer. ...
arXiv:1311.2115v7
fatcat:z535budljvakbavnfqlimhzshu
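The entry above describes combining SGD's per-step cost with quasi-Newton curvature information. The sketch below is a generic, heavily simplified illustration of that idea, not the paper's Sum-of-Functions Optimizer (which maintains per-function quadratic models in a shared low-dimensional subspace): it preconditions noisy minibatch gradients with an L-BFGS two-loop recursion built from recent iterate and gradient differences. The function minibatch_grad, the learning rate, and the curvature filter are all assumptions made for illustration.

import numpy as np

def lbfgs_direction(g, s_hist, y_hist):
    # Two-loop recursion: approximate (inverse Hessian) @ g from stored (s, y) pairs.
    q = g.copy()
    rhos = [1.0 / (y @ s) for s, y in zip(s_hist, y_hist)]
    alphas = []
    for s, y, rho in reversed(list(zip(s_hist, y_hist, rhos))):
        a = rho * (s @ q)
        alphas.append(a)
        q -= a * y
    if s_hist:
        s, y = s_hist[-1], y_hist[-1]
        q *= (s @ y) / (y @ y)          # standard initial scaling of H0
    for (s, y, rho), a in zip(zip(s_hist, y_hist, rhos), reversed(alphas)):
        b = rho * (y @ q)
        q += (a - b) * s
    return q

def stochastic_qn(minibatch_grad, x0, lr=0.1, memory=10, n_steps=500):
    # minibatch_grad(x) is assumed to return a noisy gradient estimate at x.
    x = np.asarray(x0, dtype=float)
    s_hist, y_hist = [], []
    x_prev = g_prev = None
    for _ in range(n_steps):
        g = minibatch_grad(x)
        if g_prev is not None:
            s, y = x - x_prev, g - g_prev
            if s @ y > 1e-10:            # keep only curvature-positive pairs
                s_hist.append(s); y_hist.append(y)
                if len(s_hist) > memory:
                    s_hist.pop(0); y_hist.pop(0)
        d = lbfgs_direction(g, s_hist, y_hist)
        x_prev, g_prev = x, g
        x = x - lr * d
    return x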
Numerical integration in statistical decision-theoretic methods for robust design optimization
2008
Structural And Multidisciplinary Optimization
It concludes that straightforward application of standard off-the-shelf optimization software to robust design is prohibitively expensive, necessitating adaptive strategies and the use of surrogates. ...
Using a prototypical robust design optimization problem, this study explores the computational cost of multidimensional integration (computing expectation) and its interplay with optimization algorithms ...
integrator based on a product rule of one dimensional adaptive Newton-Cotes integrators. ...
doi:10.1007/s00158-007-0189-0
fatcat:qhmacxajhrgs7dsy5bcszyn7fe
Page 3852 of Mathematical Reviews Vol. , Issue 96f
[page]
1996
Mathematical Reviews
96f:90100 90C30 65K05
Lin, Meng Xiong (PRC-ASBJ-C; Beijing) ;
Shou, Nan Qi (PRC-FZHN; Fuzhou)
A diagonal quasi-Newton algorithm for unconstrained optimization. (Chinese. English summary)
J. ...
Theory Appl. 84 (1995), no. 1, 145-169; MR 95i:49022], in which stability for parametric Stackelberg problems, with generic con- ...
Node Localization of Wireless Sensor Networks Based on Hybrid Bat-Quasi-Newton Algorithm
2015
International Journal of Online Engineering (iJOE)
To address the low positioning accuracy of the least-squares method in the third stage of the DV-Hop algorithm, a localization algorithm was proposed that fuses a hybrid bat algorithm with a quasi-Newton ...
First, the Bat Algorithm (BA) was improved in two respects: the random vector β was adjusted adaptively according to each bat's fitness, so that the pulse frequency gained adaptive ability. ...
ACKNOWLEDGMENT The authors thank the support from the Key Laboratory of Advanced Process Control for Light Industry (Ministry of Education), the Fundamental Research Funds for the Central Universities. ...
doi:10.3991/ijoe.v11i6.5110
fatcat:5oqqiakmizaljmrimh7jxfroeu
ARTIFICIAL NEURAL NETWORK: A REVIEW
2020
International Journal of Technical Research & Science
Also, optimization methods like Gradient Descent, Newton Method, Conjugate Gradient Method, Quasi Newton and Levenberg Marquardt are presented. ...
In this paper, an introduction to Artificial Neural Networks is presented. Learning Algorithms like Supervised Algorithms, Reinforcement Algorithms and Unsupervised Algorithms are discussed. ...
It helps the algorithm converge. It is faster than the gradient descent method [14]. It can be used for large NNs.
Quasi Newton: Quasi Newton is one of the best methods for large NNs. ...
doi:10.30780/specialissue-icaccg2020/007
fatcat:nvc6v53pivax7lpqrfkhupu2ju
A fast procedure for calculating importance weights in bootstrap sampling
2011
Computational Statistics & Data Analysis
For instance, interior point and adaptive barrier methods must cope with forming, storing, and inverting the Hessian of the objective function. ...
Importance sampling is an efficient strategy for reducing the variance of certain bootstrap estimates. ...
The LBFGS algorithm depends as well on the number q of secant conditions selected ... (Algorithm 3: Quasi-Newton Acceleration of an Algorithm Map F for Minimizing the Objective Function O; refer to Table 2.)
doi:10.1016/j.csda.2010.04.019
pmid:21076658
pmcid:PMC2976546
fatcat:uukfqvuk5rc2tnn3xwbryj6gze
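The snippet above notes that L-BFGS performance depends on the number q of secant conditions retained. In SciPy's L-BFGS-B implementation this memory parameter is exposed as the maxcor option; the short sketch below varies it on the standard Rosenbrock test function (the objective is illustrative, not the paper's bootstrap problem).

import numpy as np
from scipy.optimize import minimize, rosen, rosen_der

x0 = np.full(50, -1.0)
for q in (3, 10, 30):   # q = number of stored secant pairs ('maxcor' in SciPy)
    res = minimize(rosen, x0, jac=rosen_der, method="L-BFGS-B",
                   options={"maxcor": q})
    print(f"maxcor={q}: {res.nit} iterations, f={res.fun:.3e}")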
Ancient numerical daemons of conceptual hydrological modeling: 2. Impact of time stepping schemes on model analysis and prediction
2010
Water Resources Research
continuity to permit parameter optimization using efficient quasi Newton methods. ...
When implemented within a multistart framework, modern Newton-type optimizers are robust even when started far from the optima and provide valuable diagnostic insights not directly available from evolutionary ...
In general, an N_d-dimensional function cannot be uniquely summarized solely by analysis of its lower-dimensional subspace or marginals. ...
doi:10.1029/2009wr008896
fatcat:bma2dds66zggngv6yfz7tbmyhy
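The entry above mentions running Newton-type optimizers within a multistart framework. Below is a minimal sketch of that pattern, assuming a generic objective and box bounds for sampling starting points; the hydrological model itself is not reproduced, and the test function is a stand-in.

import numpy as np
from scipy.optimize import minimize

def multistart_bfgs(f, bounds, n_starts=20, seed=0):
    # Run a quasi-Newton (BFGS) local search from several random starting
    # points inside `bounds` and keep the best local optimum found.
    rng = np.random.default_rng(seed)
    lows = np.array([b[0] for b in bounds])
    highs = np.array([b[1] for b in bounds])
    best = None
    for _ in range(n_starts):
        res = minimize(f, rng.uniform(lows, highs), method="BFGS")
        if best is None or res.fun < best.fun:
            best = res
    return best

# Example on a multimodal test function with many local minima.
rastrigin = lambda x: 10 * len(x) + sum(xi ** 2 - 10 * np.cos(2 * np.pi * xi) for xi in x)
print(multistart_bfgs(rastrigin, [(-5.12, 5.12)] * 2).x)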
Softened Approximate Policy Iteration for Markov Games
2016
International Conference on Machine Learning
This paper reports theoretical and empirical investigations on the use of quasi-Newton methods to minimize the Optimal Bellman Residual (OBR) of zero-sum two-player Markov Games. ...
Consequently, new algorithms are proposed that use quasi-Newton methods to minimize the OBR and the POBR, so as to benefit from enhanced empirical performance at low cost. ...
Let us first describe Newton's and quasi-Newton methods. Newton's method is an optimization technique aiming at minimizing a function f : R^n → R. ...
dblp:conf/icml/PerolatPGSP16
fatcat:g7r73yb4ffg7pgxh3nykvkozxe
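The snippet above introduces Newton's method for minimizing f : R^n → R. As a hedged illustration of that classical iteration only (not the paper's application to the Bellman residual), here is a plain Newton loop, x_{k+1} = x_k - H(x_k)^{-1} g(x_k), implemented via a linear solve; the example function and tolerances are arbitrary choices.

import numpy as np

def newton_minimize(grad, hess, x0, tol=1e-8, max_iter=50):
    # Plain Newton iteration: repeatedly solve H(x) p = -g(x) and step to x + p.
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        x = x + np.linalg.solve(hess(x), -g)
    return x

# Example: f(x) = x0**4 + x1**2, with minimizer at the origin.
grad = lambda x: np.array([4 * x[0] ** 3, 2 * x[1]])
hess = lambda x: np.array([[12 * x[0] ** 2, 0.0], [0.0, 2.0]])
print(newton_minimize(grad, hess, [1.5, -2.0]))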
Nys-Newton: Nyström-Approximated Curvature for Stochastic Optimization
[article]
2022
arXiv
pre-print
with that of state-of-the-art first-order and stochastic quasi-Newton methods. ...
In this study, we propose an approximate Newton sketch-based stochastic optimization algorithm for large-scale empirical risk minimization. ...
Owing to their low sensitivity to the choice of hyperparameters, stochastic quasi-Newton methods have gained popularity in recent years. ...
arXiv:2110.08577v2
fatcat:raamknccpjgira3z2pepfcbcti
A benchmark study on the efficiency of unconstrained optimization algorithms in 2D-aerodynamic shape design
2017
Cogent Engineering
In this work, we benchmark several unconstrained optimization algorithms (Nelder-Mead, Quasi-Newton, steepest descent) under variation of gradient estimation schemes (adjoint equations, finite differences ...
makes it a perfect choice for a low number of design variables. ...
Although it is used in many cases, the method of weighted sums is only a proper way to condense a small number of design goals, no more than 2 or 3, into a single goal function. ...
doi:10.1080/23311916.2017.1354509
fatcat:i52i7frahjhr7pyhhb4dm5dysq
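The benchmark above varies the gradient estimation scheme (adjoint equations versus finite differences). As a hedged sketch of the finite-difference branch only, the code below builds a forward finite-difference gradient and feeds it to a quasi-Newton (BFGS) optimizer; a cheap analytic objective stands in for the expensive aerodynamic cost function, and adjoint gradients are not reproduced since they depend on the flow solver.

import numpy as np
from scipy.optimize import minimize

def fd_gradient(f, x, h=1e-6):
    # Forward finite-difference estimate of the gradient of f at x.
    x = np.asarray(x, dtype=float)
    f0 = f(x)
    g = np.zeros_like(x)
    for i in range(x.size):
        xp = x.copy()
        xp[i] += h
        g[i] = (f(xp) - f0) / h
    return g

# Illustrative stand-in for the aerodynamic objective (an assumption).
objective = lambda x: (x[0] - 1.0) ** 2 + 10.0 * (x[1] + 0.5) ** 2
res = minimize(objective, x0=[0.0, 0.0],
               jac=lambda x: fd_gradient(objective, x), method="BFGS")
print(res.x)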
Apollo: An Adaptive Parameter-wise Diagonal Quasi-Newton Method for Nonconvex Stochastic Optimization
[article]
2021
arXiv
pre-print
In this paper, we introduce Apollo, a quasi-Newton method for nonconvex stochastic optimization, which dynamically incorporates the curvature of the loss function by approximating the Hessian via a diagonal ...
Importantly, updating and storing the diagonal approximation of the Hessian is as efficient as adaptive first-order optimization methods, with linear complexity in both time and memory. ...
Appendix: Apollo: An Adaptive Parameter-wise Diagonal Quasi-Newton Method for Nonconvex Stochastic Optimization
Appendix A. ...
arXiv:2009.13586v6
fatcat:amo5fj3uingldbsnvr5ubl6dpq
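To illustrate why a diagonal Hessian approximation keeps the cost linear in the number of parameters, here is a generic element-wise secant sketch. This is not Apollo's actual rectified update rule; the step size, clipping bounds, and curvature estimate are assumptions made purely for illustration.

import numpy as np

def diagonal_qn_step(x, x_prev, g, g_prev, lr=0.5, eps=1e-8, d_max=1e4):
    # One step with a diagonal curvature estimate: each coordinate gets
    # curvature roughly |y_i| / |s_i| from the last secant pair, clipped to
    # stay positive and bounded, so time and memory remain O(n).
    s = x - x_prev                                   # parameter differences
    y = g - g_prev                                   # gradient differences
    d = np.clip(np.abs(y) / (np.abs(s) + eps), eps, d_max)
    return x - lr * g / d                            # coordinate-wise preconditioned step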
Optimal substation location and energy distribution network design using a hybrid GA-BFGS algorithm
2005
IEE Proceedings - Generation Transmission and Distribution
A hybrid algorithm combines a quasi-Newton algorithm and a genetic algorithm for the optimal expansion of energy distribution systems. ...
The quasi-Newton procedure handles the search for optimal substation location co-ordinates, and the genetic algorithm procedure handles the design of optimal network topologies. ...
The BFGS quasi-Newton algorithm searches for the optimal substation location. The one-dimensional search, within the quasi-Newton algorithm, considers the network topology fixed. ...
doi:10.1049/ip-gtd:20050036
fatcat:2t7loil4trgq3jps33fanrrstm
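The entry above pairs a discrete search over network topologies with a continuous BFGS search over substation coordinates. The sketch below shows only that general nesting pattern: topologies and cost(topology, coords) are hypothetical placeholders, and random sampling stands in for the paper's genetic algorithm.

import numpy as np
from scipy.optimize import minimize

def hybrid_search(topologies, cost, x0, n_outer=50, seed=0):
    # Outer discrete search over candidate network topologies with an inner
    # BFGS refinement of the continuous substation coordinates (topology held
    # fixed during the inner search, as in the snippet above).
    rng = np.random.default_rng(seed)
    best = (np.inf, None, None)
    for _ in range(n_outer):
        t = topologies[rng.integers(len(topologies))]
        res = minimize(lambda x: cost(t, x), x0, method="BFGS")
        if res.fun < best[0]:
            best = (res.fun, t, res.x)
    return best   # (best cost, best topology, best coordinates)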
Adaptive eigenspace method for inverse scattering problems in the frequency domain
2017
Inverse Problems
Then, standard Newton or quasi-Newton methods from nonlinear optimization can be applied [25, 17] . ...
1, 22] , with an adaptive eigenspace representation of u for regularization. ...
The time-harmonic inverse medium problem is formulated as a PDE-constrained optimization problem and solved by an inexact truncated Newton or quasi-Newton iteration. ...
doi:10.1088/1361-6420/aa5250
fatcat:de2oscklmvefzm5b4523oo2fya
A Survey of Optimization Methods from a Machine Learning Perspective
[article]
2019
arXiv
pre-print
Optimization, as an important part of machine learning, has attracted much attention from researchers. ...
A systematic retrospective and summary of optimization methods from the machine learning perspective is of great significance, as it can offer guidance for the development of both optimization and ...
Quasi-Newton Method [93] The quasi-Newton method uses an approximating matrix in place of the Hessian matrix or its inverse. Popular quasi-Newton methods include DFP, BFGS and LBFGS. ...
arXiv:1906.06821v2
fatcat:rcaas4ccpbdffhuvzcg2oryxr4
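For reference, the description above corresponds to the standard textbook secant updates (not formulas specific to this survey). With s_k = x_{k+1} - x_k and y_k = \nabla f(x_{k+1}) - \nabla f(x_k), the BFGS updates of the Hessian approximation B_k and of the inverse approximation H_k are

B_{k+1} = B_k - \frac{B_k s_k s_k^\top B_k}{s_k^\top B_k s_k} + \frac{y_k y_k^\top}{y_k^\top s_k},
\qquad
H_{k+1} = (I - \rho_k s_k y_k^\top)\, H_k\, (I - \rho_k y_k s_k^\top) + \rho_k s_k s_k^\top,
\qquad \rho_k = \frac{1}{y_k^\top s_k}.

Both enforce the secant condition B_{k+1} s_k = y_k; DFP is obtained by exchanging the roles of s_k and y_k, and LBFGS stores only the most recent (s_k, y_k) pairs instead of a full matrix.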
A Nested Partitioning Procedure for Numerical Multiple Integration
1981
ACM Transactions on Mathematical Software
A typical iteration of an unconstrained quasi-Newton method begins with the current iterate, x; the gradient vector of f, g; and an approximation to the Hessian, the matrix B. ...
Each function f_i(x_i, x_{i+1}) is represented by an adaptive piecewise constant approximation on the plane. ...
doi:10.1145/355934.355939
fatcat:tpxxa7zacna5lacxtzdqnasaxy
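The snippet above describes a typical quasi-Newton iteration starting from the current iterate x, the gradient g, and the Hessian approximation B. Below is a minimal sketch of that iteration: solve B p = -g, backtrack to a step that decreases f, then update B from the secant pair. The Armijo constant, tolerances, and the Rosenbrock test function are illustrative assumptions, not taken from the paper.

import numpy as np

def quasi_newton(f, grad, x0, max_iter=100, tol=1e-8):
    # Minimal BFGS-style quasi-Newton loop built around the iteration above.
    x = np.asarray(x0, dtype=float)
    B = np.eye(x.size)
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        p = np.linalg.solve(B, -g)                # quasi-Newton direction
        t, fx = 1.0, f(x)                         # simple backtracking line search
        while f(x + t * p) > fx + 1e-4 * t * (g @ p) and t > 1e-12:
            t *= 0.5
        s = t * p
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        if y @ s > 1e-10:                         # keep B positive definite
            Bs = B @ s
            B = B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (y @ s)
        x, g = x_new, g_new
    return x

# Example: the Rosenbrock function, whose minimizer is (1, 1).
f = lambda x: 100 * (x[1] - x[0] ** 2) ** 2 + (1 - x[0]) ** 2
grad = lambda x: np.array([-400 * x[0] * (x[1] - x[0] ** 2) - 2 * (1 - x[0]),
                           200 * (x[1] - x[0] ** 2)])
print(quasi_newton(f, grad, [-1.2, 1.0]))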
Showing results 1 — 15 out of 9,448 results