12 Hits in 9.1 sec

Global convergence of a modified Broyden family method for nonconvex functions

Gonglin Yuan, Zhan Wang, Pengyuan Li
2021 Journal of Industrial and Management Optimization  
line search is obtained for general functions.  ...  However, the study of the global convergence of the Broyden family method is not sufficient. In this paper, a new Broyden family method is proposed based on the BFGS formula of Yuan and Wei (Comput.  ...  The authors would like to thank the referees for their good and valuable suggestions, which greatly improved this paper.  ...
doi:10.3934/jimo.2021164 fatcat:qdwnfo5yszccxn2xp4ajfx7say
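The entry above concerns a modified Broyden family method. As background only, the standard one-parameter Broyden family update in the usual textbook notation (not the paper's modified formula) can be written as:

```latex
% General Broyden family update of the Hessian approximation B_k
% (standard textbook form; \phi_k = 0 gives BFGS, \phi_k = 1 gives DFP):
\[
B_{k+1} = B_k - \frac{B_k s_k s_k^{\top} B_k}{s_k^{\top} B_k s_k}
        + \frac{y_k y_k^{\top}}{y_k^{\top} s_k}
        + \phi_k \, (s_k^{\top} B_k s_k)\, v_k v_k^{\top},
\qquad
v_k = \frac{y_k}{y_k^{\top} s_k} - \frac{B_k s_k}{s_k^{\top} B_k s_k},
\]
% where s_k = x_{k+1} - x_k and y_k = \nabla f(x_{k+1}) - \nabla f(x_k).
```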

MODIFIED LIMITED MEMORY BFGS METHOD WITH NONMONOTONE LINE SEARCH FOR UNCONSTRAINED OPTIMIZATION

Gonglin Yuan, Zengxin Wei, Yanlin Wu
2010 Journal of the Korean Mathematical Society  
The global convergence of the given methods will be established under suitable conditions.  ...  The global convergence for the convex objective function was established. Numerical results show that this method is more competitive than the normal BFGS method with monotone line search.  ...  The global convergence and the superlinear convergence of these two methods for nonconvex functions have been established under appropriate conditions (see [22, 23] for details).  ...
doi:10.4134/jkms.2010.47.4.767 fatcat:fxdw4pj5lnb67hk4jw2sjcc6hm
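The entry above modifies the limited-memory BFGS method. As a rough illustration of the machinery such methods build on (this is the standard two-loop recursion, not the paper's modified update; the array names are illustrative), a sketch is:

```python
import numpy as np

def lbfgs_direction(grad, s_list, y_list):
    """Standard L-BFGS two-loop recursion: returns an approximation of
    H_k @ grad, where H_k is the inverse Hessian approximation implied by
    the stored curvature pairs (s_i, y_i), newest last."""
    q = grad.copy()
    rhos = [1.0 / (y @ s) for s, y in zip(s_list, y_list)]
    alphas = []
    # First loop: newest pair to oldest.
    for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
        alpha = rho * (s @ q)
        q -= alpha * y
        alphas.append(alpha)
    # Initial scaling H_0 = gamma * I with gamma = s^T y / y^T y (newest pair).
    s, y = s_list[-1], y_list[-1]
    r = ((s @ y) / (y @ y)) * q
    # Second loop: oldest pair to newest.
    for s, y, rho, alpha in zip(s_list, y_list, rhos, reversed(alphas)):
        beta = rho * (y @ r)
        r += (alpha - beta) * s
    return r  # the search direction is typically -r
```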

Positive Definiteness of Symmetric Rank 1 (H-Version) Update for Unconstrained Optimization

2021 Baghdad Science Journal  
Several attempts have been made to modify the quasi-Newton condition in order to obtain rapid convergence with complete properties (symmetric and positive definite) of the inverse of the Hessian matrix (second  ...  The positive definite property of the inverse of the Hessian matrix is very important to guarantee the existence of the minimum point of the objective function and determine the minimum value of the objective  ...  problem for nonconvex functions based on a new modified weak Wolfe-Powell line search technique.  ...
doi:10.21123/bsj.2022.19.2.0297 fatcat:r4l6esl2zre2lob4kxkj6qzydy
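The entry above analyzes positive definiteness of the symmetric rank-one (H-version) update. For reference, the standard SR1 update of the inverse Hessian approximation, in common textbook notation rather than the paper's, is:

```latex
% Standard SR1 (H-version) update; well defined only when the
% denominator is nonzero, and positive definiteness of H_{k+1}
% is not guaranteed in general:
\[
H_{k+1} = H_k + \frac{(s_k - H_k y_k)(s_k - H_k y_k)^{\top}}
                     {(s_k - H_k y_k)^{\top} y_k},
\qquad s_k = x_{k+1} - x_k, \quad y_k = g_{k+1} - g_k.
\]
```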

Optimal power flow: a bibliographic survey I

Stephen Frank, Ingrida Steponavice, Steffen Rebennack
2012 Energy Systems, Springer Verlag  
Part II of the survey examines the recent trend towards stochastic, or non-deterministic, search techniques and hybrid methods for OPF.  ...  In this two-part survey, we survey both the classical and recent OPF literature in order to provide a sound context for the state of the art in OPF formulation and solution methods.  ...  Vanti and Gonzaga (2003) proposed a modified PDIPM which uses a merit function to enhance the convergence properties of earlier PDIPMs for OPF.  ... 
doi:10.1007/s12667-012-0056-y fatcat:icsiux73qvffrh3mxxnutpsg54

A Survey of Optimization Methods from a Machine Learning Perspective [article]

Shiliang Sun, Zehui Cao, Han Zhu, Jing Zhao
2019 arXiv   pre-print
A lot of work on solving optimization problems or improving optimization methods in machine learning has been proposed successively.  ...  The systematic retrospect and summary of the optimization methods from the perspective of machine learning are of great significance, which can offer guidance for both developments of optimization and  ...  the Wolfe conditions, which are a set of inequalities for inexact line searches on $\min_{\eta_t} f(\theta_t + \eta_t d_t)$ [132].  ...
arXiv:1906.06821v2 fatcat:rcaas4ccpbdffhuvzcg2oryxr4
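The snippet above cites the Wolfe conditions for the inexact line search on $\min_{\eta_t} f(\theta_t + \eta_t d_t)$. Spelled out in the same notation (the constants $c_1, c_2$ are the usual textbook ones, not taken from the survey), the weak Wolfe conditions are:

```latex
% Weak Wolfe conditions for a step size \eta_t along direction d_t,
% with constants 0 < c_1 < c_2 < 1:
\[
f(\theta_t + \eta_t d_t) \le f(\theta_t) + c_1 \eta_t \nabla f(\theta_t)^{\top} d_t
  \quad \text{(sufficient decrease)},
\]
\[
\nabla f(\theta_t + \eta_t d_t)^{\top} d_t \ge c_2 \nabla f(\theta_t)^{\top} d_t
  \quad \text{(curvature condition)}.
\]
```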

Hybrid Evolutionary Computation for Continuous Optimization [article]

Hassan A. Bashir, Richard S. Neville
2013 arXiv   pre-print
The report proposes enhancements in: i) the evolutionary algorithm; ii) a new convergence detection mechanism; and iii) the methods for evaluating the search directions and step sizes for  ...  Preliminary results justify that an adept hybridization of evolutionary algorithms with a suitable local search method could yield a robust and efficient means of solving a wide range of global optimization  ...  of global search methods on a particular parameter setting for each problem category.  ...
arXiv:1303.3469v1 fatcat:adao7rqvm5hs5ir5u2x3eyz4mi

Gao_columbia_0054D_16071.pdf [article]

2020
We provide simple proofs of convergence, including superlinear convergence for adaptive BFGS, allowing us to obtain superlinear convergence without line searches.  ...  This defines a family of methods, Block BFGS, that form a spectrum between the classical BFGS method and Newton's method, in terms of the amount of curvature information used.  ...  Under the assumption of Armijo-Wolfe line searches, Powell [41] proves the following global convergence theorem for BFGS.  ...
doi:10.7916/d8-b0jg-3972 fatcat:ff2mwru5y5bf5ld2ex3v75hrty
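The snippet above invokes Powell's global convergence theorem for BFGS under Armijo-Wolfe line searches. As a point of reference only (the classical update, not the thesis's Block BFGS or adaptive BFGS variants), a minimal sketch of the BFGS update of the inverse Hessian approximation is:

```python
import numpy as np

def bfgs_inverse_update(H, s, y):
    """Classical BFGS update of the inverse Hessian approximation H,
    with s = x_{k+1} - x_k and y = grad_{k+1} - grad_k. It requires
    s @ y > 0, which an Armijo-Wolfe line search guarantees."""
    rho = 1.0 / (y @ s)
    I = np.eye(len(s))
    V = I - rho * np.outer(s, y)
    return V @ H @ V.T + rho * np.outer(s, s)
```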

Abstracts of Working Papers in Economics

1994 Abstracts of Working Papers in Economics  
These also allow calculation of expected inflation. The estimation is performed for a period of five weeks including the date of sterling's exit from the ERM.  ...  They viewed the French Thermidor as a blueprint for Russia's post-revolutionary development.  ...  In the study, we estimate a reduced form of job search model using different specifications of the hazard function.  ... 
doi:10.1017/s0951007900006379 fatcat:qmqzliobt5f25onlm3id3yywaq

Nonlinear hyperelasticity-based mesh optimisation

Jordi Paul, Technische Universität Dortmund
2017
Different existing methods are presented, with the focus on a class of nonlinear mesh quality functionals that can guarantee the orientation preserving property.  ...  Because of the considerable numerical effort, a class of linear preconditioners is developed that helps to speed up the solution process.  ...  line search methods.  ... 
doi:10.17877/de290r-17940 fatcat:xufzctz4kjhinjtx6yzyjnjoqy

New probabilistic inference algorithms that harness the strengths of variational and Monte Carlo methods

Peter Carbonetto
2009
that achieves significant improvements in accuracy and reductions in variance over existing Monte Carlo and variational methods, and at a comparable computational expense, 2) for many instances of the  ...  The four main technical contributions of this thesis are: 1) a new framework for inference in probabilistic models based on stochastic approximation, variational methods and sequential Monte Carlo is proposed  ...  Since satisfaction of the Wolfe conditions cannot be guaranteed, I employ the damped updates proposed in a 1978 paper by Powell. The derivation of the damped BFGS update is straightforward.  ... 
doi:10.14288/1.0051537 fatcat:6dxbpekyznas3afs2iqhpdh6ae
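The snippet above employs the damped updates from Powell's 1978 paper. A minimal sketch of that damping rule, assuming the usual 0.2 threshold and applied to the direct Hessian approximation B (the thesis's exact implementation may differ), is:

```python
import numpy as np

def damped_bfgs_update(B, s, y):
    """Powell's damped BFGS update of the Hessian approximation B.
    When the curvature s @ y is too small (or negative), y is replaced by
    r = theta*y + (1 - theta)*B@s so that s @ r >= 0.2 * s @ B @ s,
    which keeps the updated matrix positive definite even when the
    Wolfe conditions cannot be guaranteed."""
    Bs = B @ s
    sBs = s @ Bs
    sy = s @ y
    theta = 1.0 if sy >= 0.2 * sBs else 0.8 * sBs / (sBs - sy)
    r = theta * y + (1.0 - theta) * Bs
    return B - np.outer(Bs, Bs) / sBs + np.outer(r, r) / (s @ r)
```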

Conference Program and Book of Abstracts: 13th International Conference on Stochastic Programming

Marida Bertocchi -Chair, Giorgio Consigli, Moriggia Vittorio, Ortobelli Sergio, Rosella Lozza, Giacometti, Teresa Maria, Vespucci, Francesca Maggioni, Paolo Pisciella, Stefano Zigrino, Sebastiano Vitali (+42 others)
unpublished
This extension maintains the global convergence property on uniformly convex functions for the L-BFGS method, which uses values of the steplength that satisfy the Wolfe-Powell conditions.  ...  for Large Scale Optimization. In 2011, we extended the damped BFGS method of Powell (1978), which is useful for solving constrained optimization problems that use Lagrange functions (see for example  ...  For a given historical data set, we construct two types of confidence sets for the ambiguous distribution through nonparametric statistical estimation of its moments and density functions, depending on  ...
fatcat:b2q6nnj52nbudkna2lpsjmhtlq

Solving forward and inverse Helmholtz equations via controllability methods

Jet Hoe Tang, Marcus J. Grote, Martin J. Gander
2020 unpublished
Grote for giving me the opportunity of doing my PhD and for his guidance, advice, and support.  ...  Acknowledgements: This thesis was written at the Department of Mathematics and Computer Science at the University of Basel and was partly supported by the Swiss National Science Foundation.  ...  To determine the line step ρ, we use the Wolfe-Powell or Armijo line search methods [117]. For H = Hess(Ĵ)[β], the Hessian matrix of Ĵ, (5.18) corresponds to the classical Newton iteration.  ...
doi:10.5451/unibas-007171297 fatcat:6adma25frfeidpljfnehnqmcra
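The snippet above determines the line step ρ with a Wolfe-Powell or Armijo line search. As a generic illustration rather than the thesis's implementation (the function names and constants are illustrative), a basic Armijo backtracking search looks like this:

```python
def armijo_backtracking(f, grad_f, x, d, eta0=1.0, c1=1e-4, shrink=0.5, max_iter=50):
    """Basic Armijo backtracking for NumPy-array arguments x and d:
    shrink the trial step until the sufficient decrease condition
        f(x + eta*d) <= f(x) + c1 * eta * grad_f(x) @ d
    holds, where d is a descent direction."""
    fx = f(x)
    slope = grad_f(x) @ d  # negative for a descent direction
    eta = eta0
    for _ in range(max_iter):
        if f(x + eta * d) <= fx + c1 * eta * slope:
            return eta
        eta *= shrink
    return eta  # fall back to the last (small) step if the loop exhausts
```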