Keywords:
BFGS method; line search; global complexity bound
Abstract:
We present a BFGS-type method for solving the unconstrained optimization problem, which combines the MBFGS method and the CBFGS method proposed by Li and Fukushima in [4, 5]. This hybrid scheme fully exploits the advantages of both methods: it eventually reduces to the standard BFGS method for locally strongly convex functions, and it regularizes nonconvex functions in the singular case. We show that the proposed method converges globally and superlinearly. Moreover, we establish a global complexity bound of O(ε^{-2}) for this method. Some preliminary numerical results are also reported.
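The cautious-update idea behind the CBFGS ingredient can be illustrated with a short sketch. This is a hypothetical, minimal rendering of a cautious BFGS inverse-Hessian update, not the paper's exact scheme: the update is applied only when a curvature test of the form y^T s / ||s||^2 >= eps * ||g||^alpha holds, and is skipped otherwise, which keeps the approximation well behaved on nonconvex problems. The names and threshold values are illustrative assumptions.

```python
import numpy as np

def cautious_bfgs_update(H, s, y, g_norm, eps=1e-6, alpha=1.0):
    """One cautious BFGS update of the inverse Hessian approximation H.

    s = x_{k+1} - x_k, y = g_{k+1} - g_k, g_norm = ||g_k||.
    The standard update is applied only when the curvature condition
    y^T s / ||s||^2 >= eps * ||g||^alpha holds; otherwise H is kept
    unchanged (the "cautious" skip for the nonconvex/singular case).
    """
    sy = float(y @ s)
    if sy / float(s @ s) < eps * g_norm**alpha:
        return H  # skip the update: curvature test failed
    rho = 1.0 / sy
    I = np.eye(len(s))
    V = I - rho * np.outer(s, y)
    # Sherman-Morrison form of the BFGS inverse update
    return V @ H @ V.T + rho * np.outer(s, s)
```

When the update is applied, the new matrix satisfies the secant condition H_new @ y == s, which is what makes the method collapse to standard BFGS once the iterates enter a strongly convex region.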
Journal:
Abstract and Applied Analysis, 2015, vol. 2015: 1-2. ISSN: 1085-3375
Corresponding author:
Yuan, G.
Author affiliations:
[Gaohang Yu] School of Mathematics and Computer Sciences, Gannan Normal University, Ganzhou 341000, China;
[Li Zhang] Department of Mathematics, Changsha University of Science and Technology, Changsha 410004, China;
[Yunhai Xiao] College of Mathematics and Information Science, Henan University, Kaifeng 475000, China;
[Neculai Andrei] Research Institute for Informatics (ICI), 8-10 Averescu Avenue, 71316 Bucharest 1, Romania;
[Gonglin Yuan] College of Mathematics and Information Science, Guangxi University, Nanning, Guangxi 530004, China
Corresponding institution:
[Yuan, G.] College of Mathematics and Information Science, Guangxi University, China
Abstract:
It is well known that the HS method and the PRP method may fail to converge for nonconvex optimization even with an exact line search. Several globalization techniques have been proposed, for instance, the PRP+ globalization technique and the Grippo-Lucidi globalization technique for the PRP method. In this paper, we propose a new, efficient globalization technique for general nonlinear conjugate gradient methods for nonconvex minimization. The new technique makes full use of the information in the previous search direction. Under suitable conditions, we prove that nonlinear conjugate gradient methods equipped with this technique are globally convergent for nonconvex minimization when the line search satisfies the Wolfe conditions or the Armijo condition. Extensive numerical experiments are reported to show the efficiency of the proposed technique.
Corresponding institution:
[Zhou, Weijun] Changsha Univ Sci & Technol, Coll Math & Computat Sci, Changsha 410076, Hunan, Peoples R China.
Keywords:
Optimization; Global convergence; Line search; MBFGS method; Nonconvex minimization; Nonmonotone line search; Numerical experiments; Test problems; Unconstrained minimization; Convergence of numerical methods
Abstract:
In this paper, we propose a new nonmonotone Armijo-type line search and prove that the MBFGS method proposed by Li and Fukushima converges globally for nonconvex minimization with this new line search. Some numerical experiments show that this nonmonotone MBFGS method is efficient for the given test problems. (C) 2007 Elsevier B.V. All rights reserved.
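A nonmonotone Armijo-type acceptance rule of the kind referenced here can be sketched as follows. This is the classical Grippo-Lampariello-Lucidi form, given only as an illustrative assumption; the paper's exact rule may differ in its reference value and constants.

```python
def nonmonotone_armijo(f, x, d, g_dot_d, f_hist, sigma=1e-4, rho=0.5, max_iter=50):
    """Backtracking nonmonotone Armijo-type line search (illustrative sketch).

    Accepts a step length alpha once
        f(x + alpha * d) <= max(recent f values) + sigma * alpha * g^T d,
    where f_hist holds the last few objective values and g_dot_d < 0 is
    the directional derivative along the descent direction d.
    """
    f_ref = max(f_hist)          # nonmonotone reference value
    alpha = 1.0
    for _ in range(max_iter):
        if f(x + alpha * d) <= f_ref + sigma * alpha * g_dot_d:
            return alpha
        alpha *= rho             # backtrack
    return alpha
```

Replacing the reference value max(f_hist) with the current f(x) recovers the ordinary (monotone) Armijo rule; the nonmonotone variant lets the objective rise occasionally, which helps BFGS-type methods escape narrow curved valleys.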
Abstract:
Although the Liu-Storey (LS) nonlinear conjugate gradient method has a structure similar to the well-known Polak-Ribiere-Polyak (PRP) and Hestenes-Stiefel (HS) methods, it has received comparatively little attention. In this paper, based on the memoryless BFGS quasi-Newton method, we propose a new LS-type method that converges globally for general functions with the Grippo-Lucidi line search. Moreover, we modify this new LS method so that the modified scheme is globally convergent for nonconvex minimization when the strong Wolfe line search is used. Numerical results are also reported. (C) 2008 Elsevier B.V. All rights reserved.
Abstract:
In this paper, we propose two new hybrid nonlinear conjugate gradient methods, each of which produces a sufficient descent search direction at every iteration. This property depends neither on the line search used nor on the convexity of the objective function. Under suitable conditions, we prove that the proposed methods converge globally for general nonconvex functions. The numerical results show that both hybrid methods are efficient for the given test problems from the CUTE library. (C) 2007 Elsevier B.V. All rights reserved.
Keywords:
Function evaluation; Global optimization; Global convergence; Three-term conjugate gradient method; Convergence of numerical methods
Abstract:
In this paper, we propose a three-term conjugate gradient method that satisfies the sufficient descent condition, that is, [image omitted]. This property is independent of the line search used. With an exact line search, the method reduces to the standard Hestenes-Stiefel conjugate gradient method. We also introduce two variants of the proposed method that preserve the sufficient descent property, and prove that both variants converge globally with the standard Wolfe line search even if the minimization function is nonconvex. We also report some numerical experiments to show the efficiency of the proposed methods.
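A representative three-term HS-type direction from the literature shows how a sufficient descent condition can hold identically, independent of the line search. This sketch is an assumption for illustration, not necessarily the paper's exact formula (which is elided above as [image omitted]): with beta = g_new^T y / (d^T y) and theta = g_new^T d / (d^T y), the direction d_new = -g_new + beta*d - theta*y satisfies g_new^T d_new = -||g_new||^2 by construction.

```python
import numpy as np

def three_term_hs_direction(g_new, d, y):
    """Three-term Hestenes-Stiefel-type search direction (sketch).

    g_new = gradient at the new iterate, d = previous direction,
    y = gradient difference g_new - g_old. The two correction terms
    cancel in g_new^T d_new, leaving exactly -||g_new||^2.
    """
    dy = float(d @ y)                 # assumed nonzero (e.g. Wolfe step)
    beta = float(g_new @ y) / dy      # HS coefficient
    theta = float(g_new @ d) / dy     # third-term coefficient
    return -g_new + beta * d - theta * y
```

With an exact line search g_new^T d = 0, so theta vanishes and the direction collapses to the standard HS update, matching the reduction claimed in the abstract.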
Journal:
Optimization Methods and Software, 2007, 22(3): 511-517. ISSN: 1055-6788
Corresponding author:
Zhou, Weijun
Author affiliations:
[Zhou, Weijun] Changsha Univ Sci & Technol, Coll Math & Computat Sci, Changsha 410077, Peoples R China.;Hunan Univ, Coll Math & Econometr, Changsha 410082, Peoples R China.
Corresponding institution:
[Zhou, Weijun] Changsha Univ Sci & Technol, Coll Math & Computat Sci, Changsha 410077, Peoples R China.
Keywords:
Constrained optimization; Function evaluation; Gradient methods; Numerical methods; Problem solving; Armijo line search; DY method; Global convergence; Unconstrained optimization problems; Convergence of numerical methods
Abstract:
Global convergence of conjugate gradient methods for unconstrained optimization problems generally requires the Wolfe or strong Wolfe line search. In this article, we propose a cautious DY conjugate gradient method and prove that this method with the Armijo line search converges globally if the objective function has Lipschitz continuous gradients. We also present some preliminary numerical results to show the efficiency of the proposed method.
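The DY coefficient at the heart of this method can be sketched together with a hypothetical "cautious" safeguard. The standard Dai-Yuan formula is beta = ||g_new||^2 / (d^T y) with y = g_new - g_old; the restart condition used below (set beta = 0 when d^T y is not safely positive) is only an illustrative assumption, and the paper's precise cautious rule may differ.

```python
import numpy as np

def cautious_dy_beta(g_new, g_old, d, eps=1e-10):
    """Dai-Yuan conjugate gradient coefficient with a cautious safeguard.

    g_new, g_old = gradients at the new and old iterates, d = previous
    search direction. Falls back to beta = 0 (i.e. a steepest descent
    restart) when the denominator d^T y is not safely positive, which
    is what lets the method get by with only an Armijo line search.
    """
    y = g_new - g_old
    dy = float(d @ y)
    if dy <= eps * float(np.linalg.norm(d)) * float(np.linalg.norm(y)):
        return 0.0  # restart: next direction is -g_new
    return float(g_new @ g_new) / dy
```

Under the Wolfe conditions d^T y > 0 is guaranteed automatically, which is why the safeguard only matters when the cheaper Armijo rule is used.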