Some modified conjugate gradient methods for unconstrained optimization

https://doi.org/10.1016/j.cam.2016.04.004

Abstract

Conjugate gradient methods are highly useful for solving large-scale optimization problems because they do not require the storage of any matrices. Motivated by the construction of the conjugate gradient parameter in several existing conjugate gradient methods, we propose four modified conjugate gradient methods, named NVLS, NVPRP*, NVHS* and NVLS*, respectively. We prove that these methods with the strong Wolfe line search possess the sufficient descent property, and are globally convergent when the parameter in the line search conditions is restricted to a suitable interval. Preliminary numerical results show that the NVPRP*, NVHS* and NVLS* methods are more efficient than many existing conjugate gradient methods on a large number of test problems from the CUTEr collection.
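To illustrate the class of methods the abstract refers to, the following is a minimal sketch of a generic nonlinear conjugate gradient iteration. It uses the classical PRP+ parameter and a simple backtracking Armijo line search as a stand-in for the strong Wolfe search analyzed in the paper; the specific NVLS, NVPRP*, NVHS* and NVLS* parameter formulas are not reproduced here.

```python
import numpy as np

def nonlinear_cg(f, grad, x0, tol=1e-6, max_iter=500):
    """Nonlinear CG with the PRP+ parameter (illustrative sketch only).

    A backtracking Armijo line search replaces the strong Wolfe
    search used in the paper, to keep the example self-contained.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                              # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking Armijo line search: shrink alpha until
        # f(x + alpha d) <= f(x) + c * alpha * g^T d holds.
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c * alpha * (g @ d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        # PRP+ conjugate gradient parameter: beta = max(0, PRP)
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))
        d = -g_new + beta * d
        # Safeguard: restart with steepest descent if d is not descent.
        if g_new @ d >= 0:
            d = -g_new
        x, g = x_new, g_new
    return x

# Example on a convex quadratic f(x) = 0.5 x^T A x - b^T x,
# whose minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = nonlinear_cg(f, grad, np.zeros(2))
```

On the quadratic above the iterates converge to the solution of A x = b; no storage beyond a few vectors is needed, which is the property the abstract highlights for large-scale problems.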

MSC

90C30
90C25

Keywords

Unconstrained optimization
Conjugate gradient method
Line search
Sufficient descent property
Global convergence

This work was partially supported by the National Natural Science Foundation of China (Grant 11171363) and the Natural Science Foundation of Chongqing (Grants cstc2015jcyjBX0029, cstc2013jjB00001, cstc2013jcyjA00021).