Abstract
Recently, we proposed a nonlinear conjugate gradient method that produces a descent search direction at every iteration and converges globally provided that the line search satisfies the weak Wolfe conditions. In this paper, we study methods related to this new nonlinear conjugate gradient method. Specifically, if the ratio of the scalar β_k to the one used in the new method lies in a certain interval, then the corresponding methods are proved to be globally convergent; otherwise, we construct a convex quadratic example showing that the methods need not converge. Numerical experiments are reported for two combinations of the new method and the Hestenes–Stiefel conjugate gradient method. The initial results show that one of the hybrid methods is especially efficient for the given test problems.
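To make the setting concrete, the following is a minimal sketch of a nonlinear conjugate gradient iteration with a hybrid choice of β_k. It assumes the hybrid rule β_k = max(0, min(β_k^HS, β_k^DY)), one combination of the Hestenes–Stiefel and Dai–Yuan formulas discussed in this line of work, and substitutes a simple Armijo backtracking line search with a steepest-descent restart safeguard for the weak Wolfe line search analyzed in the paper; all function and variable names here are illustrative, not from the original.

```python
import numpy as np

def hybrid_cg(f, grad, x0, tol=1e-6, max_iter=200):
    """Nonlinear CG sketch with a hybrid beta (assumed rule:
    beta_k = max(0, min(beta_HS, beta_DY)))."""
    x = x0.astype(float)
    g = grad(x)
    d = -g  # initial search direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking line search (a stand-in for the
        # weak Wolfe line search used in the convergence theory).
        alpha, c1 = 1.0, 1e-4
        while f(x + alpha * d) > f(x) + c1 * alpha * g.dot(d):
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g                 # gradient difference y_k
        denom = d.dot(y)              # shared denominator d_k^T y_k
        if abs(denom) < 1e-12:
            beta = 0.0                # safeguard against division by ~0
        else:
            beta_hs = g_new.dot(y) / denom      # Hestenes-Stiefel
            beta_dy = g_new.dot(g_new) / denom  # Dai-Yuan
            beta = max(0.0, min(beta_hs, beta_dy))
        d = -g_new + beta * d
        if g_new.dot(d) >= 0:         # safeguard: restart if not descent
            d = -g_new
        x, g = x_new, g_new
    return x

# Usage: minimize a small convex quadratic f(x) = 0.5 x^T A x - b^T x,
# whose unique minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x.dot(A).dot(x) - b.dot(x)
grad = lambda x: A.dot(x) - b
x_star = hybrid_cg(f, grad, np.array([5.0, -3.0]))
```

Because both candidate formulas share the denominator d_k^T y_k, the hybrid rule only changes the numerator, so the extra cost per iteration over a single-formula method is negligible.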
References
M. Al-Baali, Descent property and global convergence of the Fletcher-Reeves method with inexact line search, IMA J. Numer. Anal. 5 (1985) 121–124.
Y.H. Dai, New properties of a nonlinear conjugate gradient method, Numer. Math. 89 (2001) 83–98.
Y.H. Dai and Y. Yuan, Convergence properties of the Fletcher-Reeves method, IMA J. Numer. Anal. 16(2) (1996) 155–164.
Y.H. Dai and Y. Yuan, Convergence properties of the conjugate descent method, Advances in Mathematics 6 (1996) 552–562.
Y.H. Dai and Y. Yuan, Some properties of a new conjugate gradient method, in: Advances in Nonlinear Programming, ed. Y. Yuan (Kluwer Academic, Boston, 1998) pp. 251–262.
Y.H. Dai and Y. Yuan, A nonlinear conjugate gradient method with a strong global convergence property, SIAM J. Optimization 10(1) (1999) 177–182.
R. Fletcher, Practical Methods of Optimization, Vol. 1, Unconstrained Optimization (Wiley, New York, 1987).
R. Fletcher and C. Reeves, Function minimization by conjugate gradients, Comput. J. 7 (1964) 149–154.
J.C. Gilbert and J. Nocedal, Global convergence properties of conjugate gradient methods for optimization, SIAM J. Optimization 2(1) (1992) 21–42.
M.R. Hestenes and E.L. Stiefel, Methods of conjugate gradients for solving linear systems, J. Res. Nat. Bur. Standards 49 (1952) 409–436.
Y.F. Hu and C. Storey, Global convergence result for conjugate gradient methods, JOTA 71(2) (1991) 399–405.
G.H. Liu, J.Y. Han and H.X. Yin, Global convergence of the Fletcher-Reeves algorithm with an inexact line search, Appl. Math. J. Chinese Univ. Ser. B 10 (1995) 75–82.
J.J. Moré, B.S. Garbow and K.E. Hillstrom, Testing unconstrained optimization software, ACM Transactions on Mathematical Software 7 (1981) 17–41.
E. Polak and G. Ribière, Note sur la convergence de directions conjuguées, Rev. Française Informat. Recherche Opérationnelle, 3e année 16 (1969) 35–43.
B.T. Polyak, The conjugate gradient method in extremal problems, Comput. Math. Math. Phys. 9 (1969) 94–112.
M.J.D. Powell, Restart procedures for the conjugate gradient method, Math. Programming 12 (1977) 241–254.
M.J.D. Powell, Nonconvex minimization calculations and the conjugate gradient method, in: Lecture Notes in Mathematics 1066 (Springer, Berlin, 1984) pp. 122–141.
D. Touati-Ahmed and C. Storey, Efficient hybrid conjugate gradient techniques, JOTA 64 (1990) 379–397.
P. Wolfe, Convergence conditions for ascent methods, SIAM Review 11 (1969) 226–235.
P. Wolfe, Convergence conditions for ascent methods. II: Some corrections, SIAM Review 13 (1971) 185–188.
G. Zoutendijk, Nonlinear programming, computational methods, in: Integer and Nonlinear Programming, ed. J. Abadie (North-Holland, Amsterdam, 1970) pp. 37–86.
Cite this article
Dai, Y., Yuan, Y. An Efficient Hybrid Conjugate Gradient Method for Unconstrained Optimization. Annals of Operations Research 103, 33–47 (2001). https://doi.org/10.1023/A:1012930416777