Abstract
Global convergence results are derived for well-known conjugate gradient methods in which the line search step is replaced by a step whose length is determined by a formula. The results cover the following cases: (1) the Fletcher–Reeves method, the Hestenes–Stiefel method, and the Dai–Yuan method applied to a strongly convex LC¹ objective function; (2) the Polak–Ribière method and the Conjugate Descent method applied to a general, not necessarily convex, LC¹ objective function.
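To illustrate the idea of replacing the line search with a closed-form step, here is a minimal sketch of the Fletcher–Reeves method in which the step length is computed by a formula. The step rule alpha_k = -delta * (g_k·d_k) / (L * ||d_k||^2), which assumes the gradient is L-Lipschitz, is an illustrative stand-in and is not taken from the paper; the authors' exact formula is not reproduced in this abstract.

```python
import numpy as np

def fr_cg_fixed_step(grad, x0, L, delta=0.5, tol=1e-8, max_iter=5000):
    """Fletcher-Reeves CG where the line search is replaced by a step formula.

    The step rule alpha_k = -delta * (g_k . d_k) / (L * ||d_k||^2) is a
    hypothetical choice assuming grad is L-Lipschitz; it stands in for the
    paper's formula, which is not given in the abstract.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                 # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Formula-based step length: no function evaluations, no line search.
        alpha = -delta * (g @ d) / (L * (d @ d))
        x = x + alpha * d
        g_new = grad(x)
        beta = (g_new @ g_new) / (g @ g)   # Fletcher-Reeves coefficient
        d = -g_new + beta * d
        g = g_new
    return x

# Usage on a strongly convex quadratic f(x) = 0.5 x^T A x - b^T x,
# where L is the largest eigenvalue of A (the gradient's Lipschitz constant).
A = np.diag([1.0, 4.0])
b = np.array([1.0, 1.0])
x_star = fr_cg_fixed_step(lambda z: A @ z - b, np.zeros(2), L=4.0)
```

Because the step is given in closed form, each iteration needs only one gradient evaluation, which is the practical appeal of line-search-free schemes of this kind.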
References
M. Al-Baali, Descent property and global convergence of the Fletcher-Reeves method with inexact line search, IMA J. Numer. Anal. 5 (1985) 121–124.
Y.H. Dai, J.Y. Han, G.H. Liu, D.F. Sun, H.X. Yin and Y.X. Yuan, Convergence properties of nonlinear conjugate gradient methods, SIAM J. Optim. 10 (1999) 345–358.
Y.H. Dai and Y. Yuan, Convergence properties of the conjugate descent method, Advances in Mathematics 25 (1996) 552–562.
Y.H. Dai and Y. Yuan, Convergence properties of the Fletcher-Reeves method, IMA J. Numer. Anal. 16 (1996) 155–164.
Y.H. Dai and Y. Yuan, A nonlinear conjugate gradient method with a strong global convergence property, SIAM J. Optim. 10 (1999) 177–182.
R. Fletcher, Practical Methods of Optimization, Vol. I: Unconstrained Optimization, 2nd edn. (Wiley, New York, 1987).
R. Fletcher and C. Reeves, Function minimization by conjugate gradients, Comput. J. 7 (1964) 149–154.
J.C. Gilbert and J. Nocedal, Global convergence properties of conjugate gradient methods for optimization, SIAM J. Optim. 2 (1992) 21–42.
M.R. Hestenes and E. Stiefel, Methods of conjugate gradients for solving linear systems, J. Res. Nat. Bur. Stand. 49 (1952) 409–436.
Y. Hu and C. Storey, Global convergence result for conjugate gradient methods, J. Optim. Theory Appl. 71 (1991) 399–405.
G. Liu, J. Han and H. Yin, Global convergence of the Fletcher-Reeves algorithm with inexact line search, Appl. Math. J. Chinese Univ. Ser. B 10 (1995) 75–82.
B.T. Polyak, The conjugate gradient method in extremal problems, USSR Comput. Math. Math. Phys. 9 (1969) 94–112.
E. Polak and G. Ribière, Note sur la convergence de méthodes de directions conjuguées, Rev. Fran. Informat. Rech. Opér. 16 (1969) 35–43.
M.J.D. Powell, Nonconvex minimization calculations and the conjugate gradient method, in: Lecture Notes in Mathematics 1066 (1984) pp. 121–141.
M.J.D. Powell, Convergence properties of algorithms for nonlinear optimization, SIAM Review 28 (1986) 487–500.
D. Touati-Ahmed and C. Storey, Efficient hybrid conjugate gradient techniques, J. Optim. Theory Appl. 64 (1990) 379–397.
P. Wolfe, Convergence conditions for ascent methods, SIAM Review 11 (1969) 226–235.
G. Zoutendijk, Nonlinear programming, computational methods, in: Integer and Nonlinear Programming, ed. J. Abadie (North-Holland, Amsterdam, 1970) pp. 37–86.
Sun, J., Zhang, J. Global Convergence of Conjugate Gradient Methods without Line Search. Annals of Operations Research 103, 161–173 (2001). https://doi.org/10.1023/A:1012903105391