On Restart Procedures for the Conjugate Gradient Method

Published in: Numerical Algorithms

Abstract

The conjugate gradient method is a powerful scheme for solving unconstrained optimization problems, especially large-scale ones. However, without restarts the method converges only linearly. In this paper, we build on an idea contained in [16] and present a new restart technique for the method. Given an arbitrary descent direction d_t and the gradient g_t, our key idea is to use the BFGS updating formula to construct a symmetric positive definite matrix P_t such that d_t = −P_t g_t, and then to define the conjugate gradient iteration in the transformed space. Two conjugate gradient algorithms are designed based on the new restart technique. Their global convergence is proved under mild assumptions on the objective function. Numerical experiments are also reported, showing that the two algorithms are comparable to the Beale–Powell restart algorithm.
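For context, the classical scheme the paper refines can be sketched as a nonlinear conjugate gradient iteration that periodically resets the search direction to steepest descent. The snippet below is a generic illustration only, not the authors' BFGS-transformed algorithm: the Polak–Ribière update, the PR+ safeguard, and the Armijo backtracking line search are all standard choices assumed here for the sketch.

```python
import numpy as np

def restarted_cg(f, grad, x0, restart_every=None, tol=1e-8, max_iter=5000):
    """Nonlinear CG (Polak-Ribiere) with periodic restart to steepest descent.

    A generic sketch of the classical restarted scheme, not the paper's
    BFGS-transformed variant: every `restart_every` steps the direction is
    reset to -gradient, which restores the n-step behaviour that plain
    nonlinear CG lacks.
    """
    x = np.asarray(x0, dtype=float)
    if restart_every is None:
        restart_every = x.size  # restart at least every n iterations
    g = grad(x)
    d = -g
    for k in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking line search (sufficient decrease).
        alpha, fx = 1.0, f(x)
        while f(x + alpha * d) > fx + 1e-4 * alpha * g.dot(d):
            alpha *= 0.5
            if alpha < 1e-16:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        if (k + 1) % restart_every == 0:
            d = -g_new                                  # periodic restart
        else:
            beta = g_new.dot(g_new - g) / g.dot(g)      # Polak-Ribiere beta
            beta = max(beta, 0.0)                       # PR+ safeguard
            d = -g_new + beta * d
            if g_new.dot(d) >= 0:                       # not descent: restart
                d = -g_new
        x, g = x_new, g_new
    return x

# Example: minimize the convex quadratic f(x) = 0.5 x'Ax - b'x,
# whose unique minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x.dot(A.dot(x)) - b.dot(x)
grad = lambda x: A.dot(x) - b
x_star = restarted_cg(f, grad, np.zeros(2))
```

On a strongly convex quadratic like this, the iteration drives the gradient norm below the tolerance in a handful of steps; the restart logic matters on nonquadratic objectives, where accumulated conjugacy degrades and the reset recovers descent.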


References

  1. E.M.L. Beale, A derivation of conjugate gradients, in: Numerical Methods for Nonlinear Optimization, ed. F.A. Lootsma (Academic Press, London, 1972) pp. 39–43.

  2. E.G. Birgin and J.M. Martínez, A spectral conjugate gradient method for unconstrained optimization, Appl. Math. Optim. 43 (2001) 117–128.

  3. C.G. Broyden, The convergence of a class of double rank minimization algorithms: 2. The new algorithms, J. Inst. Math. Appl. 6 (1970) 222–231.

  4. A. Buckley, Conjugate gradient methods, in: Nonlinear Optimization, ed. M.J.D. Powell (Academic Press, London, 1982) pp. 17–22.

  5. A. Cohen, Rate of convergence of several conjugate gradient algorithms, SIAM J. Numer. Anal. 9 (1972) 248–259.

  6. H.P. Crowder and P. Wolfe, Linear convergence of the conjugate gradient method, IBM J. Res. Development 16 (1972) 431–433.

  7. Y.H. Dai and Y. Yuan, Convergence properties of the Beale-Powell restart algorithm, Sci. China Ser. A 41 (1998) 1142–1150.

  8. Y.H. Dai and Y. Yuan, A three-parameter family of nonlinear conjugate gradient methods, Math. Comp. (to appear).

  9. R. Fletcher and C. Reeves, Function minimization by conjugate gradients, Comput. J. 7 (1964) 149–154.

  10. J.C. Gilbert and J. Nocedal, Global convergence properties of conjugate gradient methods for optimization, SIAM J. Optim. 2(1) (1992) 21–42.

  11. G.P. McCormick and K. Ritter, Alternative proofs of the convergence properties of the conjugate gradient method, J. Optim. Theory Appl. 13(5) (1975) 497–518.

  12. M.F. McGuire and P. Wolfe, Evaluating a restart procedure for conjugate gradients, Report RC-4382, IBM Research Center, Yorktown Heights (1973).

  13. J.J. Moré, B.S. Garbow and K.E. Hillstrom, Testing unconstrained optimization software, ACM Trans. Math. Software 7 (1981) 17–41.

  14. E. Polak and G. Ribière, Note sur la convergence de méthodes de directions conjuguées, Rev. Française Inform. Rech. Oper. 3e Année 16 (1969) 35–43.

  15. B.T. Polyak, The conjugate gradient method in extremal problems, USSR Comput. Math. Math. Phys. 9 (1969) 94–112.

  16. M.J.D. Powell, Restart procedures for the conjugate gradient method, Math. Program. 12 (1977) 241–254.

  17. M.J.D. Powell, Nonconvex minimization calculations and the conjugate gradient method, in: Lecture Notes in Mathematics, Vol. 1066 (Springer, New York, 1984) pp. 122–141.

  18. D.F. Shanno and K.H. Phua, Remark on algorithm 500: Minimization of unconstrained multivariate functions, ACM Trans. Math. Software 6 (1980) 618–622.

  19. D.F. Shanno, Conjugate gradient methods with inexact line searches, Math. Oper. Res. 3 (1978) 244–256.

  20. Y. Yuan, Numerical Methods for Nonlinear Programming (Shanghai Scientific & Technical, 1993) (in Chinese).

Author information

Corresponding author

Correspondence to Yu-Hong Dai.

About this article

Cite this article

Dai, YH., Liao, LZ. & Li, D. On Restart Procedures for the Conjugate Gradient Method. Numerical Algorithms 35, 249–260 (2004). https://doi.org/10.1023/B:NUMA.0000021761.10993.6e
