
On the sufficient descent property of the Shanno’s conjugate gradient method

  • Short Communication
  • Published in Optimization Letters

Abstract

Satisfying the sufficient descent condition is a strength of a conjugate gradient method. Here, it is shown that, under the Wolfe line search conditions, the search directions generated by the memoryless BFGS conjugate gradient algorithm proposed by Shanno satisfy the sufficient descent condition for uniformly convex objective functions.
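
For orientation, the following LaTeX sketch states the standard forms of the conditions named above. It is a minimal sketch under assumed conventional notation (g_k = grad f(x_k), s_k = x_{k+1} - x_k, y_k = g_{k+1} - g_k, and constants c, delta, sigma, mu), not an excerpt from the paper; the direction formula is the memoryless BFGS direction commonly attributed to Shanno [22].

% Minimal, self-contained LaTeX sketch of the conditions named in the
% abstract. Notation is conventional and assumed here, not quoted from
% the paper: g_k = \nabla f(x_k), s_k = x_{k+1} - x_k, y_k = g_{k+1} - g_k.
\documentclass{article}
\usepackage{amsmath}
\begin{document}

% Sufficient descent condition: d_k is a descent direction uniformly
% in k, for some constant c > 0.
\[
  g_k^{T} d_k \le -c\,\|g_k\|^{2} \quad \text{for all } k \ge 0.
\]

% Standard Wolfe line search conditions on the step length alpha_k,
% with 0 < \delta < \sigma < 1.
\begin{align*}
  f(x_k + \alpha_k d_k) &\le f(x_k) + \delta\,\alpha_k\, g_k^{T} d_k, \\
  g_{k+1}^{T} d_k &\ge \sigma\, g_k^{T} d_k.
\end{align*}

% Uniform convexity of f: for some \mu > 0 and all x, y,
\[
  \bigl(\nabla f(x) - \nabla f(y)\bigr)^{T}(x - y) \ge \mu\,\|x - y\|^{2}.
\]

% Memoryless BFGS direction (the BFGS update applied to the identity
% matrix), in the form commonly attributed to Shanno [22]:
\[
  d_{k+1} = -g_{k+1}
  + \frac{y_k^{T} g_{k+1}}{s_k^{T} y_k}\, s_k
  + \frac{s_k^{T} g_{k+1}}{s_k^{T} y_k}\, y_k
  - \Bigl(1 + \frac{y_k^{T} y_k}{s_k^{T} y_k}\Bigr)
    \frac{s_k^{T} g_{k+1}}{s_k^{T} y_k}\, s_k.
\]

\end{document}

Under the second Wolfe condition, s_k^T y_k > 0 whenever d_k is a descent direction, so the direction above is well defined; the result summarized in the abstract is that a constant c > 0 exists in the first inequality when f is uniformly convex.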

References

  1. Andrei N.: Numerical comparison of conjugate gradient algorithms for unconstrained optimization. Stud. Inform. Control 16(4), 333–352 (2007)

  2. Andrei N.: A scaled BFGS preconditioned conjugate gradient algorithm for unconstrained optimization. Appl. Math. Lett. 20(6), 645–650 (2007)

  3. Andrei N.: Scaled conjugate gradient algorithms for unconstrained optimization. Comput. Optim. Appl. 38(3), 401–416 (2007)

  4. Andrei N.: Scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization. Optim. Methods Softw. 22(4), 561–571 (2007)

  5. Andrei N.: Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization. Eur. J. Oper. Res. 204(3), 410–420 (2010)

  6. Birgin E., Martínez J.M.: A spectral conjugate gradient method for unconstrained optimization. Appl. Math. Optim. 43(2), 117–128 (2001)

  7. Broyden C.G.: The convergence of a class of double-rank minimization algorithms. II. The new algorithm. J. Inst. Math. Appl. 6(1), 222–231 (1970)

  8. Dai Y.H., Han J.Y., Liu G.H., Sun D.F., Yin H.X., Yuan Y.X.: Convergence properties of nonlinear conjugate gradient methods. SIAM J. Optim. 10(2), 348–358 (1999)

  9. Dai Y.H., Liao L.Z.: New conjugacy conditions and related nonlinear conjugate gradient methods. Appl. Math. Optim. 43(1), 87–101 (2001)

  10. Dai Y.H., Ni Q.: Testing different conjugate gradient methods for large-scale unconstrained optimization. J. Comput. Math. 22(3), 311–320 (2003)

  11. Dai Y.H., Yuan Y.X.: A nonlinear conjugate gradient method with a strong global convergence property. SIAM J. Optim. 10(1), 177–182 (1999)

  12. Du D., Pardalos P.M., Wu W.: Mathematical Theory of Optimization. Kluwer Academic Publishers, Dordrecht (2001)

  13. Fletcher R.: A new approach to variable metric algorithms. Comput. J. 13(3), 317–322 (1970)

  14. Gilbert J.C., Nocedal J.: Global convergence properties of conjugate gradient methods for optimization. SIAM J. Optim. 2(1), 21–42 (1992)

  15. Goldfarb D.: A family of variable-metric methods derived by variational means. Math. Comput. 24(109), 23–26 (1970)

  16. Hager W.W., Zhang H.: A new conjugate gradient method with guaranteed descent and an efficient line search. SIAM J. Optim. 16(1), 170–192 (2005)

  17. Hager W.W., Zhang H.: A survey of nonlinear conjugate gradient methods. Pacific J. Optim. 2(1), 35–58 (2006)

  18. Hestenes M.R., Stiefel E.: Methods of conjugate gradients for solving linear systems. J. Res. Natl. Bur. Stand. 49(6), 409–436 (1952)

  19. Perry A.: A modified conjugate gradient algorithm. Oper. Res. 26(6), 1073–1078 (1978)

  20. Powell, M.J.D.: Nonconvex minimization calculations and the conjugate gradient method. In: Griffiths, D.F. (ed.) Numerical Analysis (Dundee, 1983). Lecture Notes in Mathematics, vol. 1066, pp. 122–141. Springer, Berlin (1984)

  21. Shanno D.F.: Conditioning of quasi-Newton methods for function minimization. Math. Comput. 24(111), 647–656 (1970)

  22. Shanno D.F.: Conjugate gradient methods with inexact searches. Math. Oper. Res. 3(3), 244–256 (1978)

  23. Sun W., Yuan Y.X.: Optimization Theory and Methods: Nonlinear Programming. Springer, New York (2006)

  24. Wolfe P.: Convergence conditions for ascent methods. SIAM Rev. 11(2), 226–235 (1969)

  25. Wolfe P.: Convergence conditions for ascent methods, II. Some corrections. SIAM Rev. 13(2), 185–188 (1971)

Author information

Correspondence to Saman Babaie-Kafaki.

About this article

Cite this article

Babaie-Kafaki, S. On the sufficient descent property of the Shanno’s conjugate gradient method. Optim Lett 7, 831–837 (2013). https://doi.org/10.1007/s11590-012-0462-z
