
A family of three-term nonlinear conjugate gradient methods close to the memoryless BFGS method

  • Original Paper
  • Published in: Optimization Letters (2018)

Abstract

Based on the memoryless BFGS quasi-Newton method, a family of three-term nonlinear conjugate gradient methods is proposed. The directions generated by the new methods satisfy the sufficient descent condition regardless of the line search used. Global convergence results are established when the line search fulfills either the Wolfe or the Armijo conditions, and the r-linear convergence rate of the methods is analyzed as well. Numerical comparisons show that the proposed methods are efficient on unconstrained optimization problems from the CUTEr library.
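The paper's exact update formulas are not reproduced on this page. For background, the classical memoryless BFGS direction of Shanno (1978), to which the proposed family stays close, already has a three-term form: applying one BFGS update to the identity matrix and multiplying by the negative gradient gives d_{k+1} = -g_{k+1} + beta_k s_k + gamma_k y_k, where s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k. The sketch below implements that classical direction in NumPy; the function name and the curvature safeguard are illustrative choices, not taken from the paper, whose family modifies the coefficients so that sufficient descent holds for any line search.

```python
import numpy as np

def memoryless_bfgs_direction(g, s, y):
    """Three-term form of the classical memoryless BFGS direction
    (Shanno 1978), NOT the paper's modified family:
        d = -g + beta * s + gamma * y,
    obtained by applying one BFGS update to the identity matrix
    and multiplying the resulting inverse-Hessian approximation by -g.
    Inputs: g = gradient at the new iterate, s = x_new - x_old,
    y = g_new - g_old (all 1-D NumPy arrays)."""
    sy = s @ y  # curvature s^T y; positive when the Wolfe conditions hold
    # Illustrative safeguard (not from the paper): if curvature is too
    # small, fall back to the steepest descent direction.
    if sy <= 1e-10 * np.linalg.norm(s) * np.linalg.norm(y):
        return -g
    gs, gy = g @ s, g @ y
    beta = (gy - (1.0 + (y @ y) / sy) * gs) / sy   # coefficient of s
    gamma = gs / sy                                # coefficient of y
    return -g + beta * s + gamma * y
```

Since this direction equals -H g for a positive definite matrix H whenever s^T y > 0 (which the Wolfe conditions guarantee), it is automatically a descent direction; the contribution of the paper is a nearby family of three-term directions whose sufficient descent property no longer depends on the line search.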



Acknowledgements

This work was supported by Chinese NSF Grant No. 11401242 and SFED Grant No. 14B139 of Hunan Province.

Author information


Corresponding author

Correspondence to Min Li.


About this article


Cite this article

Li, M. A family of three-term nonlinear conjugate gradient methods close to the memoryless BFGS method. Optim Lett 12, 1911–1927 (2018). https://doi.org/10.1007/s11590-017-1205-y
