
A new family of globally convergent conjugate gradient methods

  • SI: Derman: Optimization
Annals of Operations Research

Abstract

Conjugate gradient methods are an important class of methods for unconstrained optimization, especially for large-scale problems, and they have been studied extensively in recent years. In this paper, a new family of conjugate gradient methods is proposed for unconstrained optimization. The family includes two existing practical nonlinear conjugate gradient methods as special cases; every member produces a descent search direction at each iteration and converges globally provided that the line search satisfies the Wolfe conditions. Numerical experiments testing the efficiency of the new family indicate that it is promising. In addition, the methods related to this family are discussed in a unified way.
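For context, the generic nonlinear conjugate gradient iteration the abstract refers to can be sketched as follows. This is a minimal illustration only, not the family proposed in the paper: the two β formulas shown (Fletcher–Reeves and Dai–Yuan) are classical choices from the literature, and the function and parameter names are ours.

```python
# Sketch of a generic nonlinear conjugate gradient method with a Wolfe
# line search.  The specific family of the paper is NOT reproduced here;
# beta_fr and beta_dy are two classical update formulas used for context.
import numpy as np
from scipy.optimize import line_search  # enforces the Wolfe conditions


def beta_fr(g_new, g_old, d):
    # Fletcher-Reeves update
    return (g_new @ g_new) / (g_old @ g_old)


def beta_dy(g_new, g_old, d):
    # Dai-Yuan update; denominator is positive under the Wolfe conditions
    return (g_new @ g_new) / (d @ (g_new - g_old))


def nonlinear_cg(f, grad, x0, beta=beta_dy, tol=1e-6, max_iter=2000):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # start with steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # c2 = 0.1 is a common curvature-parameter choice for CG methods
        alpha = line_search(f, grad, x, d, gfk=g, c2=0.1)[0]
        if alpha is None:                    # line search failed: restart
            d, alpha = -g, 1e-4
        x = x + alpha * d
        g_new = grad(x)
        d = -g_new + beta(g_new, g, d) * d   # new search direction
        g = g_new
    return x


# Usage example: minimize the Rosenbrock function from a standard start
f = lambda x: (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2
grad = lambda x: np.array([
    -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0] ** 2),
    200 * (x[1] - x[0] ** 2),
])
x_star = nonlinear_cg(f, grad, [-1.2, 1.0])
```

Any globally convergent member of a family such as the paper's plugs into this loop through its own choice of β, which is what makes unified convergence analyses of such families useful in practice.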



Acknowledgments

We would like to thank Professor Paul Armand (University of Limoges, France), who has always been generous with his time and advice.

Author information

Authors and Affiliations

Authors

Corresponding author

Correspondence to B. Sellami.

Ethics declarations

Conflicts of interest

The authors declare that they have no conflict of interest.


About this article

Cite this article

Sellami, B., Chaib, Y. A new family of globally convergent conjugate gradient methods. Ann Oper Res 241, 497–513 (2016). https://doi.org/10.1007/s10479-016-2120-9

