Conjugate gradient methods using value of objective function for unconstrained optimization

  • Original Paper · Optimization Letters

Abstract

Conjugate gradient methods have been widely used to solve large-scale unconstrained optimization problems. The search directions of the conventional methods are defined using only the gradient of the objective function. This paper proposes two nonlinear conjugate gradient methods whose search directions also take into account information about the value of the objective function. We prove that both methods converge globally, and we compare them numerically with conventional methods. The results show that, with a slight modification to its search direction, one of our methods performs as well as the best conventional method, which employs the Hestenes–Stiefel formula.
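
The full text is subscription-only, so the proposed function-value-based search directions are not reproduced here. For context, below is a minimal sketch of the conventional nonlinear conjugate gradient iteration with the Hestenes–Stiefel formula, beta_k = g_{k+1}^T y_k / (d_k^T y_k) with y_k = g_{k+1} - g_k, i.e., the baseline the abstract compares against. The function name, the Armijo backtracking step (the convergence theory typically assumes a Wolfe line search), the restart safeguard, and the Rosenbrock test problem are all illustrative assumptions, not the authors' implementation.

    import numpy as np

    def hs_conjugate_gradient(f, grad, x0, tol=1e-6, max_iter=1000):
        """Conventional nonlinear CG with the Hestenes-Stiefel beta (sketch)."""
        x = np.asarray(x0, dtype=float)
        g = grad(x)
        d = -g                                  # d_0 = -g_0
        for _ in range(max_iter):
            if np.linalg.norm(g) < tol:
                break
            if g @ d >= 0:                      # safeguard: restart with steepest
                d = -g                          # descent if d is not a descent direction
            # Armijo backtracking stands in for the Wolfe search usually
            # required by the convergence theory.
            alpha, fx, gTd = 1.0, f(x), g @ d
            while f(x + alpha * d) > fx + 1e-4 * alpha * gTd:
                alpha *= 0.5
            x_new = x + alpha * d
            g_new = grad(x_new)
            y = g_new - g                       # y_k = g_{k+1} - g_k
            denom = d @ y
            # Hestenes-Stiefel: beta_k = g_{k+1}^T y_k / (d_k^T y_k)
            beta = (g_new @ y) / denom if abs(denom) > 1e-12 else 0.0
            d = -g_new + beta * d               # d_{k+1} = -g_{k+1} + beta_k d_k
            x, g = x_new, g_new
        return x

    # Example: minimize the Rosenbrock function from a standard start point.
    f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
    grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                               200 * (x[1] - x[0]**2)])
    print(hs_conjugate_gradient(f, grad, [-1.2, 1.0]))  # converges toward (1, 1)

The proposed methods in the paper modify the direction using objective function values; the sketch above reproduces only the conventional baseline.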

References

  1. Bongartz, I., Conn, A.R., Gould, N.I.M., Toint, P.L.: CUTE: constrained and unconstrained testing environments. ACM Trans. Math. Softw. 21, 123–160 (1995)

  2. Cragg, E.E., Levy, A.V.: Study on a supermemory gradient method for the minimization of functions. J. Optim. Theory Appl. 4, 191–205 (1969)

  3. Dai, Y.H., Liao, L.Z.: New conjugacy conditions and related nonlinear conjugate gradient methods. Appl. Math. Optim. 43, 87–101 (2001)

  4. Dai, Y.H., Yuan, Y.: A nonlinear conjugate gradient method with a strong global convergence property. SIAM J. Optim. 10, 177–182 (1999)

  5. Dai, Y.H., Yuan, Y.: An efficient hybrid conjugate gradient method for unconstrained optimization. Ann. Oper. Res. 103, 33–47 (2001)

  6. Dolan, E.D., Moré, J.J.: Benchmarking optimization software with performance profiles. Math. Program. 91, 201–213 (2002)

  7. Fletcher, R., Reeves, C.M.: Function minimization by conjugate gradients. Comput. J. 7, 149–154 (1964)

  8. Ford, J.A., Narushima, Y., Yabe, H.: Multi-step nonlinear conjugate gradient methods for unconstrained minimization. Comput. Optim. Appl. 40, 191–216 (2008)

  9. Gilbert, J.C., Nocedal, J.: Global convergence properties of conjugate gradient methods for optimization. SIAM J. Optim. 2, 21–42 (1992)

  10. Gould, N.I.M., Orban, D., Toint, P.L.: CUTEr web site. http://cuter.rl.ac.uk/cuter-www/index.html

  11. Hager, W.W., Zhang, H.: A new conjugate gradient method with guaranteed descent and an efficient line search. SIAM J. Optim. 16, 170–192 (2005)

  12. Hager, W.W., Zhang, H.: A survey of nonlinear conjugate gradient methods. Pacific J. Optim. 2, 35–58 (2006)

  13. Hager, W.W., Zhang, H.: Algorithm 851: CG_DESCENT, a conjugate gradient method with guaranteed descent. ACM Trans. Math. Softw. 32, 113–137 (2006)

  14. Hager, W.W., Zhang, H.: CG_DESCENT Version 1.4 User's Guide, University of Florida, November 2005. http://www.math.ufl.edu/~hager/

  15. Hestenes, M.R., Stiefel, E.: Methods of conjugate gradients for solving linear systems. J. Res. Nat. Bur. Stand. 49, 409–436 (1952)

  16. Huang, H.X., Liang, Z.A., Pardalos, P.M.: Flow search approach and new bounds for the m-step linear conjugate gradient algorithm. J. Optim. Theory Appl. 120, 53–71 (2004)

  17. Iiduka, H.: Hybrid conjugate gradient method for a convex optimization problem over the fixed-point set of a nonexpansive mapping. J. Optim. Theory Appl. 140, 463–475 (2009)

  18. Iiduka, H., Uchida, M.: Fixed point optimization algorithms for network bandwidth allocation problems with compoundable constraints. IEEE Commun. Lett. (to appear)

  19. Iiduka, H., Yamada, I.: A use of conjugate gradient direction for the convex optimization problem over the fixed point set of a nonexpansive mapping. SIAM J. Optim. 19, 1881–1893 (2009)

  20. Liu, G., Nocedal, J., Waltz, R.: CG+ web site (1998). http://users.eecs.northwestern.edu/~nocedal/CG+.html

  21. Miele, A., Cantrell, J.W.: Study on a memory gradient method for the minimization of functions. J. Optim. Theory Appl. 3, 459–470 (1969)

  22. Narushima, Y., Yabe, H.: Global convergence of a memory gradient method for unconstrained optimization. Comput. Optim. Appl. 35, 325–346 (2006)

  23. Narushima, Y., Yabe, H., Ford, J.A.: A three-term conjugate gradient method with sufficient descent property for unconstrained optimization. SIAM J. Optim. 21, 212–230 (2011)

  24. Nazareth, J.L.: A conjugate direction algorithm for unconstrained minimization without line searches. J. Optim. Theory Appl. 23, 373–387 (1977)

  25. Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research. Springer, New York (2006)

  26. Pardalos, P.M., Resende, M.G.C.: Handbook of Applied Optimization. Oxford University Press, Oxford (2002)

  27. Polak, E., Ribière, G.: Note sur la convergence de méthodes de directions conjuguées. Rev. Française Informat. Recherche Opérationnelle 3e année 16, 35–43 (1969)

  28. Yabe, H., Sakaiwa, N.: A new nonlinear conjugate gradient method for unconstrained optimization. J. Oper. Res. Soc. Japan 48, 284–296 (2005)

  29. Yabe, H., Takano, M.: Global convergence properties of nonlinear conjugate gradient methods with modified secant condition. Comput. Optim. Appl. 28, 203–225 (2004)

  30. Yuan, G.: Modified nonlinear conjugate gradient methods with sufficient descent property for large-scale optimization problems. Optim. Lett. 3, 11–21 (2009)

  31. Zhang, L., Zhou, W., Li, D.H.: A descent modified Polak–Ribière–Polyak conjugate gradient method and its global convergence. IMA J. Numer. Anal. 26, 629–640 (2006)

  32. Zhang, L., Zhou, W., Li, D.H.: Global convergence of a modified Fletcher–Reeves conjugate gradient method with Armijo-type line search. Numer. Math. 104, 561–572 (2006)

  33. Zhang, L., Zhou, W., Li, D.H.: Some descent three-term conjugate gradient methods and their global convergence. Optim. Methods Softw. 22, 697–711 (2007)

  34. Zhou, W., Zhang, L.: A nonlinear conjugate gradient method based on the MBFGS secant condition. Optim. Methods Softw. 21, 707–714 (2006)

  35. Zoutendijk, G.: Nonlinear programming, computational methods. In: Abadie, J. (ed.) Integer and Nonlinear Programming, pp. 37–86. North-Holland, Amsterdam (1970)

Author information

Correspondence to Hideaki Iiduka.

Cite this article

Iiduka, H., Narushima, Y. Conjugate gradient methods using value of objective function for unconstrained optimization. Optim Lett 6, 941–955 (2012). https://doi.org/10.1007/s11590-011-0324-0
