Abstract
Conjugate gradient methods are widely used to solve large-scale unconstrained optimization problems. In the conventional methods, the search directions are defined using only the gradient of the objective function. This paper proposes two nonlinear conjugate gradient methods whose search directions additionally exploit information about the objective function value. We prove that both methods converge globally, and we compare them numerically with conventional methods. The results show that, with a slight modification to the direction, one of our methods performs as well as the best conventional method, which employs the Hestenes–Stiefel formula.
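For context, a minimal sketch of a conventional nonlinear conjugate gradient iteration with the Hestenes–Stiefel formula is shown below. This is a generic illustration, not the methods proposed in the paper; the backtracking (Armijo) line search and the restart safeguard are simplifying assumptions, since production codes typically use Wolfe-type line searches.

```python
import numpy as np

def hs_conjugate_gradient(f, grad, x0, tol=1e-6, max_iter=1000):
    """Nonlinear conjugate gradient with the Hestenes-Stiefel beta.

    Illustrative sketch: uses a simple backtracking line search
    satisfying the Armijo condition rather than a Wolfe-type search.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking (Armijo) line search along d.
        alpha, c1 = 1.0, 1e-4
        while f(x + alpha * d) > f(x) + c1 * alpha * g.dot(d):
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g  # gradient difference
        # Hestenes-Stiefel formula: beta = g_new^T y / (d^T y).
        denom = d.dot(y)
        beta = g_new.dot(y) / denom if abs(denom) > 1e-12 else 0.0
        d = -g_new + beta * d
        # Restart with steepest descent if d fails to be a descent direction.
        if g_new.dot(d) >= 0:
            d = -g_new
        x, g = x_new, g_new
    return x
```

On a strictly convex quadratic such as f(x) = x1² + 10·x2², the iteration converges to the minimizer at the origin from any starting point.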
Iiduka, H., Narushima, Y. Conjugate gradient methods using value of objective function for unconstrained optimization. Optim Lett 6, 941–955 (2012). https://doi.org/10.1007/s11590-011-0324-0