Abstract
Based on the insight gained from the three-term conjugate gradient methods suggested by Zhang et al. (Optim Methods Softw 22:697–711, 2007), two nonlinear conjugate gradient methods are proposed by modifying the conjugate gradient methods of Dai and Liao (Appl Math Optim 43:87–101, 2001) and of Zhou and Zhang (Optim Methods Softw 21:707–714, 2006). The methods can be regarded as modified versions of two three-term conjugate gradient methods proposed by Sugiki et al. (J Optim Theory Appl 153:733–757, 2012), in which the search directions are computed using secant equations in a way that achieves the sufficient descent property. One of the methods is shown to be globally convergent for uniformly convex objective functions, while the other is shown to be globally convergent without any convexity assumption on the objective function. Comparative numerical results demonstrating the efficiency of the proposed methods are reported.
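As context for the sufficient descent property mentioned above, the following is a minimal sketch of the generic three-term direction introduced by Zhang et al. (2007), on which the discussed methods build; the helper functions and test vectors are illustrative assumptions, not the paper's exact algorithms.

```python
# Hedged sketch (not the paper's methods): the three-term direction of
# Zhang et al. (2007),
#   d_k = -g_k + beta_k d_{k-1} - theta_k y_{k-1},
# with HS-type beta_k = g_k^T y_{k-1} / (d_{k-1}^T y_{k-1}),
# theta_k = g_k^T d_{k-1} / (d_{k-1}^T y_{k-1}), and
# y_{k-1} = g_k - g_{k-1}.  By construction g_k^T d_k = -||g_k||^2,
# so sufficient descent holds independently of the line search.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def three_term_direction(g, g_prev, d_prev, eps=1e-12):
    """Three-term CG direction; falls back to steepest descent when
    the curvature denominator d_{k-1}^T y_{k-1} is (near) zero."""
    y = [gi - gpi for gi, gpi in zip(g, g_prev)]
    denom = dot(d_prev, y)
    if abs(denom) < eps:
        return [-gi for gi in g]
    beta = dot(g, y) / denom
    theta = dot(g, d_prev) / denom
    return [-gi + beta * di - theta * yi
            for gi, di, yi in zip(g, d_prev, y)]

# Illustrative check on arbitrary vectors: the descent identity
# g^T d = -||g||^2 holds exactly (up to floating-point rounding).
g_prev = [1.0, -2.0, 0.5]
d_prev = [-1.0, 2.0, -0.5]
g = [0.3, 1.1, -0.7]
d = three_term_direction(g, g_prev, d_prev)
print(abs(dot(g, d) + dot(g, g)))  # ~0 up to rounding
```

The identity follows because the beta-term and theta-term contributions to g^T d cancel exactly, leaving -||g||^2; this is what makes the descent property line-search independent.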
References
Andrei, N.: Numerical comparison of conjugate gradient algorithms for unconstrained optimization. Stud. Inform. Control 16(4), 333–352 (2007)
Andrei, N.: Another hybrid conjugate gradient algorithm for unconstrained optimization. Numer. Algorithms 47(2), 143–156 (2008)
Andrei, N.: Open problems in conjugate gradient algorithms for unconstrained optimization. Bull. Malays. Math. Sci. Soc. 34(2), 319–330 (2011)
Babaie-Kafaki, S., Ghanbari, R., Mahdavi-Amiri, N.: Two new conjugate gradient methods based on modified secant equations. J. Comput. Appl. Math. 234(5), 1374–1386 (2010)
Dai, Y.H., Liao, L.Z.: New conjugacy conditions and related nonlinear conjugate gradient methods. Appl. Math. Optim. 43(1), 87–101 (2001)
Dolan, E.D., Moré, J.J.: Benchmarking optimization software with performance profiles. Math. Program. 91(2), 201–213 (2002)
Du, D., Pardalos, P.M., Wu, W.: Mathematical Theory of Optimization. Kluwer Academic Publishers, Dordrecht (2001)
Gould, N.I.M., Orban, D., Toint, Ph.L.: CUTEr: a constrained and unconstrained testing environment, revisited. ACM Trans. Math. Softw. 29(4), 373–394 (2003)
Hager, W.W., Zhang, H.: Algorithm 851: CG\_Descent, a conjugate gradient method with guaranteed descent. ACM Trans. Math. Softw. 32(1), 113–137 (2006)
Hager, W.W., Zhang, H.: A survey of nonlinear conjugate gradient methods. Pac. J. Optim. 2(1), 35–58 (2006)
Hestenes, M.R., Stiefel, E.: Methods of conjugate gradients for solving linear systems. J. Res. Natl. Bur. Stand. 49(6), 409–436 (1952)
Li, D.H., Fukushima, M.: A modified BFGS method and its global convergence in nonconvex minimization. J. Comput. Appl. Math. 129(1–2), 15–35 (2001)
Sugiki, K., Narushima, Y., Yabe, H.: Globally convergent three-term conjugate gradient methods that use secant conditions and generate descent search directions for unconstrained optimization. J. Optim. Theory Appl. 153(3), 733–757 (2012)
Sun, W., Yuan, Y.X.: Optimization Theory and Methods: Nonlinear Programming. Springer, New York (2006)
Zhang, L., Zhou, W., Li, D.H.: Some descent three-term conjugate gradient methods and their global convergence. Optim. Methods Softw. 22(4), 697–711 (2007)
Zhou, W., Zhang, L.: A nonlinear conjugate gradient method based on the MBFGS secant condition. Optim. Methods Softw. 21(5), 707–714 (2006)
Acknowledgments
This research was supported by the Research Councils of Semnan University and Ferdowsi University of Mashhad. The authors are grateful to Professor William W. Hager for providing the CG_Descent code. They also thank the anonymous referees for their valuable suggestions, which helped to improve the quality of this work.
Cite this article
Babaie-Kafaki, S., Ghanbari, R. Two modified three-term conjugate gradient methods with sufficient descent property. Optim Lett 8, 2285–2297 (2014). https://doi.org/10.1007/s11590-014-0736-8