Abstract
Based on the memoryless BFGS quasi-Newton method, a family of three-term nonlinear conjugate gradient methods is proposed. For any line search, the directions generated by the new methods satisfy the sufficient descent condition. Using some efficient techniques, global convergence results are established when the line search fulfills the Wolfe or the Armijo conditions. Moreover, the r-linear convergence rate of the methods is analyzed as well. Numerical comparisons show that the proposed methods are efficient for the unconstrained optimization problems in the CUTEr library.
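The paper's exact update formulas are not reproduced in this abstract. As a rough, hedged illustration of the general idea, the sketch below implements a generic three-term conjugate gradient iteration of the Zhang–Zhou–Li type, which shares the key property claimed above: the search direction satisfies d_kᵀg_k = −‖g_k‖² for any line search. The function names (`three_term_cg`, `quad`, `quad_grad`), the Armijo parameters, and the test problem are all illustrative assumptions, not the paper's method.

```python
# Generic three-term conjugate gradient sketch (NOT the paper's exact family;
# this is a Zhang-Zhou-Li-type direction, chosen because it also guarantees
# d_k^T g_k = -||g_k||^2 for ANY line search).

def three_term_cg(f, grad, x0, max_iter=5000, tol=1e-8):
    """Minimize f from x0 with the three-term direction
    d_{k+1} = -g_{k+1} + beta_k d_k - theta_k y_k
    and an Armijo backtracking line search (c1 = 1e-4, step halving)."""
    x = list(x0)
    g = grad(x)
    d = [-gi for gi in g]                        # first step: steepest descent
    for _ in range(max_iter):
        if sum(gi * gi for gi in g) ** 0.5 < tol:
            break
        # Armijo backtracking line search starting from a unit step
        fx, t = f(x), 1.0
        gd = sum(gi * di for gi, di in zip(g, d))  # = -||g||^2 by construction
        while f([xi + t * di for xi, di in zip(x, d)]) > fx + 1e-4 * t * gd:
            t *= 0.5
            if t < 1e-20:
                break
        x_new = [xi + t * di for xi, di in zip(x, d)]
        g_new = grad(x_new)
        y = [gn - gi for gn, gi in zip(g_new, g)]   # y_k = g_{k+1} - g_k
        dy = sum(di * yi for di, yi in zip(d, y))   # d_k^T y_k
        if abs(dy) > 1e-16:
            beta = sum(gn * yi for gn, yi in zip(g_new, y)) / dy
            theta = sum(gn * di for gn, di in zip(g_new, d)) / dy
            # The theta*y term cancels the beta*d term in g^T d, so
            # g_{k+1}^T d_{k+1} = -||g_{k+1}||^2 holds exactly.
            d = [-gn + beta * di - theta * yi
                 for gn, di, yi in zip(g_new, d, y)]
        else:
            d = [-gn for gn in g_new]               # safeguard: restart
        x, g = x_new, g_new
    return x

# Smoke test on a mildly ill-conditioned convex quadratic (minimum at (1, 2))
def quad(x):
    return (x[0] - 1.0) ** 2 + 10.0 * (x[1] - 2.0) ** 2

def quad_grad(x):
    return [2.0 * (x[0] - 1.0), 20.0 * (x[1] - 2.0)]

xstar = three_term_cg(quad, quad_grad, [0.0, 0.0])
print("f at solution:", quad(xstar))
```

Note the design choice: because gᵀd = −‖g‖² exactly, the Armijo backtracking loop always terminates, mirroring the abstract's claim that sufficient descent holds independently of the line search used.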
Acknowledgements
This work was supported by the Chinese NSF Grant (No. 11401242) and the SFED Grant (No. 14B139) of Hunan Province.
Cite this article
Li, M. A family of three-term nonlinear conjugate gradient methods close to the memoryless BFGS method. Optim Lett 12, 1911–1927 (2018). https://doi.org/10.1007/s11590-017-1205-y