Abstract
The spectral conjugate gradient method is an effective iterative method for solving large-scale unconstrained optimization problems. In this paper, the spectral parameter is derived from the strong Wolfe line search, and two approaches for choosing the conjugate parameter are given, which yields two classes of spectral conjugate gradient methods. Under the usual assumptions, the proposed methods are proved to possess the sufficient descent property and global convergence. Several specific existing conjugate parameters are used to test the validity of the two classes of methods, and the best method from each class is compared with other efficient conjugate gradient methods. Large-scale numerical results are reported, which show that the proposed methods are promising.
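To make the framework concrete, the sketch below runs a spectral conjugate gradient iteration with direction d_k = -θ_k g_k + β_k d_{k-1}. It is a minimal illustration, not the paper's method: β_k is taken to be the classical Fletcher–Reeves parameter, θ_k is an illustrative choice that enforces the sufficient descent condition g_kᵀd_k ≤ -‖g_k‖², and the objective is restricted to a quadratic so that the exact line-search step (which satisfies the strong Wolfe conditions on quadratics) can be computed in closed form.

```python
import numpy as np

def spectral_cg_quadratic(A, b, x0, tol=1e-8, max_iter=200):
    """Spectral CG sketch on f(x) = 0.5 x'Ax - b'x with SPD A.

    Direction: d_k = -theta_k * g_k + beta_k * d_{k-1}, where beta_k is
    the Fletcher-Reeves parameter and theta_k is chosen so that d_k
    satisfies g_k' d_k <= -||g_k||^2 (an illustrative spectral choice,
    not the formula derived in the paper).
    """
    x = np.asarray(x0, dtype=float)
    g = A @ x - b          # gradient of the quadratic
    d = -g                 # initial direction: steepest descent
    for _ in range(max_iter):
        gn2 = g @ g
        if np.sqrt(gn2) < tol:
            break
        # Exact line search; on a quadratic this step length also
        # satisfies the strong Wolfe conditions.
        alpha = -(g @ d) / (d @ A @ d)
        x = x + alpha * d
        g_new = A @ x - b
        beta = (g_new @ g_new) / gn2                    # Fletcher-Reeves
        # Spectral parameter enforcing sufficient descent:
        # g_new' d = -||g_new||^2 - beta * max(0, g_new'd_prev)
        #            + beta * g_new'd_prev  <=  -||g_new||^2.
        theta = 1.0 + beta * max(0.0, g_new @ d) / (g_new @ g_new)
        d = -theta * g_new + beta * d
        g = g_new
    return x
```

With an exact line search the orthogonality g_{k+1}ᵀd_k = 0 makes θ_k collapse to 1, recovering standard Fletcher–Reeves CG; under an inexact strong Wolfe search, θ_k deviates from 1 exactly enough to preserve sufficient descent.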
This work is supported in part by the National Natural Science Foundation of China (No. 12171106), and in part by the Natural Science Foundation of Guangxi Province (Nos. 2020GXNSFDA238017, 2018GXNSFFA281007), and in part by Research Project of Guangxi University for Nationalities (No. 2018KJQD02).
Cite this article
Jian, J., Liu, P., Jiang, X. et al. Two classes of spectral conjugate gradient methods for unconstrained optimizations. J. Appl. Math. Comput. 68, 4435–4456 (2022). https://doi.org/10.1007/s12190-022-01713-2
Keywords
- Unconstrained optimization
- Spectral conjugate gradient method
- Strong Wolfe line search
- Global convergence