
Two classes of spectral conjugate gradient methods for unconstrained optimizations

  • Original Research
Journal of Applied Mathematics and Computing

Abstract

The spectral conjugate gradient method is an effective iterative method for solving large-scale unconstrained optimization problems. In this paper, the strong Wolfe line search is used to determine the spectral parameter, and two approaches are given for choosing the conjugate parameter; two classes of spectral conjugate gradient methods are thereby established. Under standard assumptions, the proposed methods are proved to possess the sufficient descent property and global convergence. Several specific existing conjugate parameters are used to test the validity of the two classes of methods, and the best method from each class is compared with other efficient conjugate gradient methods. Large-scale numerical results are reported, which show that the proposed methods are promising.
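The general scheme the abstract describes, a search direction combining a spectral parameter with a conjugate parameter, paired with a strong Wolfe line search, can be sketched as follows. This is a minimal illustration only: the spectral parameter here uses the Barzilai–Borwein-type choice of Birgin and Martínez and the Fletcher–Reeves conjugate parameter, not the authors' specific formulas (which are derived from the strong Wolfe conditions in the paper itself), and the bisection-style line search is a standard textbook scheme.

```python
import numpy as np

def strong_wolfe(f, grad, x, d, c1=1e-4, c2=0.1, max_iter=60):
    """Bisection-style search for a step satisfying the strong Wolfe
    conditions (a generic scheme, not the paper's procedure)."""
    phi0, dphi0 = f(x), grad(x) @ d
    lo, hi, a = 0.0, np.inf, 1.0
    for _ in range(max_iter):
        phi, dphi = f(x + a * d), grad(x + a * d) @ d
        if phi > phi0 + c1 * a * dphi0:   # sufficient-decrease condition fails
            hi = a
        elif abs(dphi) > -c2 * dphi0:     # strong curvature condition fails
            if dphi < 0:
                lo = a
            else:
                hi = a
        else:
            return a
        a = 0.5 * (lo + hi) if np.isfinite(hi) else 2.0 * a
    return a

def spectral_cg(f, grad, x0, tol=1e-8, max_iter=500):
    """Generic spectral CG iteration: d_k = -theta_k * g_k + beta_k * d_{k-1}."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        a = strong_wolfe(f, grad, x, d)
        x_new = x + a * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        # Spectral parameter: Barzilai-Borwein choice (illustrative stand-in).
        theta = (s @ s) / (s @ y) if s @ y > 0 else 1.0
        # Conjugate parameter: Fletcher-Reeves choice (illustrative stand-in).
        beta = (g_new @ g_new) / (g @ g)
        d = -theta * g_new + beta * d
        if g_new @ d >= 0:                # safeguard: restart with steepest descent
            d = -g_new
        x, g = x_new, g_new
    return x, g

if __name__ == "__main__":
    # Sanity check on a convex quadratic f(x) = 0.5 * x' A x.
    A = np.diag([1.0, 2.0, 3.0, 4.0, 5.0])
    x, g = spectral_cg(lambda v: 0.5 * v @ A @ v, lambda v: A @ v, np.ones(5))
    print(np.linalg.norm(g) < 1e-6)
```

The safeguard restart keeps the iteration well defined even when the illustrative parameter choices fail to yield a descent direction; the paper's methods instead guarantee sufficient descent by construction.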


[Figures 1–12 appear in the full article.]



Author information

Correspondence to Xianzhen Jiang.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

This work is supported in part by the National Natural Science Foundation of China (No. 12171106), and in part by the Natural Science Foundation of Guangxi Province (Nos. 2020GXNSFDA238017, 2018GXNSFFA281007), and in part by Research Project of Guangxi University for Nationalities (No. 2018KJQD02).


Cite this article

Jian, J., Liu, P., Jiang, X. et al. Two classes of spectral conjugate gradient methods for unconstrained optimizations. J. Appl. Math. Comput. 68, 4435–4456 (2022). https://doi.org/10.1007/s12190-022-01713-2

