
A Class of Accelerated Subspace Minimization Conjugate Gradient Methods

Journal of Optimization Theory and Applications

Abstract

The subspace minimization conjugate gradient methods based on the Barzilai–Borwein method (SMCG_BB) and on a regularization model (SMCG_PR), proposed by Liu and Liu (J. Optim. Theory Appl. 180(3):879–906, 2019) and Zhao et al. (Numer. Algorithms, 2020. https://doi.org/10.1007/s11075-020-01017-1), respectively, are efficient methods for unconstrained optimization. In this paper, two accelerated subspace minimization conjugate gradient methods are proposed for unconstrained optimization. Motivated by the subspace minimization conjugate gradient methods and the finite termination of linear conjugate gradient methods, we derive an acceleration parameter from a quadratic interpolation function to improve the stepsize, so that the modified stepsize may be closer to the stepsize obtained by an exact line search. Moreover, several acceleration criteria are designed to enhance the efficiency of the algorithm. Under standard conditions, the global convergence of the proposed methods is guaranteed. Numerical results show that the proposed accelerated methods are superior to the two excellent subspace minimization conjugate gradient methods SMCG_BB and SMCG_PR.
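To make the quadratic-interpolation idea concrete, the sketch below rescales a trial stepsize by the minimizer of a one-dimensional quadratic interpolant fitted to the directional derivatives at the current and trial points. This is a minimal illustration in the spirit of Andrei's acceleration scheme [7], not a reproduction of the authors' method: the function name accelerated_step, its interface, and the acceptance test curv > eps are assumptions made here for illustration only.

```python
import numpy as np

def accelerated_step(grad, x, d, alpha, eps=1e-12):
    """Rescale a trial stepsize via a quadratic interpolant (illustrative sketch).

    Model phi(xi) = f(x + xi * alpha * d) by the quadratic matching phi'(0)
    and phi'(1), then step to its minimizer. Hypothetical interface; the
    paper's acceleration parameter and criteria differ in detail.
    """
    g = grad(x)                    # gradient at the current iterate
    z = x + alpha * d              # trial point produced by the line search
    gz = grad(z)                   # gradient at the trial point
    phi0p = alpha * np.dot(g, d)   # phi'(0): negative along a descent direction
    phi1p = alpha * np.dot(gz, d)  # phi'(1)
    curv = phi1p - phi0p           # curvature of the interpolant; > 0 if convex
    if curv > eps:
        xi = -phi0p / curv         # minimizer of the quadratic interpolant
        return x + xi * alpha * d  # accelerated step
    return z                       # model not convex: keep the unaccelerated step
```

When the curvature estimate is not positive, the quadratic model has no minimizer along the search ray, so the sketch falls back to the unaccelerated trial point; a practical implementation would also guard against very small curvatures, which make the rescaling factor blow up.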


References

  1. Andrei, N.: A new three-term conjugate gradient algorithm for unconstrained optimization. Numer. Algorithms 68(2), 305–321 (2015)

  2. Andrei, N.: An accelerated subspace minimization three-term conjugate gradient algorithm for unconstrained optimization. Numer. Algorithms 65, 859–874 (2014)

  3. Andrei, N.: An unconstrained optimization test functions collection. Adv. Model. Optim. 10(1), 147–161 (2008)

  4. Andrei, N.: Nonlinear Conjugate Gradient Methods for Unconstrained Optimization. Springer, Berlin (2020)

  5. Andrei, N.: Open problems in nonlinear conjugate gradient algorithms for unconstrained optimization. Bull. Malays. Math. Sci. Soc. 34(2), 319–330 (2011)

  6. Andrei, N.: Accelerated conjugate gradient algorithm with modified secant condition for unconstrained optimization. Stud. Inform. Control 18(3), 211–232 (2009)

  7. Andrei, N.: Acceleration of conjugate gradient algorithms for unconstrained optimization. Appl. Math. Comput. 213(2), 361–369 (2009)

  8. Andrei, N.: An acceleration of gradient descent algorithm with backtracking for unconstrained optimization. Numer. Algorithms 42(1), 63–73 (2006)

  9. Andrei, N.: Diagonal approximation of the Hessian by finite differences for unconstrained optimization. J. Optim. Theory Appl. 185(3), 859–879 (2020)

  10. Barzilai, J., Borwein, J.M.: Two-point step size gradient methods. IMA J. Numer. Anal. 8, 141–148 (1988)

  11. Dai, Y.H., Kou, C.X.: A Barzilai–Borwein conjugate gradient method. Sci. China Math. 59(8), 1511–1524 (2016)

  12. Dai, Y.H., Kou, C.X.: A nonlinear conjugate gradient algorithm with an optimal property and an improved Wolfe line search. SIAM J. Optim. 23(1), 296–320 (2013)

  13. Dai, Y.H., Yuan, J.Y., Yuan, Y.X.: Modified two-point stepsize gradient methods for unconstrained optimization problems. Comput. Optim. Appl. 22(1), 103–109 (2002)

  14. Dai, Y.H., Yuan, Y.: A nonlinear conjugate gradient method with a strong global convergence property. SIAM J. Optim. 10(1), 177–182 (1999)

  15. Dai, Y.H.: Nonlinear conjugate gradient methods. Wiley Encycl. Oper. Res. Manag. Sci. (2011). https://doi.org/10.1002/9780470400531.eorms0183

  16. Fletcher, R., Reeves, C.M.: Function minimization by conjugate gradients. Comput. J. 7, 149–154 (1964)

  17. Hager, W.W., Zhang, H.: A new conjugate gradient method with guaranteed descent and an efficient line search. SIAM J. Optim. 16(1), 170–192 (2005)

  18. Hager, W.W., Zhang, H.: A survey of nonlinear conjugate gradient methods. Pac. J. Optim. 2(1), 35–58 (2006)

  19. Hager, W.W., Zhang, H.: Algorithm 851: CG_DESCENT, a conjugate gradient method with guaranteed descent. ACM Trans. Math. Softw. 32(1), 113–137 (2006)

  20. Hestenes, M.R., Stiefel, E.: Methods of conjugate gradients for solving linear systems. J. Res. Natl. Bur. Stand. 49, 409–436 (1952)

  21. Li, M., Liu, H.W., Liu, Z.X.: A new subspace minimization conjugate gradient method with nonmonotone line search for unconstrained optimization. Numer. Algorithms 79, 195–219 (2018)

  22. Li, Y.F., Liu, Z.X., Liu, H.W.: A subspace minimization conjugate gradient method based on conic model for unconstrained optimization. Comput. Appl. Math. 38(1), 16 (2019)

  23. Liu, H.W., Liu, Z.X.: An efficient Barzilai–Borwein conjugate gradient method for unconstrained optimization. J. Optim. Theory Appl. 180(3), 879–906 (2019)

  24. Liu, Z.X., Liu, H.W.: An efficient gradient method with approximate optimal stepsize for large-scale unconstrained optimization. Numer. Algorithms 78(1), 21–39 (2018)

  25. Nocedal, J., Wright, S.J.: Numerical Optimization. Springer, New York (1999)

  26. Polak, E., Ribière, G.: Note sur la convergence de méthodes de directions conjuguées. Rev. Française Informat. Rech. Opérationnelle 3(16), 35–43 (1969)

  27. Polyak, B.T.: The conjugate gradient method in extremal problems. USSR Comput. Math. Math. Phys. 9(4), 94–112 (1969)

  28. Sun, W.Y.: On nonquadratic model optimization methods. Asia Pac. J. Oper. Res. 13, 43–63 (1996)

  29. Wang, T., Liu, Z.X., Liu, H.W.: A new subspace minimization conjugate gradient method based on tensor model for unconstrained optimization. Int. J. Comput. Math. 96(10), 1924–1942 (2019)

  30. Yang, Y.T., Chen, Y.T., Lu, Y.L.: A subspace conjugate gradient algorithm for large-scale unconstrained optimization. Numer. Algorithms 76, 813–828 (2017)

  31. Yuan, Y.X., Stoer, J.: A subspace study on conjugate gradient algorithms. Z. Angew. Math. Mech. 75(1), 69–77 (1995)

  32. Yuan, Y.X., Sun, W.Y.: Optimization Theory and Methods. Science Press, Beijing (1997)

  33. Yuan, Y.X.: A modified BFGS algorithm for unconstrained optimization. IMA J. Numer. Anal. 11(3), 325–332 (1991)

  34. Yuan, Y.X.: A review on subspace methods for nonlinear optimization. In: Proceedings of the International Congress of Mathematicians, Seoul, Korea, pp. 807–827 (2014)

  35. Zhao, T., Liu, H.W., Liu, Z.X.: New subspace minimization conjugate gradient methods based on regularization model for unconstrained optimization. Numer. Algorithms (2020). https://doi.org/10.1007/s11075-020-01017-1


Acknowledgements

The authors would like to thank the editor and the anonymous referees for their valuable suggestions and comments which have greatly improved the presentation of this paper. This research is supported by the National Natural Science Foundation of China (No. 11901561) and Guangxi Natural Science Foundation (No. 2018GXNSFBA281180).

Author information

Correspondence to Hongwei Liu.

Additional information

Communicated by Nobuo Yamashita.


About this article


Cite this article

Sun, W., Liu, H. & Liu, Z. A Class of Accelerated Subspace Minimization Conjugate Gradient Methods. J Optim Theory Appl 190, 811–840 (2021). https://doi.org/10.1007/s10957-021-01897-w

