Several accelerated subspace minimization conjugate gradient methods based on regularization model and convergence rate analysis for nonconvex problems

  • Original Paper
  • Published in: Numerical Algorithms

Abstract

In this paper, four accelerated subspace minimization conjugate gradient methods based on p-regularization models (p = 2 or p = 3) with different matrices are proposed. The search directions are generated by minimizing a quadratic approximation and different regularization models of the objective function, and the new directions satisfy the sufficient descent condition. In addition, an acceleration technique is used to improve the stepsize. With a modified nonmonotone line search, we establish the global convergence of the proposed methods under mild assumptions. Using the Kurdyka-Łojasiewicz property, we analyze the convergence rates of the function value sequence and the iterate sequence without assuming convexity of the objective function. Numerical experiments show that the proposed methods are very effective; in particular, on CUTEr test problems of dimension at least 50, the best of the new methods is competitive with the limited memory conjugate gradient software package CG_DESCENT (6.8) developed by Hager and Zhang (SIAM J. Optim. 23:2150–2168, 2013) [24].
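
To make the direction-finding step concrete, the following is a minimal Python sketch of a generic subspace minimization CG direction obtained from a cubic (p = 3) regularization model restricted to span{-g_k, d_{k-1}}. It is written under simplifying assumptions rather than from the paper itself: a Barzilai-Borwein scalar stands in for the paper's different matrices, a general-purpose inner solver handles the two-dimensional subproblem, and the acceleration technique and modified nonmonotone line search are omitted. The function name subspace_regularized_direction and the parameter sigma are hypothetical.

```python
# A minimal sketch, NOT the authors' algorithm: one subspace-minimization CG
# direction from a cubic (p = 3) regularization model over span{-g, d_prev}.
import numpy as np
from scipy.optimize import minimize


def subspace_regularized_direction(g, d_prev, s_prev, y_prev, sigma=1.0):
    """Return d = u*(-g) + v*d_prev minimizing the regularized model
    m(d) = g^T d + (tau/2)*||d||^2 + (sigma/3)*||d||^3,
    where tau is a Barzilai-Borwein scalar used here in place of the
    different Hessian approximations considered in the paper."""
    # BB curvature estimate tau ~ y^T s / s^T s, safeguarded away from zero.
    tau = max(float(y_prev @ s_prev) / float(s_prev @ s_prev), 1e-8)

    def model(z):
        d = -z[0] * g + z[1] * d_prev
        nd = np.linalg.norm(d)
        return float(g @ d) + 0.5 * tau * nd**2 + (sigma / 3.0) * nd**3

    # The subproblem is two-dimensional whatever the problem dimension,
    # so even a derivative-free inner solver is cheap.
    z = minimize(model, x0=np.array([1.0 / tau, 0.0]), method="Nelder-Mead").x
    d = -z[0] * g + z[1] * d_prev
    if g @ d >= 0:
        d = -g / tau  # fall back to scaled steepest descent if not descent
    return d


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 100
    g = rng.standard_normal(n)           # current gradient
    d_prev = rng.standard_normal(n)      # previous search direction
    s_prev = 0.1 * d_prev                # previous step x_k - x_{k-1}
    y_prev = s_prev + 0.01 * rng.standard_normal(n)  # gradient difference
    d = subspace_regularized_direction(g, d_prev, s_prev, y_prev)
    print("descent check g^T d =", g @ d)  # should be negative
```

In the full methods, this direction would be combined with the acceleration of the stepsize and the modified nonmonotone line search described in the abstract; the sketch is only meant to show that the regularized subproblem stays two-dimensional, and hence cheap, regardless of the problem dimension.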

References

  1. Cristofari, A., Dehghan Niri, T., Lucidi, S.: On global minimizers of quadratic functions with cubic regularization. Optim. Lett. 13, 1269–1283 (2019)

  2. Andrei, N.: An accelerated subspace minimization three-term conjugate gradient algorithm for unconstrained optimization. Numer. Algorithms 65, 859–874 (2014)

  3. Barzilai, J., Borwein, J.M.: Two-point step size gradient methods. IMA J. Numer. Anal. 8, 141–148 (1988)

  4. Bellavia, S., Morini, B., Cartis, C., Gould, N.I.M., Toint, Ph.L.: Convergence of a regularized Euclidean residual algorithm for nonlinear least-squares. SIAM J. Numer. Anal. 48, 1–29 (2010)

  5. Bellavia, S., Morini, B.: Strong local convergence properties of adaptive regularized methods for nonlinear least squares. IMA J. Numer. Anal. 35, 947–968 (2014)

  6. Bianconcini, T., Liuzzi, G., Morini, B., Sciandrone, M.: On the use of iterative methods in cubic regularization for unconstrained optimization. Comput. Optim. Appl. 60, 35–57 (2015)

  7. Birgin, E.G., Martínez, J.M.: A Newton-like method with mixed factorizations and cubic regularization for unconstrained minimization. Comput. Optim. Appl. 73, 707–753 (2019)

  8. Bolte, J., Sabach, S., Teboulle, M.: Proximal alternating linearized minimization for nonconvex and nonsmooth problems. Math. Program. 146(1-2), 459–494 (2014)

  9. Cartis, C., Gould, N.I.M., Toint, Ph.L.: Adaptive cubic regularisation methods for unconstrained optimization. Part I: motivation, convergence and numerical results. Math. Program. 127, 245–295 (2011)

  10. Cartis, C., Gould, N.I.M., Toint, Ph.L.: Adaptive cubic regularisation methods for unconstrained optimization. Part II: worst-case function- and derivative-evaluation complexity. Math. Program. 130, 295–319 (2011)

  11. Dai, Y.H., Yuan, J.Y., Yuan, Y.X.: Modified two-point stepsize gradient methods for unconstrained optimization problems. Comput. Optim. Appl. 22(1), 103–109 (2002)

  12. Dai, Y.H.: Nonlinear conjugate gradient methods. In: Wiley Encyclopedia of Operations Research and Management Science. https://doi.org/10.1002/9780470400531.eorms0183 (2011)

  13. Dai, Y.H., Kou, C.X.: A nonlinear conjugate gradient algorithm with an optimal property and an improved Wolfe line search. SIAM J. Optim. 23(1), 296–320 (2013)

  14. Dai, Y.H., Kou, C.X.: A Barzilai-Borwein conjugate gradient method. Sci. China Math. 59(8), 1511–1524 (2016)

  15. Dai, Y.H., Yuan, Y.X.: A nonlinear conjugate gradient method with a strong global convergence property. SIAM J. Optim. 10(1), 177–182 (1999)

  16. Dolan, E.D., Moré, J.J.: Benchmarking optimization software with performance profiles. Math. Program. 91, 201–213 (2002)

  17. Fletcher, R., Reeves, C.M.: Function minimization by conjugate gradients. Comput. J. 7, 149–154 (1964)

  18. Gould, N.I.M., Orban, D., Toint, Ph.L.: CUTEr and SifDec: a constrained and unconstrained testing environment, revisited. ACM Trans. Math. Softw. 29, 373–394 (2003)

  19. Gould, N.I.M., Porcelli, M., Toint, Ph.L.: Updating the regularization parameter in the adaptive cubic regularization algorithm. Comput. Optim. Appl. 53, 1–22 (2012)

  20. Griewank, A.: The modification of Newton's method for unconstrained optimization by bounding cubic terms. Technical Report NA/12, Department of Applied Mathematics and Theoretical Physics, University of Cambridge (1981)

  21. Hager, W.W., Zhang, H.C.: A new conjugate gradient method with guaranteed descent and an efficient line search. SIAM J. Optim. 16(1), 170–192 (2005)

  22. Hager, W.W., Zhang, H.C.: A survey of nonlinear conjugate gradient methods. Pac. J. Optim. 2(1), 35–58 (2006)

  23. Hager, W.W., Zhang, H.C.: Algorithm 851: CG_DESCENT, a conjugate gradient method with guaranteed descent. ACM Trans. Math. Softw. 32(1), 113–137 (2006)

  24. Hager, W.W., Zhang, H.C.: The limited memory conjugate gradient method. SIAM J. Optim. 23, 2150–2168 (2013)

  25. Hestenes, M.R., Stiefel, E.: Methods of conjugate gradients for solving linear systems. J. Res. Natl. Bur. Stand. 49, 409–436 (1952)

  26. Li, M., Liu, H.W., Liu, Z.X.: A new subspace minimization conjugate gradient method with nonmonotone line search for unconstrained optimization. Numer. Algorithms 79, 195–219 (2018)

  27. Li, Y.F., Liu, Z.X., Liu, H.W.: A subspace minimization conjugate gradient method based on conic model for unconstrained optimization. Comput. Appl. Math. 38(1) (2019)

  28. Liu, Z.X., Liu, H.W.: An efficient gradient method with approximate optimal stepsize for large-scale unconstrained optimization. Numer. Algorithms 78(1), 21–39 (2018)

  29. Liu, H.W., Liu, Z.X.: An efficient Barzilai-Borwein conjugate gradient method for unconstrained optimization. J. Optim. Theory Appl. 180, 879–906 (2019)

  30. Liu, D.C., Nocedal, J.: On the limited memory BFGS method for large scale optimization. Math. Program. 45, 503–528 (1989)

  31. Lu, H., Freund, R.M., Nesterov, Y.: Relatively smooth convex optimization by first-order methods, and applications. SIAM J. Optim. 28, 333–354 (2018)

  32. Nocedal, J., Wright, S.J.: Numerical Optimization. Springer, New York (1999)

  33. Polak, E., Ribière, G.: Note sur la convergence de méthodes de directions conjuguées. Rev. Française Informat. Rech. Opérationnelle 3(16), 35–43 (1969)

  34. Polyak, B.T.: The conjugate gradient method in extremal problems. USSR Comput. Math. Math. Phys. 9(4), 94–112 (1969)

  35. Sun, W.M., Liu, H.W., Liu, Z.X.: A class of accelerated subspace minimization conjugate gradient methods. J. Optim. Theory Appl. 190(3), 811–840 (2021)

  36. Sun, W.Y.: On nonquadratic model optimization methods. Asia Pac. J. Oper. Res. 13, 43–63 (1996)

  37. Wang, T., Liu, Z.X., Liu, H.W.: A new subspace minimization conjugate gradient method based on tensor model for unconstrained optimization. Int. J. Comput. Math. 96(10), 1924–1942 (2019)

  38. Yang, Y.T., Chen, Y.T., Lu, Y.L.: A subspace conjugate gradient algorithm for large-scale unconstrained optimization. Numer. Algorithms 76, 813–828 (2017)

  39. Yuan, Y.X.: A modified BFGS algorithm for unconstrained optimization. IMA J. Numer. Anal. 11(3), 325–332 (1991)

  40. Yuan, Y.X., Stoer, J.: A subspace study on conjugate gradient algorithms. Z. Angew. Math. Mech. 75(1), 69–77 (1995)

  41. Yuan, Y.X., Sun, W.Y.: Optimization Theory and Methods. Science Press, Beijing (1997)

  42. Zhang, H.C., Hager, W.W.: A nonmonotone line search technique and its application to unconstrained optimization. SIAM J. Optim. 14(4), 1043–1056 (2004)

  43. Zhao, T., Liu, H.W., Liu, Z.X.: New subspace minimization conjugate gradient methods based on regularization model for unconstrained optimization. Numer. Algorithms 87, 1501–1534 (2021)

Funding

This research was supported by the National Natural Science Foundation of China (No. 11901561), Guangxi Natural Science Foundation (No. 2018GXNSFBA281180) and Natural Science Basic Research Program of Shaanxi (No. 2021JM-396).

Author information

Corresponding author

Correspondence to Hongwei Liu.

Ethics declarations

Conflict of interest

The authors declare no competing interests.

Additional information

Data availability

Data sharing not applicable to this article as no datasets were generated or analyzed during the current study.

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Sun, W., Liu, H. & Liu, Z. Several accelerated subspace minimization conjugate gradient methods based on regularization model and convergence rate analysis for nonconvex problems. Numer Algor 91, 1677–1719 (2022). https://doi.org/10.1007/s11075-022-01319-6
