Abstract
In this paper, four accelerated subspace minimization conjugate gradient methods based on p-regularization models (p = 2 or 3) with different matrices are proposed. The search directions are generated by minimizing either a quadratic approximation or one of several regularization models of the objective function over a subspace, and the new directions satisfy the sufficient descent condition. In addition, an acceleration technique is used to improve the stepsize. Combined with a modified nonmonotone line search, the global convergence of the proposed methods is established under mild assumptions. By using the Kurdyka-Łojasiewicz property, we analyze the convergence rates of the function value sequence and the iterate sequence without assuming convexity of the objective function. Numerical experiments show that the proposed methods are very effective; in particular, on CUTEr test problems with at least 50 dimensions, the best of the new methods is competitive with the latest limited memory conjugate gradient software package CG_DESCENT (6.8) developed by Hager and Zhang (SIAM J. Optim. 23:2150–2168, 2013).
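For orientation, the direction-finding subproblem behind such methods can be written schematically. This is a minimal sketch assuming the generic form common in the subspace minimization conjugate gradient literature (e.g., Yuan and Stoer 1995; Zhao et al. 2021), not necessarily the authors' exact model; the matrix B_k, the parameter σ_k, and the two-dimensional subspace Ω_k below are assumptions. At the iterate x_k with gradient g_k and previous step s_{k-1} = x_k - x_{k-1}, the search direction solves

\[
  \min_{d \in \Omega_k} \; g_k^{\top} d \;+\; \tfrac{1}{2}\, d^{\top} B_k d \;+\; \tfrac{\sigma_k}{p}\, \lVert d \rVert^{p},
  \qquad \Omega_k = \operatorname{span}\{ g_k,\, s_{k-1} \}, \qquad p \in \{2, 3\},
\]

where B_k approximates the Hessian (the "different matrices" of the abstract) and σ_k > 0 is the regularization parameter; p = 2 gives a quadratic and p = 3 a cubic regularization term. Substituting d = u g_k + v s_{k-1} reduces the subproblem to two scalar variables (u, v), which keeps the per-iteration cost comparable to a classical conjugate gradient step.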
References
Cristofari, A., Dehghan Niri, T., Lucidi, S.: On global minimizers of quadratic functions with cubic regularization. Optim. Lett. 13, 1269–1283 (2019)
Andrei, N.: An accelerated subspace minimization three-term conjugate gradient algorithm for unconstrained optimization. Numer. Algorithms 65, 859–874 (2014)
Barzilai, J., Borwein, J.M.: Two-point step size gradient methods. IMA J. Numer. Anal. 8, 141–148 (1988)
Bellavia, S., Morini, B., Cartis, C., Gould, N.I.M., Toint, Ph.L.: Convergence of a regularized Euclidean residual algorithm for nonlinear least-squares. SIAM J. Numer. Anal. 48, 1–29 (2010)
Bellavia, S., Morini, B.: Strong local convergence properties of adaptive regularized methods for nonlinear least squares. IMA J. Numer. Anal. 35, 947–968 (2014)
Bianconcini, T., Liuzzi, G., Morini, B., Sciandrone, M.: On the use of iterative methods in cubic regularization for unconstrained optimization. Comput. Optim. Appl. 60, 35–57 (2015)
Birgin, E.G., Martínez, J.M.: A Newton-like method with mixed factorizations and cubic regularization for unconstrained minimization. Comput. Optim. Appl. 73, 707–753 (2019)
Bolte, J., Sabach, S., Teboulle, M.: Proximal alternating linearized minimization for nonconvex and nonsmooth problems. Math. Program. 146(1-2), 459–494 (2014)
Cartis, C., Gould, N.I.M., Toint, Ph.L.: Adaptive cubic regularisation methods for unconstrained optimization. Part I: motivation, convergence and numerical results. Math. Program. 127, 245–295 (2011)
Cartis, C., Gould, N.I.M., Toint, Ph.L.: Adaptive cubic regularisation methods for unconstrained optimization. Part II: worst-case function- and derivative-evaluation complexity. Math. Program. 130, 295–319 (2011)
Dai, Y.H., Yuan, J.Y., Yuan, Y.X.: Modified two-point stepsize gradient methods for unconstrained optimization problems. Comput. Optim. Appl. 22(1), 103–109 (2002)
Dai, Y.H.: Nonlinear conjugate gradient methods. In: Wiley Encyclopedia of Operations Research and Management Science. https://doi.org/10.1002/9780470400531.eorms0183 (2011)
Dai, Y.H., Kou, C.X.: A nonlinear conjugate gradient algorithm with an optimal property and an improved Wolfe line search. SIAM J. Optim. 23(1), 296–320 (2013)
Dai, Y.H., Kou, C.X.: A Barzilai-Borwein conjugate gradient method. Sci. China Math. 59(8), 1511–1524 (2016)
Dai, Y.H., Yuan, Y.X.: A nonlinear conjugate gradient method with a strong global convergence property. SIAM J. Optim. 10(1), 177–182 (1999)
Dolan, E.D., Moré, J.J.: Benchmarking optimization software with performance profiles. Math. Program. 91, 201–213 (2002)
Fletcher, R., Reeves, C.M.: Function minimization by conjugate gradients. Comput. J. 7, 149–154 (1964)
Gould, N.I.M., Orban, D., Toint, Ph.L.: CUTEr and SifDec: a constrained and unconstrained testing environment, revisited. ACM Trans. Math. Softw. 29, 373–394 (2003)
Gould, N.I.M., Porcelli, M., Toint, Ph.L.: Updating the regularization parameter in the adaptive cubic regularization algorithm. Comput. Optim. Appl. 53, 1–22 (2012)
Griewank, A.: The modification of Newton's method for unconstrained optimization by bounding cubic terms. Technical Report NA/12, Department of Applied Mathematics and Theoretical Physics, University of Cambridge (1981)
Hager, W.W., Zhang, H.C.: A new conjugate gradient method with guaranteed descent and an efficient line search. SIAM J. Optim. 16(1), 170–192 (2005)
Hager, W.W., Zhang, H.C.: A survey of nonlinear conjugate gradient methods. Pac. J. Optim. 2(1), 35–58 (2006)
Hager, W.W., Zhang, H.C.: Algorithm 851: CG_DESCENT, a conjugate gradient method with guaranteed descent. ACM Trans. Math. Softw. 32(1), 113–137 (2006)
Hager, W.W., Zhang, H.C.: The limited memory conjugate gradient method. SIAM J. Optim. 23, 2150–2168 (2013)
Hestenes, M.R., Stiefel, E.: Methods of conjugate gradients for solving linear systems. J. Res. Natl. Bur. Stand. 49, 409–436 (1952)
Li, M., Liu, H.W., Liu, Z.X.: A new subspace minimization conjugate gradient method with nonmonotone line search for unconstrained optimization. Numer. Algorithms 79, 195–219 (2018)
Li, Y.F., Liu, Z.X., Liu, H.W.: A subspace minimization conjugate gradient method based on conic model for unconstrained optimization. Comput. Appl. Math. 38(1) (2019)
Liu, Z.X., Liu, H.W.: An efficient gradient method with approximate optimal stepsize for large-scale unconstrained optimization. Numer. Algorithms 78(1), 21–39 (2018)
Liu, H.W., Liu, Z.X.: An efficient Barzilai-Borwein conjugate gradient method for unconstrained optimization. J. Optim. Theory Appl. 180, 879–906 (2019)
Liu, D.C., Nocedal, J.: On the limited memory BFGS method for large scale optimization. Math. Program. 45, 503–528 (1989)
Lu, H., Freund, R.M., Nesterov, Y.: Relatively smooth convex optimization by first-order methods, and applications. SIAM J. Optim. 28, 333–354 (2018)
Nocedal, J., Wright, S.J.: Numerical Optimization. Springer, New York (1999)
Polak, E., Ribière, G.: Note sur la convergence de méthodes de directions conjuguées. Rev. Française Informat. Rech. Opérationnelle 3(16), 35–43 (1969)
Polyak, B.T.: The conjugate gradient method in extremal problems. USSR Comput. Math. Math. Phys. 9(4), 94–112 (1969)
Sun, W.M., Liu, H.W., Liu, Z.X.: A class of accelerated subspace minimization conjugate gradient methods. J. Optim. Theory Appl. 190(3), 811–840 (2021)
Sun, W.Y.: On nonquadratic model optimization methods. Asia Pac. J. Oper. Res. 13, 43–63 (1996)
Wang, T., Liu, Z.X., Liu, H.W.: A new subspace minimization conjugate gradient method based on tensor model for unconstrained optimization. Int. J. Comput. Math. 96(10), 1924–1942 (2019)
Yang, Y.T., Chen, Y.T., Lu, Y.L.: A subspace conjugate gradient algorithm for large-scale unconstrained optimization. Numer. Algorithms 76, 813–828 (2017)
Yuan, Y.X.: A modified BFGS algorithm for unconstrained optimization. IMA J. Numer. Anal. 11(3), 325–332 (1991)
Yuan, Y.X., Stoer, J.: A subspace study on conjugate gradient algorithms. Z. Angew. Math. Mech. 75(1), 69–77 (1995)
Yuan, Y.X., Sun, W.Y.: Optimization Theory and Methods. Science Press, Beijing (1997)
Zhang, H.C., Hager, W.W.: A nonmonotone line search technique and its application to unconstrained optimization. SIAM J. Optim. 14(4), 1043–1056 (2004)
Zhao, T., Liu, H.W., Liu, Z.X.: New subspace minimization conjugate gradient methods based on regularization model for unconstrained optimization. Numer. Algorithms 87, 1501–1534 (2021)
Funding
This research was supported by the National Natural Science Foundation of China (No. 11901561), the Guangxi Natural Science Foundation (No. 2018GXNSFBA281180), and the Natural Science Basic Research Program of Shaanxi (No. 2021JM-396).
Ethics declarations
Conflict of interest
The authors declare no competing interests.
Data availability
Data sharing not applicable to this article as no datasets were generated or analyzed during the current study.
Cite this article
Sun, W., Liu, H. & Liu, Z. Several accelerated subspace minimization conjugate gradient methods based on regularization model and convergence rate analysis for nonconvex problems. Numer Algor 91, 1677–1719 (2022). https://doi.org/10.1007/s11075-022-01319-6
Keywords
- Conjugate gradient method
- Regularization model
- Subspace minimization
- Kurdyka-Łojasiewicz property
- Convergence