Abstract
The conjugate gradient method is one of the most efficient methods for large-scale unconstrained optimization and has attracted sustained attention from researchers. With the emergence of ever larger problems, subspace techniques have become particularly important. Recently, many scholars have studied subspace minimization conjugate gradient methods, in which the search direction is generally obtained by minimizing a quadratic approximation model over a specific subspace. Since the conic model contains more information than the quadratic model and may perform better when the objective function is far from quadratic, an improved conjugate gradient method is presented in this paper. Specifically, at each iteration, a quadratic or conic model is selected dynamically, and the search direction is obtained by minimizing the selected model over a three-dimensional subspace spanned by the current gradient and the two latest search directions. The search direction is proved to satisfy the sufficient descent condition under given conditions. With a modified nonmonotone Wolfe line search, we establish the global convergence and the R-linear convergence of the proposed method under mild assumptions. Numerical experiments indicate that the proposed method is efficient.
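The quadratic variant of the direction computation described in the abstract can be sketched as follows. This is a minimal illustrative outline, not the authors' algorithm: it minimizes a generic quadratic model g^T d + (1/2) d^T B d over the subspace spanned by the current gradient and the two latest search directions, reducing the problem to a 3-by-3 linear system. The function name `subspace_direction` and the symmetric positive definite Hessian approximation `B` are assumptions introduced for illustration; the paper's method uses its own model parameters and also a conic alternative, which this sketch omits.

```python
import numpy as np

def subspace_direction(g, d1, d2, B):
    """Minimize m(d) = g^T d + 0.5 * d^T B d over span{g, d1, d2}.

    Writing d = V t with V = [g, d1, d2], the reduced problem is
    minimize (V^T g)^T t + 0.5 * t^T (V^T B V) t, a 3-dimensional
    quadratic whose stationarity condition is (V^T B V) t = -V^T g.
    """
    V = np.column_stack([g, d1, d2])       # basis of the 3-D subspace
    A = V.T @ B @ V                        # reduced 3x3 Hessian
    b = V.T @ g                            # reduced gradient
    # lstsq guards against a (near-)rank-deficient basis, e.g. when
    # the previous directions are nearly parallel to the gradient.
    t = np.linalg.lstsq(A, -b, rcond=None)[0]
    return V @ t
```

When B is positive definite and g is nonzero, the reduced system has a positive definite matrix on the subspace, so the resulting direction d satisfies g^T d = -b^T A^{-1} b < 0, i.e. it is a descent direction; this mirrors, in simplified form, the sufficient descent property established in the paper.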
Acknowledgements
We would like to thank the editor and the anonymous referees for their valuable suggestions and comments, which have greatly improved the presentation of this paper. We would also like to thank Professor Dai, Y.H. and Dr. Kou, C.X. for their CGOPT code, and Professors Hager, W.W. and Zhang, H.C. for their CG_DESCENT (5.3) code.
Additional information
Communicated by Eduardo Souza de Cursi.
Cite this article
Sun, W., Li, Y., Wang, T. et al. A new subspace minimization conjugate gradient method based on conic model for large-scale unconstrained optimization. Comp. Appl. Math. 41, 178 (2022). https://doi.org/10.1007/s40314-022-01885-4