
A new subspace minimization conjugate gradient method based on conic model for large-scale unconstrained optimization

Computational and Applied Mathematics

Abstract

The conjugate gradient method is one of the most efficient methods for large-scale unconstrained optimization and has attracted considerable attention from researchers. With the emergence of ever larger problems, subspace techniques have become particularly important. Recently, many scholars have studied subspace minimization conjugate gradient methods, in which the search direction is generally obtained by minimizing an approximate quadratic model over a specific subspace. Considering that the conic model contains more information than the quadratic model and may perform better when the objective function is far from quadratic, this paper presents an improved conjugate gradient method. Specifically, at each iteration, a quadratic or conic model is selected dynamically, and the search direction is obtained by minimizing the selected model over a three-dimensional subspace spanned by the current gradient and the two most recent search directions. The search direction is proved to satisfy the sufficient descent condition under given conditions. With a modified nonmonotone Wolfe line search, we establish the global convergence and R-linear convergence of the proposed method under mild assumptions. Numerical experiments indicate that the proposed method is efficient.
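To make the subspace step concrete, the following is a minimal Python sketch of the quadratic-model case: the model m_k(d) = g_k^T d + (1/2) d^T B_k d is minimized over Span{g_k, d_{k-1}, d_{k-2}} by solving a reduced 3 x 3 linear system. The helper hess_vec (an approximation of the Hessian-vector product B_k v) and the least-squares solve are illustrative assumptions, not the authors' exact construction, which in addition switches dynamically to a conic model.

    import numpy as np

    def subspace_direction(g, d_prev, d_prev2, hess_vec):
        # Basis of the three-dimensional subspace Span{g_k, d_{k-1}, d_{k-2}}.
        S = np.column_stack([g, d_prev, d_prev2])               # n x 3
        # Reduced 3 x 3 Hessian S^T B S and reduced gradient S^T g.
        BS = np.column_stack([hess_vec(S[:, j]) for j in range(3)])
        H = S.T @ BS
        b = S.T @ g
        # Minimizer of the reduced model: solve H t = -b (lstsq guards
        # against a singular or ill-conditioned reduced Hessian).
        t = np.linalg.lstsq(H, -b, rcond=None)[0]
        return S @ t                                            # d_k = S t

    # Toy usage with a hypothetical diagonal Hessian approximation B = 2I:
    rng = np.random.default_rng(0)
    g, d1, d2 = rng.standard_normal((3, 100))
    d = subspace_direction(g, d1, d2, lambda v: 2.0 * v)
    print(g @ d < 0)  # True: d is a descent direction

With a Barzilai-Borwein-type scaling B_k close to (1/alpha_k) I, which is common in this line of work, the reduced system costs essentially nothing to form; this cheapness is what makes subspace minimization attractive at large scale.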




References

  • Andrei N (2008) An unconstrained optimization test functions collection. Adv Model Optim 10(1):147–161

  • Barzilai J, Borwein JM (1988) Two-point step size gradient methods. IMA J Numer Anal 8:141–148

  • Dai YH (2011) Nonlinear conjugate gradient methods. In: Wiley Encyclopedia of Operations Research and Management Science. https://doi.org/10.1002/9780470400531.eorms0183

  • Dai YH, Kou CX (2013) A nonlinear conjugate gradient algorithm with an optimal property and an improved Wolfe line search. SIAM J Optim 23(1):296–320

  • Dai YH, Kou CX (2016) A Barzilai-Borwein conjugate gradient method. Sci China Math 59(8):1511–1524

  • Dai YH, Yuan Y (1999) A nonlinear conjugate gradient method with a strong global convergence property. SIAM J Optim 10(1):177–182

  • Davidon WC (1980) Conic approximations and collinear scalings for optimizers. SIAM J Numer Anal 17(2):268–281

  • Dolan ED, Moré JJ (2002) Benchmarking optimization software with performance profiles. Math Program 91:201–213

  • Fletcher R, Reeves CM (1964) Function minimization by conjugate gradients. Comput J 7:149–154

  • Gould NIM, Orban D, Toint PL (2003) CUTEr and SifDec: a constrained and unconstrained testing environment, revisited. ACM Trans Math Softw 29(4):373–394

  • Gourgeon H, Nocedal J (1985) A conic algorithm for optimization. SIAM J Sci Statist Comput 6:253–267

  • Grandinetti L (1984) Some investigations in a new algorithm for nonlinear optimization based on conic models of the objective function. J Optim Theory Appl 43(1):1–21

  • Hager WW, Zhang H (2005) A new conjugate gradient method with guaranteed descent and an efficient line search. SIAM J Optim 16(1):170–192

  • Hager WW, Zhang H (2006) A survey of nonlinear conjugate gradient methods. Pac J Optim 2(1):35–58

  • Hager WW, Zhang H (2006) Algorithm 851: CG_DESCENT, a conjugate gradient method with guaranteed descent. ACM Trans Math Softw 32(1):113–137

  • Hestenes MR, Stiefel E (1952) Methods of conjugate gradients for solving linear systems. J Res Natl Bur Stand 49:409–436

  • Li YF, Liu ZX, Liu HW (2019) A subspace minimization conjugate gradient method based on conic model for unconstrained optimization. Comput Appl Math 38(1)

  • Li M, Liu HW, Liu ZX (2018) A new subspace minimization conjugate gradient method with nonmonotone line search for unconstrained optimization. Numer Algorithms. https://doi.org/10.1007/s11075-017-0434-6

  • Liu ZX, Liu HW (2018) An efficient gradient method with approximate optimal stepsize for large-scale unconstrained optimization. Numer Algorithms 78(1):21–39

  • Liu ZX, Liu HW (2018) Several efficient gradient methods with approximate optimal stepsizes for large-scale unconstrained optimization. J Comput Appl Math 328:400–413

  • Polak E, Ribière G (1969) Note sur la convergence de méthodes de directions conjuguées. Rev Française Informat Rech Opérationnelle 3(16):35–43

  • Polyak BT (1969) The conjugate gradient method in extremal problems. USSR Comput Math Math Phys 9(4):94–112

  • Schnabel RB (1982) Conic methods for unconstrained minimization and tensor methods for nonlinear equations. Math Program 417–438

  • Sorensen DC (1980) The Q-superlinear convergence of a collinear scaling algorithm for unconstrained optimization. SIAM J Numer Anal 17(1):84–114

  • Sun WY (1996) On nonquadratic model optimization methods. Asia Pac J Oper Res 13:43–63

  • Yuan YX (2014) A review on subspace methods for nonlinear optimization. In: Proceedings of the International Congress of Mathematicians, Korea, pp 807–827

  • Yuan YX (1991) A modified BFGS algorithm for unconstrained optimization. IMA J Numer Anal 11(3):325–332

  • Yuan YX, Stoer J (1995) A subspace study on conjugate gradient algorithms. Z Angew Math Mech 75(1):69–77

  • Yuan YX, Sun WY (1997) Optimization theory and methods. Science Press, Beijing

  • Zhang H, Hager WW (2004) A nonmonotone line search technique and its application to unconstrained optimization. SIAM J Optim 14(4):1043–1056


Acknowledgements

We would like to thank the editor and the anonymous referees for their valuable suggestions and comments, which have greatly improved the presentation of this paper. We would also like to thank Professor Dai, Y.H. and Dr. Kou, C.X. for their CGOPT code, and Professor Hager, W.W. and Professor Zhang, H.C. for their CG_DESCENT (5.3) code.

Author information


Corresponding author

Correspondence to Hongwei Liu.

Additional information

Communicated by Eduardo Souza de Cursi.



Cite this article

Sun, W., Li, Y., Wang, T. et al. A new subspace minimization conjugate gradient method based on conic model for large-scale unconstrained optimization. Comp. Appl. Math. 41, 178 (2022). https://doi.org/10.1007/s40314-022-01885-4

