
A new subspace minimization conjugate gradient method based on modified secant equation for unconstrained optimization

Computational and Applied Mathematics

Abstract

In this paper, a new subspace minimization conjugate gradient method based on a modified secant equation is proposed and analyzed. In a classical subspace minimization conjugate gradient method, the search direction is obtained by minimizing an approximate quadratic model of the objective function over a two-dimensional subspace. The approximate Hessian matrix in this quadratic model is usually required to satisfy the standard secant equation; here we instead consider an approximate Hessian matrix that satisfies a modified secant equation. We give rules that decide, at each iteration, whether the standard or the modified secant equation is used. We prove that the proposed directions satisfy the sufficient descent property under some additional conditions. We also present a modified nonmonotone Wolfe line search and establish the global convergence of the proposed method for general nonlinear functions under mild assumptions. Numerical comparisons with the well-known CG_DESCENT (5.3) (Hager and Zhang in SIAM J Optim 16(1):170–192, 2005) and SMCG_BB (Liu and Liu in J Optim Theory Appl 180(3):879–906, 2019) show that the proposed algorithm is very promising.
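To make the abstract concrete, the following is a minimal sketch of the subspace minimization conjugate gradient framework the paper works in, assembled from the works it cites (Yuan and Stoer 1995; Dai and Kou 2016; Zhang et al. 1999; Zhang and Xu 2001). It is an illustration under stated assumptions, not the paper's exact method; in particular the paper's selection rules and parameter estimates are not reproduced here.

```latex
% A sketch of the two-dimensional subspace model (after Yuan and Stoer 1995;
% Dai and Kou 2016); the paper's exact formulation may differ.
The search direction minimizes a quadratic model over
$\Omega_k = \operatorname{span}\{g_k, s_{k-1}\}$:
\[
  d_k = \mu_k g_k + \nu_k s_{k-1}, \qquad
  (\mu_k, \nu_k) = \operatorname*{arg\,min}_{\mu,\nu}\;
  g_k^{\mathsf T} d + \tfrac{1}{2}\, d^{\mathsf T} B_k d, \quad
  d = \mu g_k + \nu s_{k-1},
\]
where $g_k = \nabla f(x_k)$, $s_{k-1} = x_k - x_{k-1}$, and
$B_k \approx \nabla^2 f(x_k)$. The standard secant equation is
$B_k s_{k-1} = y_{k-1}$ with $y_{k-1} = g_k - g_{k-1}$; the modified secant
equation of Zhang et al. (1999), taking $u_{k-1} = s_{k-1}$ as in
Zhang and Xu (2001), uses instead
\[
  B_k s_{k-1} = \hat{y}_{k-1}, \qquad
  \hat{y}_{k-1} = y_{k-1}
    + \frac{\vartheta_{k-1}}{s_{k-1}^{\mathsf T} s_{k-1}}\, s_{k-1}, \qquad
  \vartheta_{k-1} = 6\,(f_{k-1} - f_k) + 3\,(g_{k-1} + g_k)^{\mathsf T} s_{k-1},
\]
so that $s_{k-1}^{\mathsf T} B_k s_{k-1}$ captures third-order information
about $f$ along $s_{k-1}$.
```

Fixing the quantities above, minimizing the two-dimensional model reduces to a 2×2 linear system in (μ_k, ν_k). The sketch below is hypothetical code: the function name `subspace_cg_direction` and the caller-supplied estimate `rho` of g_k^T B_k g_k are our assumptions, since the paper derives its own estimate for that term.

```python
import numpy as np

def subspace_cg_direction(g, s, yhat, rho, eps=1e-12):
    """Minimize g^T d + 0.5 d^T B d over d = mu*g + nu*s, using the
    (modified) secant equation B s = yhat, so that g^T B s = g^T yhat and
    s^T B s = s^T yhat.  `rho` is a caller-supplied estimate of g^T B g
    (a placeholder here).  Falls back to steepest descent when the 2x2
    model is not safely positive definite."""
    a = g @ g      # ||g||^2
    e = g @ s      # g^T s
    c = g @ yhat   # g^T B s under the secant equation
    d = s @ yhat   # s^T B s under the secant equation
    det = rho * d - c * c          # determinant of the 2x2 model Hessian
    if d <= 0.0 or det <= eps * max(rho * d, 1.0):
        return -g                  # fallback: steepest descent direction
    # Stationarity: [rho c; c d] [mu; nu] = -[a; e], solved by Cramer's rule.
    mu = (c * e - a * d) / det
    nu = (a * c - rho * e) / det
    return mu * g + nu * s
```

In this sketch the secant condition enters only through `yhat`: passing y_{k-1} gives the standard model, while passing ŷ_{k-1} gives the modified one, which mirrors the switching rules described in the abstract.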


References

  • Andrei N (2008) An unconstrained optimization test functions collection. Adv Model Optim 10:147–161

  • Andrei N (2014) An accelerated subspace minimization three-term conjugate gradient algorithm for unconstrained optimization. Numer Algorithms 65(4):859–874

  • Babaie-Kafaki S (2014) An adaptive conjugacy condition and related nonlinear conjugate gradient methods. Int J Comput Methods 11(4):1350092

  • Babaie-Kafaki S, Ghanbari R (2014) The Dai–Liao nonlinear conjugate gradient method with optimal parameter choices. Eur J Oper Res 234(3):625–630

  • Barzilai J, Borwein JM (1988) Two-point step size gradient methods. IMA J Numer Anal 8(1):141–148

  • Bellavia S, Morini B (2006) Subspace trust-region methods for large bound-constrained nonlinear equations. SIAM J Numer Anal 44(4):1535–1555

  • Biglari F, Hassan MA, Leong WJ (2011) New quasi-Newton methods via higher order tensor models. J Comput Appl Math 235(8):2412–2422

  • Dai YH, Kou CX (2013) A nonlinear conjugate gradient algorithm with an optimal property and an improved Wolfe line search. SIAM J Optim 23(1):296–320

  • Dai YH, Kou CX (2016) A Barzilai–Borwein conjugate gradient method. Sci China Math 59(8):1511–1524

  • Dai YH, Liao LZ (2001) New conjugacy conditions and related nonlinear conjugate gradient methods. Appl Math Optim 43(1):87–101

  • Dai YH, Yuan YX (1999) A nonlinear conjugate gradient method with a strong global convergence property. SIAM J Optim 10(1):177–182

  • Dolan ED, Moré JJ (2002) Benchmarking optimization software with performance profiles. Math Program 91(2):201–213

  • Erway JB, Gill PE (2009) A subspace minimization method for the trust-region step. SIAM J Optim 20(3):1439–1461

  • Fatemi M (2016) A new efficient conjugate gradient method for unconstrained optimization. J Comput Appl Math 300:207–216

  • Fletcher R, Reeves CM (1964) Function minimization by conjugate gradients. Comput J 7(2):149–154

  • Gould NIM, Orban D, Toint PhL (2003) CUTEr and SifDec: a constrained and unconstrained testing environment, revisited. ACM Trans Math Softw 29(4):373–394

  • Hager WW, Zhang HC (2005) A new conjugate gradient method with guaranteed descent and an efficient line search. SIAM J Optim 16(1):170–192

  • Hager WW, Zhang HC (2013) The limited memory conjugate gradient method. SIAM J Optim 23(4):2150–2168

  • Hestenes MR, Stiefel EL (1952) Methods of conjugate gradients for solving linear systems. J Res Natl Bur Stand 49(6):409–436

  • Li M, Liu HW, Liu ZX (2018) A new subspace minimization conjugate gradient method with nonmonotone line search for unconstrained optimization. Numer Algorithms 79(1):195–219

  • Liu ZX, Liu HW (2018) An efficient gradient method with approximate optimal stepsize for large-scale unconstrained optimization. Numer Algorithms 78(1):21–39

  • Liu HW, Liu ZX (2019) An efficient Barzilai–Borwein conjugate gradient method for unconstrained optimization. J Optim Theory Appl 180(3):879–906

  • Nocedal J, Wright SJ (1999) Numerical optimization. Springer, New York

  • Polyak BT (1969) The conjugate gradient method in extremal problems. USSR Comput Math Math Phys 9(4):94–112

  • Rivaie M, Mamat M, Abashar A (2015) A new class of nonlinear conjugate gradient coefficients with exact and inexact line searches. Appl Math Comput 268:1152–1163

  • Tarzanagh DA, Peyghami MR (2015) A new regularized limited memory BFGS-type method based on modified secant conditions for unconstrained optimization problems. J Glob Optim 63(4):709–728

  • Wei HJ, Yang WH (2016) A Riemannian subspace limited-memory SR1 trust-region method. Optim Lett 10(8):1705–1723

  • Wei Z, Li G, Qi L (2006) New quasi-Newton methods for unconstrained optimization problems. Appl Math Comput 175(2):1156–1188

  • Yuan YX (1991) A modified BFGS algorithm for unconstrained optimization. IMA J Numer Anal 11(3):325–332

  • Yuan YX (2009) Subspace methods for large scale nonlinear equations and nonlinear least squares. Optim Eng 10(2):207–218

  • Yuan YX (2014) A review on subspace methods for nonlinear optimization. In: Proceedings of the International Congress of Mathematicians, Seoul, pp 807–827

  • Yuan YX, Stoer J (1995) A subspace study on conjugate gradient algorithms. Z Angew Math Mech 75(1):69–77

  • Zhang JZ, Deng NY, Chen LH (1999) New quasi-Newton equation and related methods for unconstrained optimization. J Optim Theory Appl 102(1):147–167

  • Zhang JZ, Xu CX (2001) Properties and numerical performance of quasi-Newton methods with modified quasi-Newton equations. J Comput Appl Math 137(2):269–278


Acknowledgements

We would like to thank Professors W. W. Hager and H. C. Zhang for their CG_DESCENT code. This research is supported by the National Natural Science Foundation of China (No. 11901561), the Guangxi Natural Science Foundation (No. 2018GXNSFBA281180) and the China Postdoctoral Science Foundation (No. 2019M660833).

Author information

Correspondence to Hongwei Liu.

Communicated by Paulo J. S. Silva.


About this article

Cite this article

Diao, X., Liu, H. & Liu, Z. A new subspace minimization conjugate gradient method based on modified secant equation for unconstrained optimization. Comp. Appl. Math. 39, 251 (2020). https://doi.org/10.1007/s40314-020-01301-9

