Abstract
In this paper, a new subspace minimization conjugate gradient method based on a modified secant equation is proposed and analyzed. In a classical subspace minimization conjugate gradient method, the search direction is obtained by minimizing an approximate quadratic model of the objective function over a two-dimensional subspace. The approximate Hessian matrix in this quadratic model is usually required to satisfy the standard secant equation, whereas we consider an approximate Hessian matrix that satisfies a modified secant equation. We give rules that determine, at each iteration, whether the standard or the modified secant equation is used. We prove that the proposed directions satisfy the sufficient descent property under some additional conditions. We also present a modified nonmonotone Wolfe line search and establish the global convergence of the proposed method for general nonlinear functions under mild assumptions. Numerical comparisons with the well-known CG_DESCENT (5.3) (Hager and Zhang in SIAM J Optim 16(1):170–192, 2005) and SMCG_BB (Liu and Liu in J Optim Theory Appl 180(3):879–906, 2019) show that the proposed algorithm is very promising.
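To make the construction described above concrete, the following Python snippet assembles and solves the two-dimensional subproblem. It is a minimal sketch, not the authors' algorithm: the BB-type scalar estimate of g_k^T B_k g_k, the choice u_k = s_k in the modified secant vector of Zhang, Deng, and Chen (1999), and the singular-system safeguard are illustrative assumptions, and the paper's selection rules and modified nonmonotone Wolfe line search are omitted.

```python
import numpy as np

def modified_secant_y(s, y, f_prev, f_curr, g_prev, g_curr):
    # Zhang-Deng-Chen (1999) modified secant vector with the common choice u_k = s_k:
    #   y_bar = y + (theta / s^T s) * s,
    #   theta = 6*(f_prev - f_curr) + 3*(g_prev + g_curr)^T s.
    theta = 6.0 * (f_prev - f_curr) + 3.0 * ((g_prev + g_curr) @ s)
    return y + (theta / (s @ s)) * s

def subspace_cg_direction(g, s, y, f_prev=None, f_curr=None, g_prev=None,
                          use_modified=False):
    # Minimize the quadratic model  m(d) = g^T d + 0.5 * d^T B d  over
    # the two-dimensional subspace  d = mu*g + nu*s.
    # B is never formed: the secant equation  B s = y  supplies
    #   g^T B s = g^T y  and  s^T B s = s^T y,
    # while g^T B g is estimated by the BB-type scalar (s^T y / s^T s) * ||g||^2.
    if use_modified:
        y = modified_secant_y(s, y, f_prev, f_curr, g_prev, g)
    rho = (s @ y) / (s @ s) * (g @ g)          # estimate of g^T B g
    A = np.array([[rho,   g @ y],
                  [g @ y, s @ y]])             # Hessian of the 2-D model
    b = -np.array([g @ g, g @ s])              # negative model gradient at d = 0
    try:
        mu, nu = np.linalg.solve(A, b)         # stationary point of the model
        return mu * g + nu * s
    except np.linalg.LinAlgError:
        return -g                              # safeguard: fall back to steepest descent
```

A full implementation would add the paper's rules for switching between the standard and modified secant equations, a check that the resulting direction satisfies the sufficient descent property, and the modified nonmonotone Wolfe line search to choose the stepsize.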
References
Andrei N (2008) An unconstrained optimization test functions collection. Adv Model Optim 10:147–161
Andrei N (2014) An accelerated subspace minimization three-term conjugate gradient algorithm for unconstrained optimization. Numer Algorithms 65(4):859–874
Babaie-Kafaki S (2014) An adaptive conjugacy condition and related nonlinear conjugate gradient methods. Int J Comput Methods 11(4):1350092
Babaie-Kafaki S, Ghanbari R (2014) The Dai–Liao nonlinear conjugate gradient method with optimal parameter choices. Eur J Oper Res 234(3):625–630
Barzilai J, Borwein JM (1988) Two-point step size gradient methods. IMA J Numer Anal 8(1):141–148
Bellavia S, Morini B (2006) Subspace trust-region methods for large bound-constrained nonlinear equations. SIAM J Numer Anal 44(4):1535–1555
Biglari F, Hassan MA, Leong WJ (2011) New quasi-Newton methods via higher order tensor models. J Comput Appl Math 235(8):2412–2422
Dai YH, Kou CX (2013) A nonlinear conjugate gradient algorithm with an optimal property and an improved Wolfe line search. SIAM J Optim 23(1):296–320
Dai YH, Kou CX (2016) A Barzilai–Borwein conjugate gradient method. Sci China Math 59(8):1511–1524
Dai YH, Liao LZ (2001) New conjugacy conditions and related nonlinear conjugate gradient methods. Appl Math Optim 43(1):87–101
Dai YH, Yuan YX (1999) A nonlinear conjugate gradient method with a strong global convergence property. SIAM J Optim 10(1):177–182
Dolan ED, Moré JJ (2002) Benchmarking optimization software with performance profiles. Math Program 91(2):201–213
Erway JB, Gill PE (2009) A subspace minimization method for the trust-region step. SIAM J Optim 20(3):1439–1461
Fatemi M (2016) A new efficient conjugate gradient method for unconstrained optimization. J Comput Appl Math 300:207–216
Fletcher R, Reeves CM (1964) Function minimization by conjugate gradients. Comput J 7(2):149–154
Gould NIM, Orban D, Toint PhL (2003) CUTEr and SifDec: a constrained and unconstrained testing environment, revisited. ACM Trans Math Softw 29(4):373–394
Hager WW, Zhang HC (2005) A new conjugate gradient method with guaranteed descent and an efficient line search. SIAM J Optim 16(1):170–192
Hager WW, Zhang HC (2013) The limited memory conjugate gradient method. SIAM J Optim 23(4):2150–2168
Hestenes MR, Stiefel EL (1952) Methods of conjugate gradients for solving linear systems. J Res Natl Bur Stand 49(6):409–436
Li M, Liu HW, Liu ZX (2018) A new subspace minimization conjugate gradient method with nonmonotone line search for unconstrained optimization. Numer Algorithms 79(1):195–219
Liu ZX, Liu HW (2018) An efficient gradient method with approximate optimal stepsize for large-scale unconstrained optimization. Numer Algorithms 78(1):21–39
Liu HW, Liu ZX (2019) An efficient Barzilai–Borwein conjugate gradient method for unconstrained optimization. J Optim Theory Appl 180(3):879–906
Nocedal J, Wright SJ (1999) Numerical optimization. Springer, New York
Polyak BT (1969) The conjugate gradient method in extremal problems. USSR Comput Math Math Phys 9(4):94–112
Rivaie M, Mamat M, Abashar A (2015) A new class of nonlinear conjugate gradient coefficients with exact and inexact line searches. Appl Math Comput 268:1152–1163
Tarzanagh DA, Peyghami MR (2015) A new regularized limited memory BFGS-type method based on modified secant conditions for unconstrained optimization problems. J Glob Optim 63(4):709–728
Wei HJ, Yang WH (2016) A Riemannian subspace limited-memory SR1 trust-region method. Optim Lett 10(8):1705–1723
Wei Z, Li G, Qi L (2006) New quasi-Newton methods for unconstrained optimization problems. Appl Math Comput 175(2):1156–1188
Yuan YX (1991) A modified BFGS algorithm for unconstrained optimization. IMA J Numer Anal 11(3):325–332
Yuan YX (2009) Subspace methods for large scale nonlinear equations and nonlinear least squares. Optim Eng 10(2):207–218
Yuan YX (2014) A review on subspace methods for nonlinear optimization. In: Proceedings of the International Congress of Mathematicians, Seoul, pp 807–827
Yuan YX, Stoer J (1995) A subspace study on conjugate gradient algorithms. Z Angew Math Mech 75(1):69–77
Zhang JZ, Xu CX (2001) Properties and numerical performance of quasi-Newton methods with modified quasi-Newton equations. J Comput Appl Math 137(2):269–278
Zhang JZ, Deng NY, Chen LH (1999) New quasi-Newton equation and related methods for unconstrained optimization. J Optim Theory Appl 102(1):147–167
Acknowledgements
We would like to thank Professors W. W. Hager and H. C. Zhang for their CG_DESCENT code. This research is supported by the National Natural Science Foundation of China (No. 11901561), the Guangxi Science Foundation (No. 2018GXNSFBA281180), and the China Postdoctoral Science Foundation (No. 2019M660833).
Additional information
Communicated by Paulo J. S. Silva.
Cite this article
Diao, X., Liu, H. & Liu, Z. A new subspace minimization conjugate gradient method based on modified secant equation for unconstrained optimization. Comp. Appl. Math. 39, 251 (2020). https://doi.org/10.1007/s40314-020-01301-9
Keywords
- Subspace minimization
- Conjugate gradient method
- Modified secant equation
- Modified nonmonotone Wolfe line search
- Global convergence