Abstract
By minimizing the distance, in the Frobenius norm, between the search direction matrix of the Dai–Liao method and the scaled memoryless BFGS update, and by applying Powell’s nonnegative restriction to the conjugate gradient parameter, a one-parameter class of nonlinear conjugate gradient methods is proposed. A brief global convergence analysis is then carried out, both with and without a convexity assumption on the objective function. Preliminary numerical results are reported; they demonstrate that a proper choice of the parameter of the proposed class may lead to promising numerical performance.
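For orientation, a minimal sketch of the iteration behind the proposed class follows; the notation is standard but assumed here rather than quoted from the paper. Writing $g_k = \nabla f(x_k)$, $s_k = x_{k+1} - x_k$, and $y_k = g_{k+1} - g_k$, the Dai–Liao method generates search directions

$$d_{k+1} = -g_{k+1} + \beta_k d_k, \qquad \beta_k^{\mathrm{DL}} = \frac{g_{k+1}^{T} y_k}{d_k^{T} y_k} - t\,\frac{g_{k+1}^{T} s_k}{d_k^{T} y_k}, \qquad t > 0,$$

and Powell’s nonnegative restriction replaces the first quotient by $\max\left\{ g_{k+1}^{T} y_k / d_k^{T} y_k,\; 0 \right\}$. The one-parameter class described in the abstract arises from choosing $t$ so that the matrix defining $d_{k+1}$ is as close as possible, in the Frobenius norm, to the scaled memoryless BFGS update of the identity.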
References
Andrei N (2010) Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization. Eur J Oper Res 204(3):410–420
Andrei N (2011) Open problems in conjugate gradient algorithms for unconstrained optimization. Bull Malays Math Sci Soc 34(2):319–330
Babaie-Kafaki S, Ghanbari R (2014) The Dai–Liao nonlinear conjugate gradient method with optimal parameter choices. Eur J Oper Res 234(3):625–630
Babaie-Kafaki S, Ghanbari R (2015) A hybridization of the Hestenes–Stiefel and Dai–Yuan conjugate gradient methods based on a least-squares approach. Optim Methods Softw 30(4):673–681
Babaie-Kafaki S, Ghanbari R (2015) Two optimal Dai–Liao conjugate gradient methods. Optimization 64(11):2277–2287
Dai YH, Liao LZ (2001) New conjugacy conditions and related nonlinear conjugate gradient methods. Appl Math Optim 43(1):87–101
Dolan ED, Moré JJ (2002) Benchmarking optimization software with performance profiles. Math Program 91(2, Ser. A):201–213
Gould NIM, Orban D, Toint PhL (2003) CUTEr: a constrained and unconstrained testing environment, revisited. ACM Trans Math Softw 29(4):373–394
Hager WW, Zhang H (2006) Algorithm 851: CG_Descent, a conjugate gradient method with guaranteed descent. ACM Trans Math Softw 32(1):113–137
Hager WW, Zhang H (2006) A survey of nonlinear conjugate gradient methods. Pac J Optim 2(1):35–58
Oren SS, Luenberger DG (1974) Self-scaling variable metric (SSVM) algorithms. I. Criteria and sufficient conditions for scaling a class of algorithms. Manag Sci 20(5):845–862
Powell MJD (1986) Convergence properties of algorithms for nonlinear optimization. SIAM Rev 28(4):487–500
Sun W, Yuan YX (2006) Optimization theory and methods: nonlinear programming. Springer, New York
Acknowledgments
This research was supported by the Research Councils of Semnan University and Ferdowsi University of Mashhad. The first author was also supported in part by grant 95813776 from the Iran National Science Foundation (INSF). The authors are grateful to Professor William W. Hager for providing the C++ code of CG_Descent. They also thank the anonymous reviewers and the Associate Editor for their valuable comments and suggestions, which helped to improve the quality of this work.
Cite this article
Babaie-Kafaki, S., Ghanbari, R. A class of adaptive Dai–Liao conjugate gradient methods based on the scaled memoryless BFGS update. 4OR-Q J Oper Res 15, 85–92 (2017). https://doi.org/10.1007/s10288-016-0323-1
Keywords
- Unconstrained optimization
- Conjugate gradient method
- Scaled memoryless BFGS update
- Frobenius norm
- Global convergence