Abstract
A new subspace minimization conjugate gradient algorithm with a nonmonotone Wolfe line search is proposed and analyzed. In the scheme, two choices of the search direction are obtained by minimizing a quadratic approximation of the objective function over special subspaces, and criteria are stated for choosing between them. Under given conditions, we show that each choice of the direction satisfies the sufficient descent property. Based on a measure of how close the objective function is to a quadratic, a new strategy for choosing the initial stepsize of the line search is presented. Using the nonmonotone Wolfe line search, we prove the global convergence of the proposed method for general nonlinear functions under mild assumptions. Numerical comparisons with the well-known CGOPT and CG_DESCENT codes show that the proposed algorithm is very promising.
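To illustrate the general idea of subspace minimization — choosing each step by minimizing a quadratic model over a small subspace spanned by the current gradient and the previous step — here is a minimal sketch on an exactly quadratic objective. This is a generic illustration only, not the paper's specific direction choices, initial-stepsize strategy, or nonmonotone line search; the function and variable names are my own.

```python
import numpy as np

def subspace_cg_quadratic(A, b, x0, tol=1e-10, max_iter=100):
    """Minimize f(x) = 0.5 x^T A x - b^T x (A symmetric positive definite)
    by minimizing the quadratic exactly over the two-dimensional subspace
    span{g_k, s_{k-1}} at each iteration."""
    x = np.asarray(x0, dtype=float)
    g = A @ x - b              # gradient of the quadratic at x
    s_prev = None              # previous step s_{k-1}
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if s_prev is None:
            d = -g             # first iteration: steepest descent
        else:
            # Direction d = mu*g + nu*s_prev minimizing the model
            # g^T d + 0.5 d^T A d over span{g, s_prev}: reduce to a
            # 2x2 linear system in the coefficients (mu, nu).
            V = np.column_stack([g, s_prev])
            H = V.T @ A @ V                # projected Hessian
            coef = np.linalg.solve(H, -V.T @ g)
            d = V @ coef
        Ad = A @ d
        # Exact stepsize along d for a quadratic (equals 1 when d came
        # from the subspace solve, since d is already optimal there).
        alpha = -(g @ d) / (d @ Ad)
        s_prev = alpha * d
        x = x + s_prev
        g = g + alpha * Ad
    return x
```

On a strictly convex quadratic in two variables, this recovers the exact minimizer in at most two iterations, mirroring the finite-termination behavior of classical conjugate gradients; the paper's contribution lies in extending such subspace steps to general nonlinear functions with descent guarantees.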
References
Andrei, N.: An unconstrained optimization test functions collection. Adv. Model. Optim. 10, 147–161 (2008)
Andrei, N.: Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization. Eur. J. Oper. Res. 204, 410–420 (2010)
Andrei, N.: An accelerated subspace minimization three-term conjugate gradient algorithm for unconstrained optimization. Numer. Algorithms 65, 859–874 (2014)
Andrei, N.: A new three-term conjugate gradient algorithm for unconstrained optimization. Numer. Algorithms 219, 6316–6327 (2015)
Babaie-Kafaki, S., Ghanbari, R.: A hybridization of the Hestenes-Stiefel and Dai-Yuan conjugate gradient methods based on a least-squares approach. Optim. Methods Softw. 30, 673–681 (2015)
Barzilai, J., Borwein, J.M.: Two-point step size gradient methods. IMA J. Numer. Anal. 8, 141–148 (1988)
Bellavia, S., Morini, B.: Subspace trust-region methods for large bound-constrained nonlinear equations. SIAM J. Numer. Anal. 44, 1535–1555 (2006)
Branch, M.A., Coleman, T.F., Li, Y.: A subspace, interior, and conjugate gradient method for large-scale bound-constrained minimization problems. SIAM J. Sci. Comput. 21, 1–23 (1999)
Carlberg, K., Forstall, V., Tuminaro, R.: Krylov-subspace recycling via the POD-augmented conjugate gradient method. SIAM J. Matrix Anal. Appl. 37, 1304–1336 (2016)
Conn, A.R., Gould, N., Sartenaer, A., Toint, Ph.L.: On iterated-subspace minimization methods for nonlinear optimization. In: Adams, L., Nazareth, J.L. (eds.) Linear and Nonlinear Conjugate Gradient-Related Methods, pp. 50–78. SIAM (1996)
Dai, Y.H.: Nonlinear Conjugate Gradient Methods. In: Wiley Encyclopedia of Operations Research and Management Science (2011). https://doi.org/10.1002/9780470400531.eorms0183
Dai, Y.H., Liao, L.Z.: New conjugacy conditions and related nonlinear conjugate gradient methods. Appl. Math. Optim. 43, 87–101 (2001)
Dai, Y.H., Kou, C.X.: A nonlinear conjugate gradient algorithm with an optimal property and an improved Wolfe line search. SIAM J. Optim. 23, 296–320 (2013)
Dai, Y.H., Kou, C.X.: A Barzilai-Borwein conjugate gradient method. Sci. China Math. 59, 1511–1524 (2016)
Dai, Y.H., Yuan, Y.: A nonlinear conjugate gradient method with a strong global convergence property. SIAM J. Optim. 10, 177–182 (1999)
Dolan, E.D., Moré, J.J.: Benchmarking optimization software with performance profiles. Math. Program. 91, 201–213 (2002)
Erway, J.B., Gill, P.E.: A subspace minimization method for the trust-region step. SIAM J. Optim. 20, 1439–1461 (2009)
Fletcher, R., Reeves, C.M.: Function minimization by conjugate gradients. Comput. J. 7, 149–154 (1964)
Gould, N.I.M., Orban, D., Toint, Ph.L.: CUTEst: a Constrained and Unconstrained Testing Environment with safe threads for mathematical optimization. Comput. Optim. Appl. 60, 545–557 (2015)
Hager, W.W., Zhang, H.: The limited memory conjugate gradient method. SIAM J. Optim. 23, 2150–2168 (2013)
Hager, W.W., Zhang, H.: A new conjugate gradient method with guaranteed descent and an efficient line search. SIAM J. Optim. 16, 170–192 (2005)
Hager, W.W., Zhang, H.: A survey of nonlinear conjugate gradient methods. Pac. J. Optim. 2, 35–58 (2006)
Hestenes, M.R., Stiefel, E.: Methods of conjugate gradients for solving linear systems. J. Res. Natl. Bur. Stand. 49, 409–436 (1952)
Li, L.B.: A new algorithm for solving large scale trust region subproblem. Oper. Res. Manag. Sci. 16, 48–52 (2007)
Liu, Z.X., Liu, H.W.: Several efficient gradient methods with approximate optimal stepsizes for large scale unconstrained optimization. J. Comput. Appl. Math. 328, 400–413 (2018)
Liu, Z.X., Liu, H.W.: An efficient gradient method with approximate optimal stepsize for large-scale unconstrained optimization. Numer. Algorithms. https://doi.org/10.1007/s11075-017-0365-2
Nocedal, J., Wright, S.: Numerical Optimization, 2nd edn. Springer Series in Operations Research, New York (2006)
Polak, E., Ribière, G.: Note sur la convergence de méthodes de directions conjuguées. Rev. Française Informat. Rech. Opérationnelle 16, 35–43 (1969)
Polyak, B.T.: The conjugate gradient method in extremal problems. USSR Comput. Math. Math. Phys. 9, 94–112 (1969)
Pytlak, R.: Conjugate Gradient Algorithms in Nonconvex Optimization. Springer-Verlag, Berlin Heidelberg (2009)
Raydan, M.: The Barzilai and Borwein gradient method for the large scale unconstrained minimization problem. SIAM J. Optim. 7, 26–33 (1997)
Shanno, D.F., Phua, K.H.: Remark on algorithm 500. ACM Trans. Math. Softw. 6, 618–622 (1980)
Sugiki, K., Narushima, Y., Yabe, H.: Globally convergent three-term conjugate gradient methods that use secant conditions and generate descent search directions for unconstrained optimization. J. Optim. Theory Appl. 153, 733–757 (2012)
Wang, Z.H., Yuan, Y.: A subspace implementation of quasi-Newton trust region methods for unconstrained optimization. Numer. Math. 104, 241–269 (2006)
Wei, H.J., Yang, W.H.: A Riemannian subspace limited-memory SR1 trust-region method. Optim. Lett. 10, 1705–1723 (2016)
Yang, Y.T., Chen, Y.T., Lu, Y.L.: A subspace conjugate gradient algorithm for large-scale unconstrained optimization. Numer. Algorithms (2017). https://doi.org/10.1007/s11075-017-0284-2
Yuan, Y.: A modified BFGS algorithm for unconstrained optimization. IMA J. Numer. Anal. 11, 325–332 (1991)
Yuan, Y.: Subspace methods for large scale nonlinear equations and nonlinear least squares. Optim. Eng. 10, 207–218 (2009)
Yuan, Y.: A review on subspace methods for nonlinear optimization. In: Proceedings of the International Congress of Mathematics, pp. 807–827 (2014)
Yuan, Y., Stoer, J.: A subspace study on conjugate gradient algorithms. Zamm J. Appl. Math. Mech. Z. Angew. Math. Mech. 75, 69–77 (1995)
Zhang, H., Hager, W.W.: A nonmonotone line search technique and its application to unconstrained optimization. SIAM J. Optim. 14, 1043–1056 (2004)
Zhang, L., Zhou, W., Li, D.: Global convergence of a modified Fletcher-Reeves conjugate gradient method with Armijo-type line search. Numer. Math. 104, 561–572 (2006)
Acknowledgements
This work is supported by the National Natural Science Foundation of China (Grant Nos. 11461021 and 61573014), the Natural Science Basic Research Plan in Shaanxi Province of China (No. 2017JM1014), the Scientific Research Project of Hezhou University (Nos. 2014YBZK06 and 2016HZXYSX03), and the Guangxi Colleges and Universities Key Laboratory of Symbolic Computation and Engineering Data Processing (FH201701). The authors are grateful to Professors W.W. Hager and H. Zhang for providing the CG_DESCENT code, and to Professor Y.H. Dai and Dr. C.X. Kou for the CGOPT code.
Cite this article
Li, M., Liu, H. & Liu, Z. A new subspace minimization conjugate gradient method with nonmonotone line search for unconstrained optimization. Numer Algor 79, 195–219 (2018). https://doi.org/10.1007/s11075-017-0434-6
Keywords
- Conjugate gradient method
- Nonmonotone line search
- Subspace minimization
- Unconstrained optimization
- Global convergence