Abstract
This paper introduces a measure of zigzagging strength and a minimal zigzagging direction. Based on these, a new nonlinear conjugate gradient (CG) method is proposed that works with line searches that need not satisfy the Wolfe conditions. Global convergence to a stationary point is proved for differentiable objective functions with Lipschitz continuous gradient, and global linear convergence is proved if this stationary point is a strong local minimizer. For approximating a stationary point, an \(\mathcal{O}(\varepsilon^{-2})\) complexity bound is derived for the number of function and gradient evaluations. This bound improves to \(\mathcal{O}(\log \varepsilon^{-1})\) for objective functions having a strong minimizer and no other stationary points. For strictly convex quadratic functions in n variables, the new method terminates in at most n iterations. Numerical results on the unconstrained CUTEst test problems suggest that the new method is competitive with the best state-of-the-art nonlinear CG methods proposed in the literature.
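To make the setting concrete, the sketch below shows a generic nonlinear CG loop: Polak-Ribière+ directions combined with a simple backtracking Armijo line search, a step-size rule that does not enforce the Wolfe curvature condition. It only illustrates the algorithmic framework the abstract refers to; the paper's zigzagging measure, minimal zigzagging direction, and actual line search are defined in the full text and are not reproduced here. The tolerances, safeguards, and test function are illustrative choices, not the authors' settings.

```python
# Generic nonlinear CG sketch (NOT the method of this paper):
# Polak-Ribiere+ directions with a backtracking Armijo line search,
# i.e. no Wolfe curvature condition is imposed on the step size.
import numpy as np

def cg_minimize(f, grad, x0, tol=1e-6, max_iter=1000):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                          # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # backtracking Armijo line search (sufficient decrease only)
        alpha, fx = 1.0, f(x)
        while f(x + alpha * d) > fx + 1e-4 * alpha * g.dot(d):
            alpha *= 0.5
            if alpha < 1e-16:       # safeguard against stalled backtracking
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Polak-Ribiere+ update; beta = 0 restarts with steepest descent
        beta = max(0.0, g_new.dot(g_new - g) / g.dot(g))
        d = -g_new + beta * d
        if g_new.dot(d) >= 0:       # enforce a descent direction
            d = -g_new
        x, g = x_new, g_new
    return x

# usage: minimize the Rosenbrock function from a standard starting point
if __name__ == "__main__":
    f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
    g = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                            200 * (x[1] - x[0]**2)])
    print(cg_minimize(f, g, np.array([-1.2, 1.0])))
```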




Data availability
No data were generated or analyzed.
Acknowledgements
Earlier versions of this paper benefitted from discussions with Waltraud Huyer and Hermann Schichl.
Funding
The second author acknowledges the financial support of the Austrian Science Fund (FWF) under Project No. P 34317.
Author information
Authors and Affiliations
Contributions
All authors wrote the main manuscript text and reviewed the manuscript. The second author wrote the code and produced the numerical results.
Corresponding author
Ethics declarations
Conflict of interest
The authors declare no competing interests.
About this article
Cite this article
Neumaier, A., Kimiaei, M. & Azmi, B. Globally linearly convergent nonlinear conjugate gradients without Wolfe line search. Numer Algor 97, 1607–1633 (2024). https://doi.org/10.1007/s11075-024-01764-5