Globally linearly convergent nonlinear conjugate gradients without Wolfe line search

  • Original Paper
  • Published in: Numerical Algorithms

Abstract

This paper introduces a measure of zigzagging strength and a minimal zigzagging direction. Based on these, a new nonlinear conjugate gradient (CG) method is proposed that works with line searches not satisfying the Wolfe conditions. Global convergence to a stationary point is proved for differentiable objective functions with Lipschitz continuous gradients, and global linear convergence if this stationary point is a strong local minimizer. For approximating a stationary point, an \(\mathcal{O}(\varepsilon^{-2})\) complexity bound is derived for the number of function and gradient evaluations. This bound improves to \(\mathcal{O}(\log \varepsilon^{-1})\) for objective functions having a strong minimizer and no other stationary points. For strictly convex quadratic functions in n variables, the new method terminates in at most n iterations. Numerical results on the unconstrained CUTEst test problems suggest that the new method is competitive with the best state-of-the-art nonlinear CG methods proposed in the literature.
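To fix ideas, the sketch below shows a generic nonlinear CG loop driven by a Goldstein-type line search, one classical acceptance test that avoids the Wolfe conditions. This is purely illustrative: the function names, the parameter c = 0.25, and the Polak-Ribière+ update are assumptions of this sketch, not the authors' method, whose zigzagging measure and minimal zigzagging direction are defined only in the full text.

```python
import numpy as np

def goldstein_line_search(f, x, fx, d, slope, c=0.25, alpha=1.0, max_iter=60):
    # Accept any alpha with
    #   fx + (1 - c)*alpha*slope <= f(x + alpha*d) <= fx + c*alpha*slope,
    # the two-sided Goldstein test (0 < c < 1/2, slope = g.d < 0).
    # A Wolfe-free acceptance criterion, used here only for illustration.
    lo, hi = 0.0, np.inf
    for _ in range(max_iter):
        fa = f(x + alpha * d)
        if fa > fx + c * alpha * slope:          # too little decrease: shrink
            hi = alpha
        elif fa < fx + (1 - c) * alpha * slope:  # step too short: grow
            lo = alpha
        else:
            return alpha
        alpha = 2.0 * lo if np.isinf(hi) else 0.5 * (lo + hi)
    return alpha

def nonlinear_cg(f, grad, x0, tol=1e-8, max_iter=10000):
    # Generic Polak-Ribiere+ nonlinear CG; the paper's method replaces the
    # update and restart logic by its zigzagging-based rules (not shown here).
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        gnorm2 = g @ g
        if np.sqrt(gnorm2) <= tol:
            break
        slope = g @ d
        if slope >= 0.0:             # safeguard: fall back to steepest descent
            d, slope = -g, -gnorm2
        alpha = goldstein_line_search(f, x, f(x), d, slope)
        x = x + alpha * d
        g_new = grad(x)
        beta = max(0.0, g_new @ (g_new - g) / gnorm2)  # PR+ formula
        d = -g_new + beta * d
        g = g_new
    return x

if __name__ == "__main__":
    # Strictly convex quadratic: with exact line searches CG terminates in at
    # most n iterations, the property the abstract extends to the new method.
    A = np.array([[4.0, 1.0], [1.0, 3.0]])
    b = np.array([1.0, 2.0])
    f = lambda x: 0.5 * x @ (A @ x) - b @ x
    grad = lambda x: A @ x - b
    print(nonlinear_cg(f, grad, np.zeros(2)))  # ~ [0.0909, 0.6364], solves A x = b
```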


Data availability

No data were generated or analyzed.


Acknowledgements

Earlier versions of this paper benefitted from discussions with Waltraud Huyer and Hermann Schichl.

Funding

The second author acknowledges the financial support of the Austrian Science Fund (FWF) under Project No. P 34317.

Author information

Contributions

All authors wrote the main manuscript text and reviewed the manuscript. The second author wrote codes and provided the numerical results.

Corresponding author

Correspondence to Morteza Kimiaei.

Ethics declarations

Conflict of interest

The authors declare no competing interests.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Neumaier, A., Kimiaei, M. & Azmi, B. Globally linearly convergent nonlinear conjugate gradients without Wolfe line search. Numer Algor 97, 1607–1633 (2024). https://doi.org/10.1007/s11075-024-01764-5

