A Nonlinear Conjugate Gradient Method Using Inexact First-Order Information

  • Published in: Journal of Optimization Theory and Applications

Abstract

Conjugate gradient methods are widely used for solving nonlinear optimization problems. In many practical problems, however, only approximate values of the objective function and its gradient can be obtained, so it is necessary to consider algorithms that use inexact function and gradient evaluations. In this paper, we propose an inexact nonlinear conjugate gradient (INCG) method for such problems. Under mild conditions, we prove the global convergence of INCG; moreover, we establish its linear convergence when the objective function is strongly convex. Numerical results demonstrate that, compared with state-of-the-art algorithms, INCG is an effective method.
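The full INCG algorithm is not reproduced in this preview, but the setting the abstract describes can be made concrete. Below is a minimal, illustrative sketch of a generic nonlinear conjugate gradient loop driven by an inexact first-order oracle. The additive-noise model, the Polak-Ribière+ update, and the Armijo backtracking line search are standard textbook ingredients, not the paper's method; the function `cg_inexact` and its parameters are hypothetical names introduced here for illustration only.

```python
import numpy as np


def cg_inexact(f, grad, x0, tol=1e-6, max_iter=500, noise=1e-8, seed=0):
    """Generic nonlinear CG (Polak-Ribiere+) with a noisy first-order oracle.

    Illustrative sketch only; this is NOT the INCG method of the paper.
    """
    rng = np.random.default_rng(seed)

    def oracle(x):
        # Inexact oracle: exact values perturbed by bounded additive noise.
        fx = f(x) + noise * rng.uniform(-1.0, 1.0)
        gx = grad(x) + noise * rng.uniform(-1.0, 1.0, size=x.shape)
        return fx, gx

    x = np.asarray(x0, dtype=float)
    fx, g = oracle(x)
    d = -g  # initial direction: (inexact) steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Backtracking Armijo line search on the inexact function values.
        t, slope = 1.0, g @ d
        while True:
            f_new, g_new = oracle(x + t * d)
            if f_new <= fx + 1e-4 * t * slope or t < 1e-12:
                break
            t *= 0.5
        x = x + t * d
        # Polak-Ribiere+ direction update; beta <= 0 restarts the method
        # with a steepest-descent step.
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))
        d = -g_new + beta * d
        fx, g = f_new, g_new
    return x, fx


# Example: minimize the two-dimensional Rosenbrock function.
f = lambda x: (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2
grad = lambda x: np.array([
    -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0] ** 2),
    200 * (x[1] - x[0] ** 2),
])
x_star, _ = cg_inexact(f, grad, np.array([-1.2, 1.0]))
```

When the noise level sits far below the stopping tolerance, the iterates behave essentially like exact nonlinear CG; the paper's analysis concerns precisely the regime where the oracle error cannot simply be driven to zero.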



Acknowledgements

This work was supported by the National Natural Science Foundation of China (Grant No. 11971118). The authors are grateful to the associate editor and the two anonymous referees, whose valuable comments and suggestions have significantly improved the presentation of the paper.

Author information

Correspondence to Tiantian Zhao.

Additional information

Communicated by Emanuele Galligani.

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Zhao, T., Yang, W.H. A Nonlinear Conjugate Gradient Method Using Inexact First-Order Information. J Optim Theory Appl 198, 502–530 (2023). https://doi.org/10.1007/s10957-023-02243-y



