A Dai-Liao conjugate gradient algorithm with clustering of eigenvalues

  • Original Paper
  • Numerical Algorithms

Abstract

A new value for the parameter in the Dai-Liao conjugate gradient algorithm is presented. It is based on clustering the eigenvalues of the matrix that determines the search direction of the algorithm. This choice of the parameter leads to a variant of the Dai-Liao algorithm that is more efficient and more robust than variants of the same algorithm based on minimizing the condition number of the matrix associated with the search direction. Global convergence of this variant is briefly discussed.
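
For orientation, the following is a minimal sketch of the standard Dai-Liao setting the abstract refers to; the notation ($g_k$, $d_k$, $s_k$, $y_k$, and the parameter $t$) follows common usage in the conjugate gradient literature and is assumed here rather than quoted from the paper:

\[
d_{k+1} = -g_{k+1} + \beta_k^{\mathrm{DL}} d_k,
\qquad
\beta_k^{\mathrm{DL}} = \frac{g_{k+1}^{\top}\,(y_k - t\, s_k)}{d_k^{\top} y_k},
\]

where $g_k = \nabla f(x_k)$, $s_k = x_{k+1} - x_k$, $y_k = g_{k+1} - g_k$, and $t \ge 0$ is the parameter in question. Since $s_k$ is a multiple of $d_k$, the direction can equivalently be written as $d_{k+1} = -Q_{k+1} g_{k+1}$ with

\[
Q_{k+1} = I - \frac{s_k y_k^{\top}}{s_k^{\top} y_k} + t\, \frac{s_k s_k^{\top}}{s_k^{\top} y_k},
\]

so the choice of $t$ shapes the spectrum of the matrix defining the search direction: earlier proposals pick $t$ by minimizing the condition number of $Q_{k+1}$, whereas the value proposed in this paper is chosen so that the eigenvalues of $Q_{k+1}$ cluster. The specific formula for $t$ derived in the paper is not reproduced in this preview.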



Author information


Corresponding author

Correspondence to Neculai Andrei.


About this article

Cite this article

Andrei, N. A Dai-Liao conjugate gradient algorithm with clustering of eigenvalues. Numer Algor 77, 1273–1282 (2018). https://doi.org/10.1007/s11075-017-0362-5

