
Global convergence of the restarted Lanczos and Jacobi–Davidson methods for symmetric eigenvalue problems

  • Published in: Numerische Mathematik

Abstract

The Lanczos method is well known for computing the extremal eigenvalues of symmetric matrices. For efficiency and robustness a restart strategy is employed in practice, but this makes an analysis of convergence less straightforward. We prove global convergence of the restarted Lanczos method in exact arithmetic using certain convergence properties of the Rayleigh–Ritz procedure, which can be obtained from the discussion by Crouzeix, Philippe, and Sadkane. For the restarted Lanczos method, Sorensen’s previous analysis establishes global convergence to the largest eigenvalues under the technical assumption that the absolute values of the off-diagonal elements of the Lanczos tridiagonal matrix are larger than a positive constant throughout the iterations. In this paper, we prove global convergence without any such assumption. The only assumption is that the initial vector is not orthogonal to any of the target exact eigenvectors. More importantly, our results are applicable to dynamic restarting procedures where the dimensions of the projection subspace are dynamically determined. In other words, our analysis can be applied to the recently proposed efficient restart strategies employed in the thick restarted Lanczos method. The convergence theorem is extended to the restarted Lanczos method for computing both the largest and smallest eigenvalues. Moreover, we derive certain global convergence theorems of the block Lanczos and Jacobi–Davidson methods, where, for both algorithms, the Ritz values are shown to converge to exact eigenvalues, although they are not necessarily extremal.
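To make the setting concrete, the following is a minimal textbook-style sketch of an explicitly restarted Lanczos iteration with a Rayleigh–Ritz step, of the kind whose global convergence the paper analyzes. It is an illustrative implementation under simplifying assumptions (dense symmetric `A`, full reorthogonalization, fixed restart dimension `m`), not the algorithm or notation of the paper itself; the function names are hypothetical.

```python
import numpy as np

def lanczos(A, q1, m):
    """Run m Lanczos steps from unit vector q1.

    Returns an orthonormal basis Q (n x m) of the Krylov subspace
    and the symmetric tridiagonal projection T = Q^T A Q (m x m).
    """
    n = A.shape[0]
    Q = np.zeros((n, m))
    alpha = np.zeros(m)
    beta = np.zeros(m)
    Q[:, 0] = q1 / np.linalg.norm(q1)
    for j in range(m):
        w = A @ Q[:, j]
        alpha[j] = Q[:, j] @ w
        w -= alpha[j] * Q[:, j]
        if j > 0:
            w -= beta[j - 1] * Q[:, j - 1]
        # Full reorthogonalization: cheap insurance against the loss of
        # orthogonality that affects the three-term recurrence in floating point.
        w -= Q[:, : j + 1] @ (Q[:, : j + 1].T @ w)
        if j + 1 < m:
            b = np.linalg.norm(w)
            if b < 1e-12:
                # Breakdown: the Krylov subspace is (numerically) invariant.
                # Continue with a random direction orthogonal to the basis.
                w = np.random.randn(n)
                w -= Q[:, : j + 1] @ (Q[:, : j + 1].T @ w)
                b = np.linalg.norm(w)
            else:
                beta[j] = b
            Q[:, j + 1] = w / b
    T = np.diag(alpha) + np.diag(beta[: m - 1], 1) + np.diag(beta[: m - 1], -1)
    return Q, T

def restarted_lanczos(A, v0, m=10, restarts=50):
    """Explicitly restarted Lanczos for the largest eigenvalue of symmetric A."""
    q = v0 / np.linalg.norm(v0)
    theta = None
    for _ in range(restarts):
        Q, T = lanczos(A, q, m)
        theta, S = np.linalg.eigh(T)   # Rayleigh-Ritz step on the small matrix T
        q = Q @ S[:, -1]               # restart with the Ritz vector of the top Ritz value
        q /= np.linalg.norm(q)
    return theta[-1], q
```

The paper's assumption appears here as the requirement that `v0` not be orthogonal to the target eigenvectors. For production use, SciPy's `scipy.sparse.linalg.eigsh` wraps ARPACK's implicitly restarted Lanczos, and the thick-restart strategy of Wu and Simon (reference 31 below) retains several Ritz vectors at each restart rather than only one.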


References

  1. Baglama, J., Calvetti, D., Reichel, L.: Iterative methods for the computation of a few eigenvalues of a large symmetric matrix. BIT 36, 400–421 (1996)

  2. Baglama, J., Calvetti, D., Reichel, L.: IRBL: an implicitly restarted block Lanczos method for large-scale Hermitian eigenproblems. SIAM J. Sci. Comput. 24, 1650–1677 (2003)

  3. Bai, Z., Demmel, J., Dongarra, J., Ruhe, A., van der Vorst, H.: Templates for the Solution of Algebraic Eigenvalue Problems: A Practical Guide. SIAM, Philadelphia (2000)

  4. Beattie, C., Embree, M., Rossi, J.: Convergence of restarted Krylov subspaces to invariant subspaces. SIAM J. Matrix Anal. Appl. 25, 1074–1109 (2004)

  5. Beattie, C., Embree, M., Sorensen, D.C.: Convergence of polynomial restart Krylov methods for eigenvalue computations. SIAM Rev. 47, 492–515 (2005)

  6. Calvetti, D., Reichel, L., Sorensen, D.C.: An implicitly restarted Lanczos method for large symmetric eigenvalue problems. Electron. Trans. Numer. Anal. 2, 1–21 (1994)

  7. Crouzeix, M., Philippe, B., Sadkane, M.: The Davidson method. SIAM J. Sci. Comput. 15, 62–76 (1994)

  8. Cullum, J.K.: The simultaneous computation of a few of the algebraically largest and smallest eigenvalues of a large, symmetric, sparse matrix. BIT 18, 265–275 (1978)

  9. Cullum, J.K., Donath, W.E.: A block Lanczos algorithm for computing the \(q\) algebraically largest eigenvalues and a corresponding eigenspace for large, sparse symmetric matrices. In: Proceedings of the 1974 IEEE Conference on Decision and Control, pp. 505–509. IEEE Press, Piscataway, NJ (1974)

  10. Davidson, E.R.: The iterative calculation of a few of the lowest eigenvalues and corresponding eigenvectors of large real-symmetric matrices. J. Comput. Phys. 17, 87–94 (1975)

  11. Demmel, J.: Applied Numerical Linear Algebra. SIAM, Philadelphia (1997)

  12. Golub, G.H., Underwood, R.: The block Lanczos method for computing eigenvalues. In: Rice, J. (ed.) Mathematical Software III, pp. 364–377. Academic Press, New York (1977)

  13. Golub, G.H., Van Loan, C.F.: Matrix Computations, 4th edn. Johns Hopkins University Press, Baltimore (2013)

  14. Hochstenbach, M.E., Sleijpen, G.L.G.: Two-sided and alternating Jacobi–Davidson. Linear Algebra Appl. 358, 145–172 (2003)

  15. Horn, R.A., Johnson, C.R.: Matrix Analysis, 2nd edn. Cambridge University Press, Cambridge (2012)

  16. Karush, W.: An iterative method for finding characteristic vectors of a symmetric matrix. Pac. J. Math. 1, 233–248 (1951)

  17. Knyazev, A.V.: Toward the optimal preconditioned eigensolver: locally optimal block preconditioned conjugate gradient method. SIAM J. Sci. Comput. 23, 517–541 (2001)

  18. Knyazev, A.V., Skorokhodov, A.L.: On exact estimates of the convergence rate of the steepest ascent method in the symmetric eigenvalue problem. Linear Algebra Appl. 154–156, 245–257 (1991)

  19. Lanczos, C.: An iteration method for the solution of the eigenvalue problem of linear differential and integral operators. J. Res. Nat. Bur. Stand. 45, 255–282 (1950)

  20. Li, R.C.: Sharpness in rates of convergence for the symmetric Lanczos method. Math. Comput. 79, 419–435 (2010)

  21. Morgan, R.B., Scott, D.S.: Generalizations of Davidson's method for computing eigenvalues of sparse symmetric matrices. SIAM J. Sci. Stat. Comput. 7, 817–825 (1986)

  22. Notay, Y.: Is Jacobi–Davidson faster than Davidson? SIAM J. Matrix Anal. Appl. 26, 522–543 (2005)

  23. Ovtchinnikov, E.: Convergence estimates for the generalized Davidson method for symmetric eigenvalue problems I: the preconditioning aspect. SIAM J. Numer. Anal. 41, 258–271 (2003)

  24. Ovtchinnikov, E.: Convergence estimates for the generalized Davidson method for symmetric eigenvalue problems II: the preconditioning aspect. SIAM J. Numer. Anal. 41, 272–286 (2003)

  25. Parlett, B.N.: The Symmetric Eigenvalue Problem. Prentice-Hall, Englewood Cliffs (1980)

  26. Ruhe, A.: Implementation aspects of band Lanczos algorithms for computation of eigenvalues of large sparse symmetric matrices. Math. Comput. 33, 680–687 (1979)

  27. Sleijpen, G.L.G., van der Vorst, H.A.: A Jacobi–Davidson iteration method for linear eigenvalue problems. SIAM J. Matrix Anal. Appl. 17, 401–425 (1996)

  28. Sorensen, D.C.: Implicit application of polynomial filters in a \(k\)-step Arnoldi method. SIAM J. Matrix Anal. Appl. 13, 357–385 (1992)

  29. Stathopoulos, A.: Nearly optimal preconditioned methods for Hermitian eigenproblems under limited memory. Part I: seeking one eigenvalue. SIAM J. Sci. Comput. 29, 481–514 (2007)

  30. Stathopoulos, A., McCombs, J.R.: Nearly optimal preconditioned methods for Hermitian eigenproblems under limited memory. Part II: seeking many eigenvalues. SIAM J. Sci. Comput. 29, 2162–2188 (2007)

  31. Wu, K., Simon, H.: Thick-restarted Lanczos method for large symmetric eigenvalue problems. SIAM J. Matrix Anal. Appl. 22, 602–616 (2000)

  32. Yamazaki, I., Bai, Z., Simon, H.D., Wang, L.-W., Wu, K.: Adaptive projection subspace dimension for the thick-restart Lanczos method. ACM Trans. Math. Softw. 37, Article 27 (2010). doi:10.1145/1824801.1824805


Acknowledgments

The author is grateful to Masaaki Sugihara, Yusaku Yamamoto, and Yuji Nakatsukasa for their valuable comments and suggestions.

Author information

Correspondence to Kensuke Aishima.

Additional information

The author is supported by JSPS Grant-in-Aid for Young Scientists (Grant Number 25790096).


About this article


Cite this article

Aishima, K. Global convergence of the restarted Lanczos and Jacobi–Davidson methods for symmetric eigenvalue problems. Numer. Math. 131, 405–423 (2015). https://doi.org/10.1007/s00211-015-0699-4

