Abstract
The Lanczos method is well known for computing the extremal eigenvalues of symmetric matrices. For efficiency and robustness, a restart strategy is employed in practice, but this makes an analysis of convergence less straightforward. We prove global convergence of the restarted Lanczos method in exact arithmetic, using convergence properties of the Rayleigh–Ritz procedure that can be obtained from the discussion by Crouzeix, Philippe, and Sadkane. For the restarted Lanczos method, Sorensen’s earlier analysis establishes global convergence to the largest eigenvalues under the technical assumption that the absolute values of the off-diagonal elements of the Lanczos tridiagonal matrix remain larger than a positive constant throughout the iterations. In this paper, we prove global convergence without any such assumption: the only assumption is that the initial vector is not orthogonal to any of the target exact eigenvectors. More importantly, our results apply to restarting procedures in which the dimension of the projection subspace is determined dynamically; in other words, our analysis covers the recently proposed efficient restart strategies employed in the thick-restarted Lanczos method. The convergence theorem is extended to the restarted Lanczos method for computing both the largest and smallest eigenvalues. Moreover, we derive global convergence theorems for the block Lanczos and Jacobi–Davidson methods, where, for both algorithms, the Ritz values are shown to converge to exact eigenvalues, although not necessarily extremal ones.
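To fix ideas, the following is a minimal sketch (not the paper's algorithm) of an explicitly restarted Lanczos iteration in Python/NumPy: each cycle builds a Krylov basis, applies the Rayleigh–Ritz procedure to the tridiagonal matrix, and restarts from the top Ritz vector. The function names and the simple single-vector restart are illustrative stand-ins for the thick-restart strategies discussed in the abstract.

```python
import numpy as np

def lanczos(A, v0, m):
    """Run m Lanczos steps from unit vector v0.
    Returns an orthonormal basis V (n x m) and tridiagonal T = V.T @ A @ V."""
    n = v0.size
    V = np.zeros((n, m))
    alpha = np.zeros(m)       # diagonal of T
    beta = np.zeros(m)        # off-diagonal of T
    V[:, 0] = v0 / np.linalg.norm(v0)
    for j in range(m):
        w = A @ V[:, j]
        alpha[j] = V[:, j] @ w
        w -= alpha[j] * V[:, j]
        if j > 0:
            w -= beta[j - 1] * V[:, j - 1]
        # full reorthogonalization (exact-arithmetic behavior, numerically)
        w -= V[:, :j + 1] @ (V[:, :j + 1].T @ w)
        if j + 1 < m:
            beta[j] = np.linalg.norm(w)
            if beta[j] == 0.0:            # invariant subspace found: stop early
                V, alpha, beta = V[:, :j + 1], alpha[:j + 1], beta[:j]
                break
            V[:, j + 1] = w / beta[j]
    k = len(alpha)
    T = np.diag(alpha) + np.diag(beta[:k - 1], 1) + np.diag(beta[:k - 1], -1)
    return V, T

def restarted_lanczos(A, v0, m=10, restarts=20):
    """Explicitly restarted Lanczos for the largest eigenvalue:
    restart each cycle with the Ritz vector of the largest Ritz value."""
    v = v0 / np.linalg.norm(v0)
    for _ in range(restarts):
        V, T = lanczos(A, v, m)
        theta, S = np.linalg.eigh(T)   # Rayleigh-Ritz on the Krylov subspace
        v = V @ S[:, -1]               # top Ritz vector becomes the new start
    return theta[-1], v                # largest Ritz value and its Ritz vector
```

Under the abstract's assumption that the initial vector is not orthogonal to the target eigenvector, the top Ritz value here increases monotonically across restarts toward the largest eigenvalue.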
References
Baglama, J., Calvetti, D., Reichel, L.: Iterative methods for the computation of a few eigenvalues of a large symmetric matrix. BIT 36, 400–421 (1996)
Baglama, J., Calvetti, D., Reichel, L.: IRBL: an implicitly restarted block Lanczos method for large-scale Hermitian eigenproblems. SIAM J. Sci. Comput. 24, 1650–1677 (2003)
Bai, Z., Demmel, J., Dongarra, J., Ruhe, A., van der Vorst, H.: Templates for the Solution of Algebraic Eigenvalue Problems: A Practical Guide. SIAM, Philadelphia (2000)
Beattie, C., Embree, M., Rossi, J.: Convergence of restarted Krylov subspaces to invariant subspaces. SIAM J. Matrix Anal. Appl. 25, 1074–1109 (2004)
Beattie, C., Embree, M., Sorensen, D.C.: Convergence of polynomial restart Krylov methods for eigenvalue computations. SIAM Rev. 47, 492–515 (2005)
Calvetti, D., Reichel, L., Sorensen, D.C.: An implicitly restarted Lanczos method for large symmetric eigenvalue problems. Electron. Trans. Numer. Anal. 2, 1–21 (1994)
Crouzeix, M., Philippe, B., Sadkane, M.: The Davidson method. SIAM J. Sci. Comput. 15, 62–76 (1994)
Cullum, J.K.: The simultaneous computation of a few of the algebraically largest and smallest eigenvalues of a large, symmetric, sparse matrix. BIT 18, 265–275 (1978)
Cullum, J.K., Donath, W.E.: A block Lanczos algorithm for computing the \(q\) algebraically largest eigenvalues and a corresponding eigenspace for large, sparse symmetric matrices. In: Proceedings of the 1974 IEEE Conference on Decision and Control, pp. 505–509. IEEE Press, Piscataway, NJ (1974)
Davidson, E.R.: The iterative calculation of a few of the lowest eigenvalues and corresponding eigenvectors of large real-symmetric matrices. J. Comput. Phys. 17, 87–94 (1975)
Demmel, J.: Applied Numerical Linear Algebra. SIAM, Philadelphia (1997)
Golub, G.H., Underwood, R.: The block Lanczos method for computing eigenvalues. In: Rice, J. (ed.) Mathematical Software III, pp. 364–377. Academic Press, New York (1977)
Golub, G.H., Van Loan, C.F.: Matrix Computations, 4th edn. Johns Hopkins University Press, Baltimore (2013)
Hochstenbach, M.E., Sleijpen, G.L.G.: Two-sided and alternating Jacobi–Davidson. Linear Algebra Appl. 358, 145–172 (2003)
Horn, R.A., Johnson, C.R.: Matrix Analysis, 2nd edn. Cambridge University Press, Cambridge (2012)
Karush, W.: An iterative method for finding characteristic vectors of a symmetric matrix. Pac. J. Math. 1, 233–248 (1951)
Knyazev, A.V.: Toward the optimal preconditioned eigensolver: locally optimal block preconditioned conjugate gradient method. SIAM J. Sci. Comput. 23, 517–541 (2001)
Knyazev, A.V., Skorokhodov, A.L.: On exact estimates of the convergence rate of the steepest ascent method in the symmetric eigenvalue problem. Linear Algebra Appl. 154–156, 245–257 (1991)
Lanczos, C.: An iteration method for the solution of the eigenvalue problem of linear differential and integral operators. J. Res. Nat. Bur. Stand. 45, 255–282 (1950)
Li, R.C.: Sharpness in rates of convergence for the symmetric Lanczos method. Math. Comput. 79, 419–435 (2010)
Morgan, R.B., Scott, D.S.: Generalizations of Davidson’s method for computing eigenvalues of sparse symmetric matrices. SIAM J. Sci. Stat. Comput. 7, 817–825 (1986)
Notay, Y.: Is Jacobi–Davidson faster than Davidson? SIAM J. Matrix Anal. Appl. 26, 522–543 (2005)
Ovtchinnikov, E.: Convergence estimates for the generalized Davidson method for symmetric eigenvalue problems I: the preconditioning aspect. SIAM J. Numer. Anal. 41, 258–271 (2003)
Ovtchinnikov, E.: Convergence estimates for the generalized Davidson method for symmetric eigenvalue problems II: the subspace acceleration aspect. SIAM J. Numer. Anal. 41, 272–286 (2003)
Parlett, B.N.: The Symmetric Eigenvalue Problem. Prentice-Hall, Englewood Cliffs (1980)
Ruhe, A.: Implementation aspects of band Lanczos algorithms for computation of eigenvalues of large sparse symmetric matrices. Math. Comput. 33, 680–687 (1979)
Sleijpen, G.L.G., van der Vorst, H.A.: A Jacobi–Davidson iteration method for linear eigenvalue problems. SIAM J. Matrix Anal. Appl. 17, 401–425 (1996)
Sorensen, D.C.: Implicit application of polynomial filters in a \(k\)-step Arnoldi method. SIAM J. Matrix Anal. Appl. 13, 357–385 (1992)
Stathopoulos, A.: Nearly optimal preconditioned methods for Hermitian eigenproblems under limited memory. Part I: seeking one eigenvalue. SIAM J. Sci. Comput. 29, 481–514 (2007)
Stathopoulos, A., McCombs, J.R.: Nearly optimal preconditioned methods for Hermitian eigenproblems under limited memory. Part II: seeking many eigenvalues. SIAM J. Sci. Comput. 29, 2162–2188 (2007)
Wu, K., Simon, H.: Thick-restarted Lanczos method for large symmetric eigenvalue problems. SIAM J. Matrix Anal. Appl. 22, 602–616 (2000)
Yamazaki, I., Bai, Z., Simon, H.D., Wang, L.-W., Wu, K.: Adaptive projection subspace dimension for the thick-restart Lanczos method. ACM Trans. Math. Softw. 37, Article 27 (2010). doi:10.1145/1824801.1824805
Acknowledgments
The author is grateful to Masaaki Sugihara, Yusaku Yamamoto, and Yuji Nakatsukasa for their valuable comments and suggestions.
Additional information
The author is supported by JSPS Grant-in-Aid for Young Scientists (Grant Number 25790096).
Cite this article
Aishima, K. Global convergence of the restarted Lanczos and Jacobi–Davidson methods for symmetric eigenvalue problems. Numer. Math. 131, 405–423 (2015). https://doi.org/10.1007/s00211-015-0699-4