
A note on conjugate gradient convergence – Part III

Original article, published in Numerische Mathematik.

Summary. In this paper we again consider the rate of convergence of the conjugate gradient method. We start with a general analysis of the conjugate gradient method for uniformly bounded solution vectors and matrices whose eigenvalues are uniformly bounded and positive. We show that in such cases a fixed finite number of iterations of the method yields a fixed amount of improvement as the size of the matrix tends to infinity. Then we specialize to the finite element (or finite difference) scheme for the problem \(y''(x) = g_\beta(x),\; y(0) = y(1) = 0\). We show that for some classes of functions \(g_\beta\) the same effect occurs. For other functions we show that the gain made by performing a fixed number of iterations of the method tends to zero as the size of the matrix tends to infinity.
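The abstract does not spell out the discretization or the functions \(g_\beta\) studied in the paper, but the experiment it describes is easy to sketch. The following is a minimal illustration, not the authors' code: it assumes the standard second-order centered-difference scheme for \(y'' = g\), \(y(0) = y(1) = 0\), picks \(g(x) = e^x\) as an arbitrary illustrative right-hand side, and measures how much five conjugate gradient iterations reduce the residual as the matrix size \(n\) grows. The matrix \(\mathrm{tridiag}(-1, 2, -1)\) is symmetric positive definite with eigenvalues \(4\sin^2(k\pi h/2) \in (0, 4)\), uniformly bounded as in the paper's general setting.

```python
import numpy as np
from scipy.sparse import diags

def cg(A, b, iters):
    """Run a fixed number of plain conjugate gradient iterations
    from x0 = 0; return the final iterate and the residual norms."""
    x = np.zeros_like(b)
    r = b.copy()                     # r0 = b - A @ x0 = b
    p = r.copy()
    rs = r @ r
    norms = [np.sqrt(rs)]
    for _ in range(iters):
        Ap = A @ p
        alpha = rs / (p @ Ap)        # step length
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        p = r + (rs_new / rs) * p    # new search direction
        rs = rs_new
        norms.append(np.sqrt(rs))
    return x, norms

# Centered differences for y'' = g, y(0) = y(1) = 0, written as
# tridiag(-1, 2, -1) y = -h^2 g so the system matrix is SPD.
for n in (100, 1000, 10000):
    h = 1.0 / (n + 1)
    grid = np.linspace(h, 1.0 - h, n)
    A = diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csr")
    g = np.exp(grid)                 # illustrative choice, not the paper's g_beta
    _, norms = cg(A, -h**2 * g, iters=5)
    print(f"n = {n:6d}: ||r_5|| / ||r_0|| = {norms[-1] / norms[0]:.3e}")
```

Substituting members of the function classes analyzed in the paper for \(g\) would let one observe directly whether the five-iteration residual gain stabilizes or decays as \(n \to \infty\).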


Additional information

Received July 9, 1998 / Published online March 16, 2000

Cite this article

Engelberg, S., Naiman, A. A note on conjugate gradient convergence – Part III. Numer. Math. 85, 685–696 (2000). https://doi.org/10.1007/PL00005397
