# Projections and Conjugate Gradients

## G.W. Stewart (Maryland)

Conjugate gradient methods and their relatives (such as the minimum
residual method) are widely used to solve large sparse linear systems
and to regularize ill-posed problems. The usual approach to the
analysis of these methods is to recognize that the successive
approximate solutions are polynomials in the matrix of the system
operating on the initial residual vector. Since the approximations satisfy an
optimality condition, one can obtain upper bounds on the accuracy of
the approximations by selecting polynomials that are nearly optimal.
Unfortunately, the results are averaged over the spectrum of the
matrix, so that it is difficult to see how the method converges along
the individual eigenvectors.
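The polynomial structure mentioned above can be checked numerically. The following sketch (our own illustration in NumPy, not material from the talk) runs a textbook conjugate gradient iteration and verifies that the k-th iterate satisfies x_k − x_0 = p(A) r_0 for some polynomial p of degree less than k, i.e. that x_k − x_0 lies in the Krylov subspace spanned by r_0, A r_0, …, A^{k−1} r_0.

```python
import numpy as np

def conjugate_gradient(A, b, x0, iters):
    """Textbook CG for symmetric positive definite A (illustrative sketch)."""
    x = x0.copy()
    r = b - A @ x          # initial residual r_0
    p = r.copy()           # first search direction
    for _ in range(iters):
        Ap = A @ p
        alpha = (r @ r) / (p @ Ap)
        x = x + alpha * p
        r_new = r - alpha * Ap
        beta = (r_new @ r_new) / (r @ r)
        p = r_new + beta * p
        r = r_new
    return x

# A small symmetric positive definite test problem (arbitrary data).
rng = np.random.default_rng(0)
M = rng.standard_normal((6, 6))
A = M @ M.T + 6 * np.eye(6)
b = rng.standard_normal(6)
x0 = np.zeros(6)

k = 3
xk = conjugate_gradient(A, b, x0, k)

# Krylov property: x_k - x_0 should lie in span{r_0, A r_0, A^2 r_0}.
r0 = b - A @ x0
K = np.column_stack([np.linalg.matrix_power(A, j) @ r0 for j in range(k)])
coeffs, *_ = np.linalg.lstsq(K, xk - x0, rcond=None)
residual = np.linalg.norm(K @ coeffs - (xk - x0))
print(residual)  # near machine zero in exact-arithmetic theory
```

The least-squares fit recovers the polynomial coefficients p is built from; the tiny fitting residual confirms the iterate lives in the Krylov subspace.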

In this talk we describe a new approach. Here the approximate
solutions (in the coordinates of the eigensystem of the matrix) are
expressed in terms of orthogonal projections involving certain
modified Krylov subspaces. Convergence in a component is signaled by
the approach of the corresponding row of the projection to a row of
the identity matrix, and bounds can be given for the rate of
convergence. This componentwise approach may be better suited than
the usual techniques to analyzing the behavior of these methods on
ill-posed problems.
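The abstract does not define the "modified" Krylov subspaces it uses, so the following sketch is only a hypothetical illustration of the kind of quantity involved: working in the coordinates of the eigensystem (where the matrix is diagonal), it forms the orthogonal projector onto an ordinary Krylov subspace and watches how each row of the projector approaches the corresponding row of the identity. All names and choices below are ours, not Stewart's notation.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 6
eigvals = np.array([1.0, 2.0, 3.0, 10.0, 50.0, 100.0])  # assumed spectrum
Q = np.linalg.qr(rng.standard_normal((n, n)))[0]
A = Q @ np.diag(eigvals) @ Q.T                           # SPD by construction
b = rng.standard_normal(n)

# Pass to eigen-coordinates: A becomes diagonal, and the residual's
# components along the individual eigenvectors are the entries of c.
c = Q.T @ b

for k in range(1, n + 1):
    # Basis of the Krylov subspace K_k(D, c) in eigen-coordinates:
    # columns are c, D c, D^2 c, ..., D^{k-1} c with D = diag(eigvals).
    K = np.column_stack([(eigvals ** j) * c for j in range(k)])
    Qk, _ = np.linalg.qr(K)
    P = Qk @ Qk.T            # orthogonal projector onto K_k
    # Distance of each row of P from the corresponding row of the
    # identity; a small entry signals convergence in that eigencomponent.
    dev = np.linalg.norm(P - np.eye(n), axis=1)
    print(k, np.round(dev, 3))
```

When k reaches n (and the eigenvalues are distinct with no zero components in c), the subspace fills the whole space and the projector becomes the identity, so every row deviation vanishes; at intermediate k the per-row deviations give the componentwise picture the abstract describes.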