Next: 2.3.5 Conjugate Gradient Method
Up: 2.3 Iterative Methods
Previous: 2.3.3 Gauss-Seidel Relaxation (GSR)

2.3.4 Successive Over-Relaxation (SOR)
At each iteration step, compute the new vector $\vec{x}^{\,k+1}_{GSR}$ using GSR; then ``mix it'' with the previous vector $\vec{x}^{\,k}$:
$$\vec{x}^{\,k+1} = (1-\omega)\,\vec{x}^{\,k} + \omega\,\vec{x}^{\,k+1}_{GSR}$$
The ``relaxation parameter'' $\omega$ may be varied within the range $0 < \omega < 2$ to optimize the method.
The complete iteration formula, with the splitting $A = L + D + U$ into lower triangular, diagonal, and upper triangular parts, is
$$\vec{x}^{\,k+1} = (D + \omega L)^{-1} \left[ (1-\omega)\,D - \omega U \right] \vec{x}^{\,k} + \omega\,(D + \omega L)^{-1}\,\vec{b}$$
A single row in this system of equations reads
$$x_i^{k+1} = (1-\omega)\,x_i^{k} + \frac{\omega}{a_{ii}} \Bigl[\, b_i - \sum_{j<i} a_{ij}\,x_j^{k+1} - \sum_{j>i} a_{ij}\,x_j^{k} \,\Bigr]$$
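This row-wise update is straightforward to code. The sketch below implements one SOR sweep in Python; the sample matrix and right-hand side are made-up illustrative data, not the example data used elsewhere in the text:

```python
import numpy as np

def sor_sweep(A, b, x, omega):
    """One SOR sweep: for each component, form the Gauss-Seidel value
    (using already-updated components j < i) and mix it with the
    previous value via the relaxation parameter omega."""
    n = len(b)
    x = x.copy()
    for i in range(n):
        # sum over already-updated (j < i) and not-yet-updated (j > i) components
        sigma = A[i, :i] @ x[:i] + A[i, i+1:] @ x[i+1:]
        x_gs = (b[i] - sigma) / A[i, i]
        # mix the GSR value with the previous iterate
        x[i] = (1.0 - omega) * x[i] + omega * x_gs
    return x

# illustrative diagonally dominant system (not the text's example data)
A = np.array([[ 4.0, -1.0,  0.0],
              [-1.0,  4.0, -1.0],
              [ 0.0, -1.0,  4.0]])
b = np.array([2.0, 4.0, 10.0])
x = np.zeros(3)
for _ in range(50):
    x = sor_sweep(A, b, x, omega=1.1)
```

For this small well-conditioned system the iterate settles on the exact solution after a few dozen sweeps.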
The rate of convergence of this procedure is governed by the matrix
$$M_{SOR} \equiv (D + \omega L)^{-1} \left[ (1-\omega)\,D - \omega U \right]$$
The optimal value of $\omega$ is given by
$$\omega_{opt} = \frac{2}{1 + \sqrt{1 - \lambda_1^2}}$$
where $\lambda_1$ is the spectral radius of the Jacobi iteration matrix, yielding
$$\rho(M_{SOR}) = \omega_{opt} - 1$$
The asymptotic rate of convergence is
$$r = -\ln \left( \omega_{opt} - 1 \right)$$
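These relations are easy to evaluate numerically. The sketch below computes $\lambda_1$, $\omega_{opt}$, and $r$ for a small sample matrix (illustrative data, not the example of the text):

```python
import numpy as np

A = np.array([[ 4.0, -1.0,  0.0],
              [-1.0,  4.0, -1.0],
              [ 0.0, -1.0,  4.0]])

D = np.diag(np.diag(A))
J = np.eye(3) - np.linalg.inv(D) @ A          # Jacobi iteration matrix
lam1 = np.max(np.abs(np.linalg.eigvals(J)))   # its spectral radius, lambda_1

omega_opt = 2.0 / (1.0 + np.sqrt(1.0 - lam1**2))
rho = omega_opt - 1.0                         # spectral radius of M_SOR
rate = -np.log(rho)                           # asymptotic rate of convergence
```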
EXAMPLE:
With the same data as before we find the optimal relaxation parameter $\omega_{opt}$ and, from that, the asymptotic rate of convergence $r$. The first two iterations yield successive approximations to the solution vector.
Chebyshev Acceleration:
During the first few iteration steps the SOR procedure may give rise to overshooting corrections, particularly if $\omega$ is distinctly larger than $1$.
Adjust $\omega$ on the fly:
Start out with $\omega = 1$, then approach $\omega_{opt}$.
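One standard realization of this schedule is the Chebyshev recursion used, e.g., in Numerical Recipes; the specific recursion below is an assumption, since the text states only the starting value and the limit. Starting from $\omega = 1$, each update $\omega \leftarrow 1/(1 - \lambda_1^2\,\omega/4)$ drives $\omega$ toward $\omega_{opt}$:

```python
import numpy as np

def chebyshev_omegas(lam1, n_steps):
    """Chebyshev schedule for the SOR relaxation parameter
    (Numerical Recipes style; each omega is meant for one half-sweep).
    lam1 is the spectral radius of the Jacobi iteration matrix."""
    omegas = [1.0]                            # first half-sweep: omega = 1
    omega = 1.0 / (1.0 - lam1**2 / 2.0)       # second half-sweep
    omegas.append(omega)
    for _ in range(n_steps - 2):
        omega = 1.0 / (1.0 - lam1**2 * omega / 4.0)
        omegas.append(omega)
    return omegas

lam1 = 0.99                                   # assumed Jacobi spectral radius
ws = chebyshev_omegas(lam1, 200)
omega_opt = 2.0 / (1.0 + np.sqrt(1.0 - lam1**2))
```

The fixed point of the recursion is exactly $\omega_{opt} = 2/(1+\sqrt{1-\lambda_1^2})$, so the schedule avoids the large early corrections of a fixed $\omega > 1$ while retaining the optimal asymptotic rate.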
Franz J. Vesely Oct 2005
See also: "Computational Physics - An Introduction," Kluwer-Plenum 2001