Next: 2.3.5 Conjugate Gradient Method Up: 2.3 Iterative Methods Previous: 2.3.3 Gauss-Seidel Relaxation (GSR)

## 2.3.4 Successive Over-Relaxation (SOR)

At each iteration step, compute the new vector $\tilde{\mathbf{x}}^{(k+1)}$ using GSR; then "mix" it with the previous vector $\mathbf{x}^{(k)}$:

$$\mathbf{x}^{(k+1)} = \omega\,\tilde{\mathbf{x}}^{(k+1)} + (1-\omega)\,\mathbf{x}^{(k)}$$

The "relaxation parameter" $\omega$ may be varied within the range $0 < \omega < 2$ to optimize the method.

The complete iteration formula is

$$\mathbf{x}^{(k+1)} = (\mathbf{D} + \omega\mathbf{L})^{-1}\left[\omega\,\mathbf{b} + \left((1-\omega)\,\mathbf{D} - \omega\,\mathbf{U}\right)\mathbf{x}^{(k)}\right]$$

with the splitting $\mathbf{A} = \mathbf{L} + \mathbf{D} + \mathbf{U}$ into strictly lower triangular, diagonal, and strictly upper triangular parts.

A single row in this system of equations reads

$$x_i^{(k+1)} = (1-\omega)\,x_i^{(k)} + \frac{\omega}{a_{ii}}\left[\,b_i - \sum_{j<i} a_{ij}\,x_j^{(k+1)} - \sum_{j>i} a_{ij}\,x_j^{(k)}\right]$$

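As an illustration, the row-by-row update can be sketched in plain Python (a minimal sketch; the names `sor_sweep` and `sor_solve` are ours, not from the notes):

```python
def sor_sweep(A, b, x, w):
    """One SOR sweep: for each row i, mix the GSR update into x[i].

    Implements x_i <- (1-w)*x_i + (w/a_ii)*(b_i - sum_{j!=i} a_ij*x_j),
    where x[j] for j < i has already been updated in place (GSR style).
    """
    n = len(b)
    for i in range(n):
        s = sum(A[i][j] * x[j] for j in range(n) if j != i)
        x[i] = (1.0 - w) * x[i] + w * (b[i] - s) / A[i][i]
    return x

def sor_solve(A, b, w=1.2, tol=1e-10, max_iter=1000):
    """Repeat SOR sweeps until the largest correction falls below tol."""
    x = [0.0] * len(b)
    for _ in range(max_iter):
        old = x[:]
        x = sor_sweep(A, b, x, w)
        if max(abs(xi - oi) for xi, oi in zip(x, old)) < tol:
            break
    return x
```

For a small symmetric positive-definite test system such as $A = \begin{pmatrix}4&1\\1&3\end{pmatrix}$, $b = (1,2)^T$, the iteration converges for any $0 < \omega < 2$.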
The rate of convergence of this procedure is governed by the matrix

$$\mathbf{B}_{SOR} \equiv (\mathbf{D} + \omega\mathbf{L})^{-1}\left[(1-\omega)\,\mathbf{D} - \omega\,\mathbf{U}\right]$$

The optimal value of $\omega$ is given by

$$\omega_{\rm opt} = \frac{2}{1 + \sqrt{1 - \lambda_J^2}}\,,$$

where $\lambda_J$ is the spectral radius of the Jacobi iteration matrix,

yielding

$$\rho(\mathbf{B}_{SOR}) = \omega_{\rm opt} - 1\,.$$

The asymptotic rate of convergence is

$$r = -\ln\left(\omega_{\rm opt} - 1\right)\,.$$

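To make the formula concrete, here is a small Python sketch (our own illustration, not from the notes) that estimates $\lambda_J$ for a given matrix by repeated application of $\mathbf{B}_J = -\mathbf{D}^{-1}(\mathbf{L}+\mathbf{U})$ and then evaluates $\omega_{\rm opt}$:

```python
import math

def jacobi_spectral_radius(A, k=60):
    """Rough estimate of the spectral radius of B_J = -D^{-1}(L+U).

    The eigenvalues of B_J typically come in +/- pairs, so plain power
    iteration oscillates; taking the geometric mean of the growth
    factors over k applications still converges to the spectral radius.
    """
    n = len(A)
    v = [1.0] * n
    log_growth = 0.0
    for _ in range(k):
        # Apply B_J: one Jacobi step with b = 0.
        v = [-sum(A[i][j] * v[j] for j in range(n) if j != i) / A[i][i]
             for i in range(n)]
        nrm = max(abs(c) for c in v)
        if nrm == 0.0:
            return 0.0
        log_growth += math.log(nrm)
        v = [c / nrm for c in v]            # renormalize to avoid under/overflow
    return math.exp(log_growth / k)

def omega_opt(rho_j):
    """Optimal SOR parameter from the Jacobi spectral radius rho_j."""
    return 2.0 / (1.0 + math.sqrt(1.0 - rho_j ** 2))
```

For the test matrix $A = \begin{pmatrix}4&1\\1&3\end{pmatrix}$ the Jacobi matrix has eigenvalues $\pm 1/\sqrt{12}$, so the estimate should approach $\lambda_J \approx 0.2887$ and $\omega_{\rm opt} \approx 1.022$.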
EXAMPLE: With the same data as before we find an optimal relaxation parameter $\omega_{\rm opt}$ and, from that, the spectral radius $\omega_{\rm opt} - 1$. The first two iterations yield

Chebyshev Acceleration:

During the first few iterative steps the SOR procedure may give rise to overshooting corrections, particularly if $\omega$ is distinctly larger than 1. The remedy is to adjust $\omega$ on the fly: start out with $\omega = 1$, then approach $\omega_{\rm opt}$.
• Split the solution vector into even and odd elements, $\mathbf{x}_e$ and $\mathbf{x}_o$; do the same with $\mathbf{b}$.
• The two subvectors $\mathbf{x}_e$ and $\mathbf{x}_o$ are iterated in alternating succession, with the relaxation parameter being adjusted according to

$$\omega^{(0)} = 1\,,\qquad \omega^{(1/2)} = \frac{1}{1 - \lambda_J^2/2}\,,\qquad \omega^{(t+1/2)} = \frac{1}{1 - \lambda_J^2\,\omega^{(t)}/4}\,,\qquad \omega^{(t)} \rightarrow \omega_{\rm opt}\,.$$

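The $\omega$ schedule of the standard Chebyshev acceleration recipe (starting at $\omega = 1$, i.e. plain Gauss-Seidel, and tending to $\omega_{\rm opt}$) can be generated as follows; the function name is ours, and `rho_j` stands for the Jacobi spectral radius $\lambda_J$:

```python
def chebyshev_omegas(rho_j, half_sweeps):
    """Relaxation parameter used at each half-sweep of odd-even SOR.

    Standard Chebyshev schedule: omega^(0) = 1,
    omega^(1/2) = 1/(1 - rho_j^2/2),
    omega^(t+1/2) = 1/(1 - rho_j^2 * omega^(t)/4),
    which tends to omega_opt = 2/(1 + sqrt(1 - rho_j^2)).
    """
    omegas = [1.0]                                      # omega^(0): plain GSR
    if half_sweeps > 1:
        omegas.append(1.0 / (1.0 - 0.5 * rho_j ** 2))   # omega^(1/2)
    while len(omegas) < half_sweeps:
        omegas.append(1.0 / (1.0 - 0.25 * rho_j ** 2 * omegas[-1]))
    return omegas
```

The fixed point of the recursion is exactly $\omega_{\rm opt}$: solving $\omega = 1/(1 - \lambda_J^2\,\omega/4)$ gives $\omega = 2\,(1 - \sqrt{1-\lambda_J^2})/\lambda_J^2 = 2/(1 + \sqrt{1-\lambda_J^2})$.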
Franz J. Vesely Oct 2005