next up previous contents
Next: 4 Some General Theory Up: 1 Introduction Previous: 2 Gauss-Seidel iteration   Contents

3 Relaxation

Suppose a scalar $ s$ is determined by $ s=G(s)$ and we iterate $ x^{(k+1)}=G(x^{(k)})$. This can be written $ x^{(k+1)}=x^{(k)}+\delta$, where $ \delta =G(x^{(k)})-x^{(k)}$. In many problems, over a number of iterations, $ \delta$ is consistently either too large or too small, so it may be possible to accelerate convergence by taking a multiple of $ \delta$, so that

$\displaystyle x^{(k+1)}=x^{(k)}+\omega \delta = (1-\omega )x^{(k)}+\omega G(x^{(k)}).$ (139)
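The scalar update (139) can be sketched in a few lines of Python. The particular map here is a hypothetical example chosen for illustration: $ G(x)=0.9x+0.3$ has fixed point $ s=3$, and since $ G(x)-x=-0.1(x-3)$, each relaxed step multiplies the error by $ 1-0.1\omega$, so $ \omega >1$ shrinks it faster than plain iteration.

```python
def relax(G, x, omega, steps):
    """Relaxed fixed-point iteration x <- x + omega*(G(x) - x), as in (139)."""
    for _ in range(steps):
        x = x + omega * (G(x) - x)
    return x

# Hypothetical example: G has fixed point s = 3; plain iteration (omega = 1)
# shrinks the error by 0.9 per step, omega = 5 shrinks it by 1 - 0.1*5 = 0.5.
G = lambda x: 0.9 * x + 0.3
plain = relax(G, 0.0, omega=1.0, steps=20)
over = relax(G, 0.0, omega=5.0, steps=20)
```

After 20 steps the over-relaxed iterate is far closer to the fixed point than the plain one, matching the error factors above.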

This is the basis of relaxation: if $ \omega <1$ we speak of under-relaxation, and if $ \omega >1$ of over-relaxation. Applying this idea to the Jacobi iteration gives

$\displaystyle {\rm\bf x}^{(k+1)}=(1-\omega ){\rm\bf x}^{(k)}+\omega {\rm\bf J}{\rm\bf x}^{(k)}+\omega {\rm\bf D}^{-1}{\rm\bf f},$ (140)

so the iteration matrix would be

$\displaystyle {\mathcal J}(\omega )=(1-\omega ){\rm\bf I}+\omega {\rm\bf J}={\rm\bf I}-\omega{\rm\bf D}^{-1}{\rm\bf A}.$ (141)
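The relaxed Jacobi step (140) can be sketched directly, using $ {\rm\bf J}{\rm\bf x}={\rm\bf x}-{\rm\bf D}^{-1}{\rm\bf A}{\rm\bf x}$ from (141). The small tridiagonal system and the choice $ \omega =0.8$ are illustrative assumptions, not prescribed by the text.

```python
import numpy as np

# Hypothetical diagonally dominant test system A x = f.
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 4.0, 1.0],
              [0.0, 1.0, 4.0]])
f = np.array([1.0, 2.0, 3.0])

D_inv = 1.0 / np.diag(A)          # D^{-1} for a diagonal D
omega = 0.8                        # under-relaxation; arbitrary choice
x = np.zeros(3)
for _ in range(100):
    # x <- (1 - omega) x + omega (J x + D^{-1} f), with J x = x - D^{-1} A x.
    x = (1 - omega) * x + omega * (x - D_inv * (A @ x) + D_inv * f)
```

For this matrix the Jacobi spectral radius is well below 1, so the relaxed iterates converge to the solution of $ {\rm\bf A}{\rm\bf x}={\rm\bf f}$.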

It is more common to combine relaxation with a form of Gauss-Seidel. If we use

$\displaystyle {\rm\bf x}^{(k+1)}={\rm\bf L}{\rm\bf x}^{(k+1)}+{\rm\bf U}{\rm\bf x}^{(k)}+{\rm\bf D}^{-1}{\rm\bf f},$ (142)

to generate the increment $ \delta$ as in the scalar case, we obtain

$\displaystyle {\rm\bf x}^{(k+1)}={\rm\bf x}^{(k)}+\omega ({\rm\bf L}{\rm\bf x}^{(k+1)}+{\rm\bf U}{\rm\bf x}^{(k)}+{\rm\bf D}^{-1}{\rm\bf f}-{\rm\bf x}^{(k)})=(1-\omega ){\rm\bf x}^{(k)}+\omega ({\rm\bf L}{\rm\bf x}^{(k+1)}+{\rm\bf U}{\rm\bf x}^{(k)}+{\rm\bf D}^{-1}{\rm\bf f}),$ (143)

so that

$\displaystyle {\rm\bf x}^{(k+1)}=({\rm\bf I}-\omega {\rm\bf L})^{-1}[(1-\omega ){\rm\bf I}+\omega{\rm\bf U}]{\rm\bf x}^{(k)}+\omega ({\rm\bf I}-\omega {\rm\bf L})^{-1}{\rm\bf D}^{-1}{\rm\bf f},$ (144)

and the iteration matrix is

$\displaystyle {\mathcal H}(\omega )=({\rm\bf I}-\omega {\rm\bf L})^{-1}[(1-\omega ){\rm\bf I}+\omega{\rm\bf U}].$ (145)
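In practice the inverse in (144) is never formed: sweeping through the components in order and overwriting them in place means each update already uses the new values from the $ {\rm\bf L}$ part. A minimal sketch of this successive over-relaxation sweep, on the same kind of hypothetical test system as above:

```python
import numpy as np

def sor(A, f, omega, steps):
    """Relaxed Gauss-Seidel (SOR): components are updated in place, so
    entries with index j < i are already the new iterates."""
    n = len(f)
    x = np.zeros(n)
    for _ in range(steps):
        for i in range(n):
            # Off-diagonal sum; x[j] is new for j < i, old for j > i.
            sigma = A[i, :] @ x - A[i, i] * x[i]
            x[i] = (1 - omega) * x[i] + omega * (f[i] - sigma) / A[i, i]
    return x

# Hypothetical diagonally dominant test system.
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 4.0, 1.0],
              [0.0, 1.0, 4.0]])
f = np.array([1.0, 2.0, 3.0])
x = sor(A, f, omega=1.2, steps=50)
```

For symmetric positive definite $ {\rm\bf A}$ the sweep converges for any $ 0<\omega <2$; the value $ \omega =1.2$ here is an arbitrary illustrative choice.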


Last changed 2000-11-21