
3 Multigrid Iteration

Central to a great deal of fluid mechanics is the Poisson equation $ \nabla^2w=f$, and in two or more dimensions it is common to obtain approximate solutions by iterative methods. If we return to a 1-D model problem,

$\displaystyle w''=f,$ (195)

then for Jacobi iteration

$\displaystyle W_r^{(k+1)}={1\over 2}(W_{r+1}^{(k)}+W_{r-1}^{(k)})-{1\over 2}h^2f_r,$ (196)

so that the iteration matrix $ {\rm\bf J}$ satisfies

$\displaystyle ({\rm\bf J}{\rm\bf W})_r={1\over 2}(W_{r+1}+W_{r-1}),$ (197)

and if $ W_r=\sin \pi rkh,\ k=1,\ldots ,m-1$, then we see that the eigenvalues of $ {\rm\bf J}$ are just $ \cos \pi kh$ and the spectral radius is $ \cos\pi h\sim 1-\pi^2h^2/2$, so that as the mesh becomes finer, not only are there more calculations per iteration but the convergence rate also becomes poorer. Since the poor convergence is associated mainly with low-frequency errors, one way of accelerating convergence that has been developed is to carry out some iterations on a coarser mesh, where the convergence rate is better, and then to transfer the solution back to the fine mesh.
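
As a quick numerical check (not part of the original notes), the iteration matrix acting on the interior points $ r=1,\ldots ,m-1$ can be assembled explicitly and its eigenvalues compared with $ \cos \pi kh$. The following Python sketch does this with NumPy for an illustrative choice of $ m$.

import numpy as np

m = 32                          # number of mesh intervals, h = 1/m (illustrative choice)
h = 1.0 / m
n = m - 1                       # interior points r = 1, ..., m-1

# Jacobi iteration matrix (197): (J W)_r = (W_{r+1} + W_{r-1}) / 2
J = np.zeros((n, n))
for r in range(n - 1):
    J[r, r + 1] = 0.5
    J[r + 1, r] = 0.5

eigs = np.sort(np.linalg.eigvalsh(J))                   # computed eigenvalues
exact = np.sort(np.cos(np.pi * np.arange(1, m) * h))    # cos(pi k h), k = 1, ..., m-1

print(np.max(np.abs(eigs - exact)))              # agreement to rounding error
print(np.max(np.abs(eigs)), np.cos(np.pi * h))   # spectral radius = cos(pi h) ~ 1 - pi^2 h^2 / 2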

Suppose two vectors are defined: $ {\rm\bf W}_h$, which approximates the values $ ({\rm\bf W}_h)_r\sim w(rh),\ r=0,\ldots ,m$, and $ {\rm\bf W}_{2h}$, with $ ({\rm\bf W}_{2h})_s\sim w(2sh),\ s=0,\ldots ,m/2$ (we assume $ m$ is even). Next define two matrix operators, one which restricts values on the fine mesh to the coarse mesh, symbolically

$\displaystyle {\rm\bf W}_{2h}={\rm\bf R}_{h}^{2h}{\rm\bf W}_h,$ (198)

and an interpolation operator which takes values from the coarse mesh to the fine mesh, again symbolically

$\displaystyle {\rm\bf W}_h={\rm\bf I}_{2h}^h{\rm\bf W}_{2h}.$ (199)
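
The notes do not fix particular stencils for these operators; a common 1-D choice, assumed in the sketch below, is full weighting for the restriction and linear interpolation for the prolongation, with vectors holding the boundary points $ r=0,\ldots ,m$ and $ m$ even.

import numpy as np

def restrict(w_h):
    """Full-weighting restriction R_h^{2h}: coarse value s is
    (w[2s-1] + 2 w[2s] + w[2s+1]) / 4 at interior points; boundary values are copied."""
    m = len(w_h) - 1
    w_2h = np.zeros(m // 2 + 1)
    w_2h[0], w_2h[-1] = w_h[0], w_h[-1]
    w_2h[1:-1] = 0.25 * (w_h[1:-2:2] + 2.0 * w_h[2:-1:2] + w_h[3::2])
    return w_2h

def interpolate(w_2h):
    """Linear interpolation I_{2h}^h: coarse values are copied to the even fine
    points and the odd fine points are averages of their coarse neighbours."""
    mc = len(w_2h) - 1
    w_h = np.zeros(2 * mc + 1)
    w_h[::2] = w_2h
    w_h[1::2] = 0.5 * (w_2h[:-1] + w_2h[1:])
    return w_h

With these stencils the interpolation operator is, on the interior points, proportional to the transpose of the restriction operator, a property that is often exploited in the analysis of multigrid methods.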

Multigrid then carries out a number of iterations on the fine mesh, damping out high-frequency errors, transfers values to the coarse mesh and carries out further iterations on that mesh to damp out low-frequency errors, before interpolating values back to the original fine mesh. The overall sequence of iteration on the fine mesh, restriction, iteration on the coarse mesh, interpolation and then further iteration on the fine mesh represents one cycle of multigrid iteration. In practice a number of meshes are used, not just two. Symbolically, in the case of two meshes, if we carry out $ p$ Jacobi iterations on the fine mesh and $ q$ iterations on the coarse mesh,

$\displaystyle {\rm\bf W}_h^{(k+1)}={\rm\bf I}_{2h}^h({\rm\bf J}_{2h})^q{\rm\bf R}_h^{2h}({\rm\bf J}_h)^p{\rm\bf W}_h^{(k)}.$ (200)

A key result is that the spectral radius of the iteration matrix can be independent of $ h$. In practice the finest mesh is chosen with $ m=2^k$, and then it is possible to use up to $ k$ meshes, each with half the resolution of the previous one; a multigrid cycle is a sequence of (either Jacobi or Gauss-Seidel) iterations on each of the meshes in succession, both after restricting from fine to coarse and after interpolating from coarse to fine. It is also usual to work with residuals, so that if $ {\rm\bf W}_h$ is our current approximation and $ {\rm\bf W}_h+\delta{\rm\bf W}_h$ is a better approximation, then as

$\displaystyle {\rm\bf L}_h[{\rm\bf W}_h+\delta {\rm\bf W}_h]={\rm\bf f},$ (201)

we have

$\displaystyle {\rm\bf L}_h[\delta{\rm\bf W}_h]={\rm\bf f}-{\rm\bf L}_h{\rm\bf W}_h={\rm\bf r}_h,$ (202)

where $ {\rm\bf r}_h$ is the residual, so that we iterate on the coarse mesh to solve

$\displaystyle {\rm\bf L}_{2h}[\delta{\rm\bf W}_{2h}]={\rm\bf R}_h^{2h}{\rm\bf r}_h,$ (203)

and then

$\displaystyle {\rm\bf W}_h\leftarrow {\rm\bf W}_h+{\rm\bf I}_{2h}^h[\delta {\rm\bf W}_{2h}].$ (204)
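
Putting the pieces together, here is a minimal Python sketch of one two-grid cycle in the residual form (201)-(204) for the 1-D model problem: Jacobi sweeps on the fine mesh, restriction of the residual by full weighting, an approximate solution of (203) by further Jacobi sweeps on the coarse mesh, linear interpolation of the correction as in (204), and a final smoothing pass. The sweep counts and the particular restriction and interpolation stencils are illustrative choices, not prescriptions from the notes.

import numpy as np

def jacobi(w, f, h, sweeps):
    """Jacobi iteration (196) for w'' = f; the boundary values w[0], w[-1] are held fixed."""
    w = w.copy()
    for _ in range(sweeps):
        w[1:-1] = 0.5 * (w[2:] + w[:-2]) - 0.5 * h * h * f[1:-1]
    return w

def two_grid_cycle(w_h, f_h, h, p=3, q=20):
    """One cycle of (201)-(204): smooth, restrict the residual, relax the coarse
    correction equation, interpolate and apply the correction, then smooth again."""
    m = len(w_h) - 1

    # p smoothing sweeps on the fine mesh
    w_h = jacobi(w_h, f_h, h, p)

    # residual (202): r_h = f - L_h w_h, with (L_h w)_r = (w_{r+1} - 2 w_r + w_{r-1}) / h^2
    r_h = np.zeros(m + 1)
    r_h[1:-1] = f_h[1:-1] - (w_h[2:] - 2.0 * w_h[1:-1] + w_h[:-2]) / h**2

    # restrict the residual (full weighting) and relax (203) on the coarse mesh
    r_2h = np.zeros(m // 2 + 1)
    r_2h[1:-1] = 0.25 * (r_h[1:-2:2] + 2.0 * r_h[2:-1:2] + r_h[3::2])
    dw_2h = jacobi(np.zeros(m // 2 + 1), r_2h, 2.0 * h, q)

    # interpolate the correction, apply it as in (204), and smooth once more
    dw_h = np.zeros(m + 1)
    dw_h[::2] = dw_2h
    dw_h[1::2] = 0.5 * (dw_2h[:-1] + dw_2h[1:])
    return jacobi(w_h + dw_h, f_h, h, p)

# illustrative use: a few cycles for w'' = f with w(0) = w(1) = 0
m = 64
h = 1.0 / m
x = np.linspace(0.0, 1.0, m + 1)
f = -np.pi**2 * np.sin(np.pi * x)
w = np.zeros(m + 1)
for _ in range(5):
    w = two_grid_cycle(w, f, h)

In a genuine multigrid method the coarse equation (203) would not be relaxed in isolation as it is here; with $ m=2^k$ it is itself treated by restricting to still coarser meshes, as described above.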

