Abstract

The performance of the multigrid method and the effect of different grid levels on the convergence rate are evaluated. The two-, three-, and four-level V-cycle multigrid methods with the Gauss-Seidel iterative solver are employed for this purpose. The numerical solution of the one-dimensional Laplace equation with Dirichlet boundary conditions is obtained using these methods. For the Laplace equation, a two-frequency function involving high- and low-frequency components is defined. It is observed that, although the GS method smooths out the high-frequency error components properly, the compact three-point stencil of the difference scheme for the Laplace equation means that, on fine grids, a very large number of iterations are needed to propagate the boundary conditions into the interior of the domain. Furthermore, the obtained results reveal that the number of iterations required for convergence is reduced considerably by employing the two-level multigrid algorithm. However, increasing the number of levels of the algorithm does not have any significant effect on the convergence rate in this study.

1. Introduction

The standard iterative methods like Jacobi and Gauss-Seidel (GS) rapidly damp out the local errors (high-frequency errors) of the solution, but they are extremely slow to remove the global errors (low-frequency errors) [1, 2]. In fact, these methods have a local stencil and may require a large number of iterations to converge. The multigrid method (MG) is one of the most efficient methods for solving linear and nonlinear systems, which can speed up the rate of damping out low-frequency errors. In this method, the high-frequency components of the solution error are damped by an iterative solver, or smoother, on a fine grid, whereas the low-frequency components are transferred to the coarser grid. On the coarser grid, these low-frequency error components appear as high-frequency ones, which are iteratively solved by a smoother. The typical application of the multigrid method is the numerical solution of elliptic partial differential equations [3]. The multigrid methods have also been used successfully for problems in image processing and vision [4].

In the past decades, many researchers including Fedorenko, Bakhvalov, and Brandt have studied and developed the multigrid methods [5-10]. The multigrid idea was first introduced by Fedorenko in 1962 and 1964 [5, 6] and then generalized by Bakhvalov in 1966 [7]. The multigrid algorithms were brought to practical applicability by Brandt in 1973 [8]. In 1977, Brandt [9] introduced the multilevel adaptive technique (MLAT) for the fast numerical solution of boundary value problems. He developed a distributive Gauss-Seidel (DGS) method as a smoother for solving the Navier-Stokes equations [10]. Jameson (1983 and 1985) developed multigrid schemes to efficiently solve hyperbolic equations [11, 12].

There are many recent studies that have employed the multigrid method to increase the convergence rate. For example, Vazquez et al. [13] presented a methodology, equipped with multigrid techniques, to speed up the convergence of a fully implicit solver for the RANS equations for incompressible flows. Bruneau and Mortazavi [14] investigated the flow around a square cylinder. They used a multigrid method with a Gauss-Seidel smoother for the solution of the problem. In their work, the set of grids was varied, based on the required accuracy, from the coarsest 20×5 uniform grid to the finest 640×160 or 1280×320 uniform grid. Liang et al. [15] investigated a p-multigrid method for solving spectral difference formulations of the scalar wave and Euler equations on unstructured grids. They used a lower-upper symmetric Gauss-Seidel (LU-SGS) method as an iterative smoother. They also employed an explicit Runge-Kutta method for comparison. Liang et al. [16] developed a two-dimensional high-order solver with a spectral difference scheme for the unsteady incompressible Navier-Stokes equations. They used a p-multigrid method to accelerate the convergence rate.

In the present work, in order to evaluate the capability of the multigrid method and the effect of different grid levels on the convergence rate, the two-, three-, and four-level V-cycle multigrid algorithms with Gauss-Seidel iterative method as a smoother are studied. A two-frequency function, with high and low frequencies, is defined for the Laplace equation. The convergence histories for different cases are compared and discussed.

2. Problem Statement

The one-dimensional Laplace equation and its discretized form (three-point discretization) can be expressed as in (1) and (2). Dirichlet boundary conditions are set at the boundaries (T(0) = 0 and T(50.3) = 0), and 504 nodes with spacing Δx = 0.1 are placed in the solution domain:

∂²T/∂x² = 0,  0 ≤ x ≤ 50.3,  (1)

T_i^{n+1} = (1/2)(T_{i-1}^{n+1} + T_{i+1}^n).  (2)

The initial distribution of the function T, involving high- and low-frequency components, is defined over the solution domain as presented in (3). According to this equation, two different frequencies exist: f1 ≈ 0.04 (period T1 = 25.15) and f2 ≈ 0.318 (period T2 = π). The function T is plotted in Figure 1, in which these two frequencies are distinguished.

At first, the Laplace equation is solved directly using the GS method. Then, it is solved using the GS method equipped with the two-, three-, and four-level multigrid algorithms:

T(x) = 0.01x + 0.05 sin(2x),  0 < x ≤ 4π,
T(x) = 0.08π - 0.01x - 0.05 sin(2x + π),  4π < x ≤ 8π,
T(x) = -0.08π + 0.01x + 0.05 sin(2x),  8π < x ≤ 12π,
T(x) = 0.16π - 0.01x + 0.05 sin(2x),  12π < x < 50.3.  (3)
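For illustration, the following Python sketch builds the 504-node grid, evaluates the two-frequency initial distribution (3), and applies one Gauss-Seidel sweep of the discretized equation (2). The function and variable names (initial_T, gs_sweep) are ours and purely illustrative; they do not correspond to any code from the original study.

    import numpy as np

    # Grid of 504 nodes with spacing dx = 0.1 on [0, 50.3]; the first and last
    # nodes carry the Dirichlet boundary values T(0) = T(50.3) = 0.
    dx = 0.1
    x = np.arange(504) * dx

    # Two-frequency initial distribution of equation (3): a low-frequency
    # triangular profile (period about 8*pi) plus a high-frequency component
    # 0.05*sin(2x) (period pi).
    def initial_T(x):
        T = np.where(x <= 4*np.pi, 0.01*x + 0.05*np.sin(2*x), 0.0)
        T = np.where((x > 4*np.pi) & (x <= 8*np.pi),
                     0.08*np.pi - 0.01*x - 0.05*np.sin(2*x + np.pi), T)
        T = np.where((x > 8*np.pi) & (x <= 12*np.pi),
                     -0.08*np.pi + 0.01*x + 0.05*np.sin(2*x), T)
        T = np.where(x > 12*np.pi, 0.16*np.pi - 0.01*x + 0.05*np.sin(2*x), T)
        return T

    T = initial_T(x)
    T[0], T[-1] = 0.0, 0.0

    # One Gauss-Seidel sweep of equation (2): T_i <- 0.5*(T_{i-1} + T_{i+1}),
    # where T_{i-1} has already been updated in the same sweep.
    def gs_sweep(T):
        for i in range(1, T.size - 1):
            T[i] = 0.5 * (T[i-1] + T[i+1])
        return T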

3. Four-Level Multigrid Method (MG4) with Gauss-Seidel Smoother (GS)

A four-level multigrid V-cycle and the data transfer between the different grid levels are shown schematically in Figure 2. Levels 1 and 4 represent the finest and coarsest grids, respectively. In the multigrid method, the coarse-grid mesh size is typically twice as large as that of the fine grid. In the current study, the distance between grid points (mesh size) is doubled at each level (Δx, 2Δx, 4Δx, and 8Δx). It should be noted that the coarser grids must be coarse enough to make the solution inexpensive compared to the previous grid level. In the following sections, the procedure of going from a finer to a coarser level (restriction) and of transferring corrections from a coarser to a finer level (prolongation) is explained in detail.
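As a brief sketch of this grid hierarchy (with illustrative names, and assuming for simplicity that each coarse grid is formed by taking every other node of the next finer grid), the four levels can be built as follows:

    import numpy as np

    dx = 0.1
    x_fine = np.arange(504) * dx                  # level 1 (finest), mesh size dx
    # Levels 1-4: the mesh size doubles at each level (dx, 2dx, 4dx, 8dx).
    levels = [x_fine[::2**k] for k in range(4)]

    for k, xg in enumerate(levels, start=1):
        print(f"level {k}: {xg.size} nodes, mesh size {xg[1] - xg[0]:.1f}")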

3.1. Smoothing Iterations on the Finest Grid and Transferring the Residual to the Coarser Grid

The Laplace equation is solved using the GS method (2). The number of relaxation sweeps on the finest grid is about 3 to 4; complete convergence at this stage is not necessary. The operator L is defined such that L T_i is the discretized form of the Laplace equation, as presented in (4). The residuals are obtained using (5), in which G1 denotes level 1, and are then transferred to the coarser grid (level 2). In order to transfer data to the coarser grids, the injection (restriction) operator is employed, which projects the data of the finer grid points onto the corresponding points of the coarser grid (see Figure 2(b)):

L T_i = (T_{i+1}^n - 2 T_i^n + T_{i-1}^n) / (Δx)²,  (4)

R_i|G1 = L T_i.  (5)
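A minimal sketch of this step is given below. The names smooth, residual, and restrict are ours, and injection is assumed to copy the values at the coincident coarse-grid points, consistent with Figure 2(b):

    import numpy as np

    def smooth(T, sweeps=3):
        # A few Gauss-Seidel relaxation sweeps of equation (2) on the current grid.
        for _ in range(sweeps):
            for i in range(1, T.size - 1):
                T[i] = 0.5 * (T[i-1] + T[i+1])
        return T

    def residual(T, dx):
        # Residual of equation (5): R_i = L T_i, with L the three-point operator (4).
        R = np.zeros_like(T)
        R[1:-1] = (T[2:] - 2.0*T[1:-1] + T[:-2]) / dx**2
        return R

    def restrict(R_fine):
        # Injection (restriction): keep the values at every other node, which
        # coincide with the coarse-grid points.
        return R_fine[::2].copy()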

3.2. Computing the Correction Term and New Residuals

The correction term can be calculated using (6), where G2 refers to level 2. This equation can be expanded to yield (7), in which the mesh size on level 2 is two times Δx. An initial guess of zero is used for τ, and, as before, 3 or 4 iterations of the solver are adequate at this stage:

L τ_i|G2 = -R_i|G2,  (6)

τ_i^{n+1}|G2 = 0.5 [ R_i|G2 (2Δx)² + τ_{i+1}^n|G2 + τ_{i-1}^{n+1}|G2 ].  (7)

The updated residuals are obtained as the sum of the old residuals and the correction term as follows:

R_i|G2,New = R_i|G2 + τ_i|G2.  (8)

Similarly, in levels 3 and 4, the new residuals and the correction terms can be calculated. It should be noted that the mesh size is 4Δx and 8Δx in levels 3 and 4, respectively.
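The coarse-level correction sweep of (7) and the residual update of (8) can be sketched as follows (illustrative names; a zero initial guess and zero boundary values are assumed for τ):

    import numpy as np

    def solve_correction(R_coarse, dx_coarse, sweeps=3):
        # Gauss-Seidel sweeps of equation (7) for the correction tau on a coarse
        # level, starting from a zero initial guess (tau stays zero on the boundaries).
        tau = np.zeros_like(R_coarse)
        for _ in range(sweeps):
            for i in range(1, tau.size - 1):
                tau[i] = 0.5 * (R_coarse[i] * dx_coarse**2 + tau[i+1] + tau[i-1])
        return tau

    def updated_residual(R_coarse, tau):
        # Updated residual of equation (8): old residual plus the correction term.
        return R_coarse + tau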

3.3. Transferring Corrections to the Finer Grid

By using linear interpolation, the correction term obtained in level 4 is transferred back to level 3 (τ_i|G3,Return) and added to the correction term calculated at this level in the restriction process (9). The obtained values (τ_i|G3,New) are employed as an initial guess for solving the Laplace equation at this level (10). In (10), R_i|G3,New is the updated residual obtained in the restriction process:

τ_i|G3,New = τ_i|G3 + τ_i|G3,Return,  (9)

L τ_i|G3 = -R_i|G3,New.  (10)

Similarly, the correction term can be transferred back to level 2. To return to level 1 (the finest grid), the correction term obtained in level 2 is transferred to level 1 (τ_i|G1,Return) and added to the function T calculated by (2), as follows:

T_i|New = T_i|Old + τ_i|G1,Return.  (11)

The convergence criterion is checked after updating the function T. If the desired convergence is not achieved, the entire procedure (Sections 3.1 to 3.3) must be repeated. It should be noted that the updated T from the last step (11) is used as the initial guess for the new cycle (2). In the two- and three-level multigrid algorithms, the prolongation process is started after reaching levels 2 and 3, respectively.
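Putting the pieces together, a two-level V-cycle built from the helper functions sketched in Sections 3.1 and 3.2 could look as follows. This is only a sketch under several assumptions of ours: the prolongation copies coincident points and linearly interpolates the midpoints, the maximum absolute residual is used as the convergence measure, and the index arithmetic assumes the coarse grid is simply every other fine node.

    import numpy as np

    def prolong(tau_coarse, n_fine):
        # Linear interpolation of a coarse-level correction back to the finer level.
        tau_fine = np.zeros(n_fine)
        tau_fine[::2] = tau_coarse                                    # coincident points
        tau_fine[1:-1:2] = 0.5 * (tau_fine[0:-2:2] + tau_fine[2::2])  # midpoints
        return tau_fine

    def two_level_vcycle(T, dx, tol=1e-3, max_cycles=100000):
        # Repeated two-level cycles: pre-smoothing, restriction, coarse correction,
        # prolongation, update of T (equation (11)), and convergence check.
        for cycle in range(max_cycles):
            T = smooth(T, sweeps=3)                    # smoothing on level 1
            R = residual(T, dx)                        # fine-grid residual, eq. (5)
            R2 = restrict(R)                           # injection to level 2
            tau2 = solve_correction(R2, 2.0*dx)        # correction on level 2, eq. (7)
            T += prolong(tau2, T.size)                 # prolongation and update, eq. (11)
            T[0], T[-1] = 0.0, 0.0                     # re-impose Dirichlet boundaries
            if np.max(np.abs(residual(T, dx))) < tol:  # convergence check
                return T, cycle + 1
        return T, max_cycles

Starting from the initial distribution of Section 2, calling two_level_vcycle(T, dx) repeats the cycle until the residual criterion is met.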

4. Numerical Results

Figures 3 and 4 show the smoothing of the function T using the Gauss-Seidel (GS) and multigrid (MG) methods after n iterations. Reaching the convergence criterion of 1×10⁻³ required 113704 iterations for the GS method and 5685 iterations for the MG method. As can be seen in Figure 3, for the Gauss-Seidel method the high-frequency error components are damped completely after 300 iterations, whereas 113704 iterations (about 380 times more) are needed to remove the low-frequency components. In other words, as soon as the high-frequency components are smoothed out, the convergence slows down. According to the obtained results, the number of iterations required to smooth out the global errors can be reduced to 5685 by applying the two-level multigrid method. The numbers of iterations required for convergence with the GS and MG methods are presented in Table 1.

The convergence histories of the GS and MG methods are plotted in Figures 5 and 6. As can be observed in Figure 5, for the GS method the residuals decrease very slowly after the initial iterations. In addition, the slope of the residual curve decreases gradually and approaches zero, indicating that the GS method is very time consuming, especially when an accurate solution is required. For the two-level MG method, by contrast, the residuals drop rapidly and with a relatively constant slope compared to the GS method. Furthermore, according to Figure 6, the trend of the residual variations is very similar for the two-, three-, and four-level MG methods. It appears that using the two-level MG method is adequate to speed up the convergence in this study.

5. Conclusion

In this study, the capability of the multigrid method to increase the convergence rate has been evaluated by using the two-, three-, and four-level V-cycle multigrid algorithms (MG2, MG3, and MG4, resp.) with the iterative Gauss-Seidel (GS) method as the smoother. The obtained results confirm that the MG method accelerates the convergence of the solution drastically. For example, the number of iterations required to reach the convergence criterion of 1×10⁻³ decreases from 113704 to 5685 by applying the MG2 method. It is found that the GS method alone is very time consuming when high accuracy is required. Furthermore, the comparison of the convergence histories of the MG2, MG3, and MG4 methods reveals that the MG2 method is adequate to increase the convergence speed for this problem. In fact, the choice of the coarsest grid level is problem dependent, and the frequency content of the errors plays an important role in it.