Abstract

We investigate the solution, by preconditioned iterative methods, of large linear systems of saddle point type whose (1,1) block is singular. We consider two parameterized block triangular preconditioners for use with Krylov subspace methods; they have the attractive property that the eigenvalue clustering of the preconditioned matrix improves as the (1,1) block of the saddle point matrix becomes increasingly ill conditioned, and we discuss the choice of the parameter. We analyze the spectral properties of both preconditioners and give a practical choice of the parameter. Numerical experiments that validate the analysis are presented.

1. Introduction

We study preconditioners for general nonsingular linear systems of saddle point type (1). Such systems arise in a large number of applications, for example, the (linearized) Navier-Stokes equations and other physical problems with conservation laws, as well as constrained optimization problems [15].

As such systems are typically large and sparse, solution by iterative methods has been studied extensively [1, 5-16]. Much attention has focused on the Navier-Stokes problem; see, for example, [1, 2, 5, 17, 18]. The techniques for solving systems like (1) are so numerous that it is almost impossible to give an overview. In addition to the methods developed specifically for Navier-Stokes problems, existing techniques also include splitting schemes [2, 6, 12, 19, 20], constraint preconditioning [2, 20, 21], Uzawa-type algorithms [2, 12, 22], and (preconditioned) Krylov subspace methods based on (approximations to) the Schur complement [2, 16, 23].

We start with augmentation block triangular preconditioners for the general system (1); see Section 2 for our assumptions. When the (1,1) block is nonsingular, results for the general system have been obtained before; for example, Murphy et al. [23, 24] proposed the block diagonal Schur complement preconditioner and the block triangular Schur complement preconditioner. When the Schur complement is defined, it has been shown (cf. [24]) that the corresponding preconditioned matrices are diagonalizable and have only three distinct eigenvalues and two distinct eigenvalues, respectively.
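In a commonly used notation (an illustrative reconstruction, since the displayed formulas are not reproduced in this version; the saddle point matrix is written with blocks A and B, and BA^{-1}B^T denotes the Schur complement), these two preconditioners and the corresponding spectra read
\[
\mathcal{P}_d=\begin{pmatrix}A & 0\\ 0 & BA^{-1}B^{T}\end{pmatrix},
\qquad
\mathcal{P}_t=\begin{pmatrix}A & B^{T}\\ 0 & BA^{-1}B^{T}\end{pmatrix},
\]
\[
\sigma\bigl(\mathcal{P}_d^{-1}\mathcal{A}\bigr)=\Bigl\{1,\ \tfrac{1+\sqrt{5}}{2},\ \tfrac{1-\sqrt{5}}{2}\Bigr\},
\qquad
\sigma\bigl(\mathcal{P}_t^{-1}\mathcal{A}\bigr)=\{1,\,-1\},
\]
where \(\mathcal{A}=\begin{pmatrix}A & B^{T}\\ B & 0\end{pmatrix}\); sign conventions for the (2,2) block of \(\mathcal{P}_t\) vary across the literature.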

However, when the (1,1) block is singular, it cannot be inverted and the Schur complement does not exist. For symmetric saddle point systems, that is, when the coefficient matrix in (1) is symmetric, one possible way of dealing with such systems is by augmentation, that is, by replacing the (1,1) block with its sum with an augmentation term built from a symmetric positive definite weight matrix [4, 7, 9, 14, 17, 18, 25-27]. Recently, for symmetric saddle point systems whose (1,1) block has high nullity, Greif and Schötzau [25, 26] studied an augmentation block diagonal preconditioner used with the MINRES solver. They showed that if the nullity of the (1,1) block attains its highest possible value, then the preconditioned matrix has only two distinct eigenvalues. Thus, a preconditioned minimal residual Krylov subspace iterative method such as MINRES converges within two iterations.
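As a reference sketch in the same notation (with W denoting the symmetric positive definite weight matrix of [25, 26]), the augmentation block diagonal preconditioner and the accompanying result read
\[
\mathcal{P}_{\mathrm{GS}}=\begin{pmatrix}A+B^{T}W^{-1}B & 0\\ 0 & W\end{pmatrix},
\qquad
\sigma\bigl(\mathcal{P}_{\mathrm{GS}}^{-1}\mathcal{A}\bigr)=\{1,\,-1\}
\ \ \text{when the nullity of }A\text{ is maximal},
\]
with 1 and -1 of multiplicities equal to the orders of the two diagonal blocks, respectively.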

Recently, Rees and Greif [4] presented a block triangular augmentation preconditioner and showed that, when the nullity of the (1,1) block satisfies a suitable condition, the preconditioned matrix has two groups of eigenvalues of prescribed multiplicities, while the remaining eigenvalues lie in bounded intervals. For general nonsymmetric saddle point problems with a nonsingular (1,1) block, some block structured preconditioning approaches are also available [3, 4, 18, 22, 28-33].

In this paper, we propose two new block triangular preconditioners, each depending on a scalar parameter and on a symmetric positive definite weight matrix.

The remainder of this paper is organized as follows. In Section 2, we discuss two block triangular preconditioners for solving nonsymmetric saddle point systems and study their algebraic and spectral properties. In Section 3, numerical experiments are provided to validate the analysis of Section 2. In Section 4, we draw some conclusions.

2. Block Triangular Preconditioners

We adopt the notation of (7) for the nonsymmetric saddle point matrix of (1). We assume that the (1,1) block, of order n, is symmetric and positive semidefinite with a given nullity and that the (2,1) block is of size m × n and has full row rank. Note that the assumption that the saddle point matrix is nonsingular places restrictions on these blocks (see Lemmas 1 and 2 below), which we use in our analysis.

We next recall two lemmas from [2, 27].

Lemma 1 (see [2, 27]). The nonsymmetric saddle point matrix (7) is nonsingular if and only if certain conditions, stated in terms of bases of the null spaces of the (1,1) block and the (2,1) block, are satisfied. In particular, the nonsingularity of the saddle point matrix implies that these two null spaces intersect only in the zero vector.

Lemma 2 (see [2, 27]). If the saddle point-type matrix in (7) is nonsingular, then the rank of its (1,1) block is at least n − m, and hence the nullity of the (1,1) block is at most m.

2.1. Block Triangular Preconditioners

We first consider the first of the two proposed preconditioners. It is easy to see that the eigenvalues of the preconditioned matrix satisfy a generalized eigenvalue problem. The second block row gives an expression for the second component of the eigenvector, and substituting it into the first block equation yields a reduced equation in the first component. Regardless of the choice of the parameters, one eigenvalue appears with a fixed algebraic multiplicity. From the nullity of the (1,1) block it follows that there are as many linearly independent null vectors of that block; for each such null vector we can find two eigenvalues satisfying the reduced equation, so each of these two eigenvalues has algebraic multiplicity equal to the nullity. The remaining eigenvalues satisfy (16). Therefore, we rewrite (16) in an equivalent form and thus arrive at (18). Since the saddle point matrix is assumed to be nonsingular, the associated matrix pencil is regular (cf. [27]). Expression (18) gives an explicit formula in terms of the generalized eigenvalues of (16) and can be used to identify the intervals in which the eigenvalues lie.
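The clustering effect of augmentation can be checked numerically on a small random example. The following sketch is a minimal illustration only: it uses the block diagonal augmentation preconditioner of [25, 26] rather than the block triangular preconditioner analyzed in this subsection, and the sizes and weight matrix are arbitrary choices. With the (1,1) block at maximal nullity, the computed spectrum collapses to two clusters.

import numpy as np

rng = np.random.default_rng(0)
n, m = 60, 20                       # order of the (1,1) block and number of constraints
r = m                               # nullity of the (1,1) block (maximal value, r = m)

G = rng.standard_normal((n - r, n))
A = G.T @ G                         # symmetric positive semidefinite with nullity r
B = rng.standard_normal((m, n))     # full row rank with probability one
W = np.eye(m)                       # weight matrix (identity, for simplicity)

# Saddle point matrix and augmentation block diagonal preconditioner.
K = np.block([[A, B.T], [B, np.zeros((m, m))]])
P = np.block([[A + B.T @ np.linalg.solve(W, B), np.zeros((n, m))],
              [np.zeros((m, n)), W]])

lam = np.linalg.eigvals(np.linalg.solve(P, K))
print(np.round(np.sort(lam.real), 6))   # eigenvalues cluster at -1 and 1

Reducing the nullity r below m spreads part of the spectrum away from the two clusters, which mirrors the role the nullity plays in the analysis above.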

To illustrate this, we consider a particular choice of the parameter, which corresponds to a specific setting of one block of the preconditioner. The preconditioned matrix then has one eigenvalue of fixed multiplicity and two further eigenvalues, each with multiplicity equal to the nullity of the (1,1) block. By (18), and since the (1,1) block is typically highly singular, many of the generalized eigenvalues are large, in which case the corresponding eigenvalues of the preconditioned matrix are bounded away from zero. Since the eigenvalues become unbounded as the parameter approaches its limiting value, we conclude that the parameter should be of moderate size.

2.2. Block Triangular Preconditioner

We next consider the second preconditioner, in which the parameter is a scalar, the weight matrix is symmetric positive definite, and the resulting preconditioner is nonsingular.

The following theorem describes the spectrum of the corresponding preconditioned matrix.

Theorem 3. Let the saddle point matrix (7) be nonsingular and let its (1,1) block be singular with a given nullity. Then the preconditioned matrix has one eigenvalue of fixed geometric multiplicity and a second eigenvalue whose geometric multiplicity equals that nullity. The remaining eigenvalues satisfy a relation determined by the generalized eigenvalues of the generalized eigenvalue problem (29) given in the proof.

Proof. Suppose that we are given an eigenvalue of the preconditioned matrix with associated eigenvector. Then the eigenpair satisfies the generalized eigenvalue problem, which can be written in the forms (26) or (27). As the preconditioner is nonsingular, the eigenvalue is nonzero. The second block equality expresses the second component of the eigenvector in terms of the first, and substituting it into the first block equality yields a reduced equation in the first component. It is straightforward to see that an appropriate family of vectors satisfies (26) with the first of the two special eigenvalues, and thus that value is an eigenvalue of the preconditioned matrix. We claim that this eigenvalue has the geometric multiplicity stated in the theorem.
In the first case, (26) yields a further relation; in the second case, (27) yields (28). From Lemma 1 and (28), and since the (1,1) block is singular with the given nullity, we conclude that the second eigenvalue identified above has geometric multiplicity equal to that nullity.
We have thus identified a first group of eigenvalues. We now consider the remaining eigenvalues of the preconditioned matrix.
Suppose now that the eigenvalue differs from the two values identified above. Then from (27) we obtain (29), which shows that the remaining eigenvalues are determined by the generalized eigenvalues of (29). Since the saddle point matrix is assumed to be nonsingular, the associated matrix pencil is regular (cf. [27]). Thus, the generalized eigenvalue problem (29) is well posed.
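Numerically, once the two matrices of the pencil in (29) have been assembled, the generalized eigenvalues can be obtained with a standard dense solver. The snippet below is only a schematic illustration with placeholder matrices M1 and M2 (the actual blocks of (29) are not reproduced here); the second matrix is kept well conditioned so that the pencil is regular, as assumed above.

import numpy as np
from scipy.linalg import eig

rng = np.random.default_rng(1)
M1 = rng.standard_normal((5, 5))                     # placeholder for the left-hand matrix of (29)
M2 = rng.standard_normal((5, 5)) + 5.0 * np.eye(5)   # placeholder, kept well conditioned (regular pencil)

mu, _ = eig(M1, M2)        # generalized eigenvalues mu of M1 v = mu M2 v
print(np.sort_complex(mu))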

For a particular choice of the parameter, we have the following corollary.

Corollary 4. Let the saddle point matrix (7) be nonsingular and let its (1,1) block be singular with a given nullity. Then the preconditioned matrix has one eigenvalue of fixed geometric multiplicity and a second eigenvalue whose geometric multiplicity equals that nullity. The remaining eigenvalues satisfy a relation determined by the generalized eigenvalues of an associated generalized eigenvalue problem.

From Theorem 3 and Corollary 4 we see that the higher the nullity of the (1,1) block, the more strongly clustered the eigenvalues of the preconditioned matrix are. When the nullity of the (1,1) block equals m, its largest possible value (cf. Lemma 2), we have the following result.

Corollary 5. Let the saddle point matrix (7) be nonsingular and let its (1,1) block be singular with nullity m. Then the preconditioned matrix is diagonalizable and has precisely two distinct eigenvalues, with the geometric multiplicities given by Theorem 3.

For another choice of the parameter, we have the following corollary.

Corollary 6. Let the saddle point matrix (7) be nonsingular and let its (1,1) block be singular with a given nullity. Then the preconditioned matrix has one eigenvalue whose geometric multiplicity is determined by that nullity. The remaining eigenvalues satisfy a relation determined by the generalized eigenvalues of an associated generalized eigenvalue problem.

From Theorem 3 and Corollary 6, if the nullity of the (1,1) block equals m, we have the following corollary.

Corollary 7. Let the saddle point matrix (7) be nonsingular and let its (1,1) block be singular with nullity m. Then the preconditioned matrix has precisely one distinct eigenvalue.

Remark 8. From Corollaries 5 and 7, if the nullity of the (1,1) block equals m, then the preconditioned matrix of Corollary 5 has precisely two distinct eigenvalues and the preconditioned matrix of Corollary 7 has precisely one distinct eigenvalue. Thus, any preconditioned Krylov subspace method such as GMRES terminates in at most two steps if roundoff errors are ignored.

3. Numerical Experiments

In this section we present numerical experiments to illustrate the performance of the two preconditioners when they are implemented exactly or approximately.

For our experiments, as in [19, 27], we construct the saddle point-type matrix by modifying a matrix of the form (36), whose blocks arise from a discretization of the Stokes problem on a bounded domain with boundary conditions, in which the componentwise Laplace operator acts on a vector-valued velocity and the constraint involves a scalar pressure (cf. [19]). The blocks of the discrete Stokes matrix are then replaced by random matrices with the same sparsity patterns, adjusted so that the relevant blocks are nonsingular and the resulting saddle point-type matrix in (36) is nonsingular. From the matrix in (36) we construct four saddle point-type test matrices (37), whose (1,1) blocks are obtained from the original (1,1) block by setting an increasing number of its rows and columns to zero. Note that each such (1,1) block is semipositive real and its nullity equals the number of rows and columns that were zeroed.
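The construction just described can be imitated with random data. The sketch below is an approximation under stated assumptions: the sizes, the random dense blocks, and the way a prescribed number of rows and columns of the (1,1) block are zeroed are illustrative choices, not the exact matrices of (36) and (37).

import numpy as np
import scipy.sparse as sp

def make_test_matrix(n, m, nullity, seed=0):
    """Saddle point test matrix whose (1,1) block has the prescribed nullity."""
    rng = np.random.default_rng(seed)
    G = rng.standard_normal((n, n))
    A = G.T @ G                                   # symmetric positive definite to begin with
    idx = rng.choice(n, size=nullity, replace=False)
    A[idx, :] = 0.0                               # zero out rows ...
    A[:, idx] = 0.0                               # ... and matching columns, creating the nullity
    B = rng.standard_normal((m, n))               # full row rank with probability one
    K = sp.bmat([[sp.csr_matrix(A), sp.csr_matrix(B.T)],
                 [sp.csr_matrix(B), None]], format="csr")
    return K, A, B

K, A, B = make_test_matrix(n=100, m=30, nullity=30)
print(K.shape, np.linalg.matrix_rank(A))          # (130, 130) and rank 100 - 30 = 70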

In the first numerical example, the weight matrix in the augmentation block preconditioners is fixed, whereas the positive parameter is chosen following [17].

Theorem 3 and Corollaries 4-7 guarantee increased spectral clustering of the preconditioned matrix as the nullity of the (1,1) block grows. In agreement with these results, Figures 1 and 2 show the spectra obtained with the two preconditioners; each figure contains four subfigures that plot the eigenvalues of the four preconditioned matrices resulting from the four test matrices in (37), each operated upon by exactly the same preconditioner.

From these figures we can clearly see that the higher the nullity of the (1,1) block, the more strongly the eigenvalues of the preconditioned matrices are clustered. Furthermore, Figures 1 and 2 show that one eigenvalue has very high multiplicity when the nullity of the (1,1) block is maximal.

Next, we study the iteration counts and iteration times for the two preconditioners. Table 1 reports results for GMRES with the block triangular preconditioners. From Table 1 we can see that the preconditioned GMRES is more efficient for an appropriate choice of the parameter, and that, for the nullity considered there, the iteration counts change only slightly as the parameter varies (see Tables 1 and 2).

We next use preconditioned GMRES to solve the four saddle point-type systems built from the matrices in (37), where the right-hand side is taken such that the exact solution is a prescribed vector. The stopping criterion is a fixed tolerance on the norm of the residual vector after each iteration. With a fixed value of the parameter, we give comparison results of the iteration counts and iteration times for five preconditioners (see Tables 2 and 3). From Table 2, for one value of the nullity of the (1,1) block, preconditioned GMRES with one of the two proposed preconditioners is more efficient than with the other, whereas from Table 3, for the other nullity value reported, the two preconditioners behave essentially the same in iteration time and iteration count.
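For completeness, a preconditioned GMRES run of the kind summarized in Tables 1-3 can be organized as in the sketch below. This is schematic only: the preconditioner matrix P stands for whichever (sparse) preconditioner is being tested, the exact solution is taken to be the vector of all ones purely as an example, and the relative residual tolerance 1e-6 is an assumed value, since the paper's exact stopping threshold is not reproduced here. Recent SciPy versions name the tolerance argument rtol; older versions use tol.

import numpy as np
import scipy.sparse.linalg as spla

def run_preconditioned_gmres(K, P, tol=1e-6, restart=None):
    """Solve K x = b by GMRES, applying P^{-1} through a sparse LU factorization."""
    ntot = K.shape[0]
    b = K @ np.ones(ntot)                          # right-hand side for a known exact solution
    lu = spla.splu(P.tocsc())                      # factor the preconditioner once
    M = spla.LinearOperator((ntot, ntot), matvec=lu.solve)
    iters = {"k": 0}
    def count(_):                                  # callback counts GMRES iterations
        iters["k"] += 1
    x, info = spla.gmres(K, b, M=M, rtol=tol, restart=restart, callback=count)
    return x, info, iters["k"]

# Example use with a test matrix K and a preconditioner P assembled elsewhere:
# x, info, k = run_preconditioned_gmres(K, P)
# print(info, k, np.linalg.norm(x - 1.0))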

All the numerical experiments were performed with MATLAB 7.0. The machine used is a PC-AMD with a T7400 2.2 GHz processor.

4. Conclusions

In this paper, we have analyzed the spectral properties and the computational performance of two block triangular augmentation preconditioners for saddle point-type matrices with a highly singular (1,1) block. The theoretical analysis shows that all eigenvalues of the preconditioned matrices are strongly clustered, and a good parameter choice may substantially reduce the iteration counts and iteration times. In particular, we have shown that when the (1,1) block has high nullity, convergence of each of the two preconditioned GMRES iterations is guaranteed to be almost immediate. Numerical experiments illustrating the efficiency of the presented preconditioners are also reported. From both theoretical and practical points of view, the presented preconditioners improve on previously available results.

Acknowledgments

The author wishes to thank the editor Professor Kamal Mostafa and the anonymous referees for their helpful suggestions which improved this paper. This research was supported by the National Natural Science Foundation of China (no. 11071079), the Ningbo Natural Science Foundation (2012A610037) and the Fundamental Research Funds for the Central Universities.