It is well known that complex arithmetic operations usually cost about four times as many flops as real ones. In this paper, using real operations instead of complex ones, we give a real fast structure-preserving algorithm for the eigenproblem of complex Hermitian matrices. We exploit the real symmetric and skew-Hamiltonian structure obtained by Wilkinson's transformation, focus on symplectic orthogonal similarity transformations and their structure-preserving property, and reduce the transformed matrix to a two-by-two block tridiagonal symmetric matrix. From this, a real algorithm for the eigenvalue problem of the original Hermitian matrix is quickly obtained. Numerical experiments show that the fast algorithm solves complex Hermitian eigenproblems efficiently, stably, and with high precision.

1. Introduction

We consider eigenvalue problems of the form H x = λ x, where λ is a real scalar and H is a complex Hermitian matrix of the form H = B + iC. Here B is a real symmetric matrix, that is, B^T = B, and C is a real skew-symmetric nonzero matrix, that is, C^T = -C. The eigenvector x can be written as x = y + iz, where y and z are real vectors.

Many properties of complex Hermitian matrices are shared by real symmetric matrices; therefore, relatively little work has focused on complex Hermitian matrices. However, from the viewpoint of numerical computation, the biggest difference between these two types of matrices is that a complex Hermitian matrix involves complex operations, while a real symmetric matrix does not. It is well known that a complex Hermitian matrix is unitarily similar to a real diagonal matrix; that is, it has real eigenvalues but complex eigenvectors. Because complex operations usually cost four times as many flops as real ones, reducing the computational cost of the eigenvalue problem of complex Hermitian matrices is meaningful and worthy of research.

In 1965, the famous numerical analyst Wilkinson [1] gave a way to transform a complex Hermitian eigenproblem into a real one. To be more precise, if (λ, y + iz) is an eigenpair of H = B + iC, then

    (B + iC)(y + iz) = λ(y + iz).

Since the eigenvalues of H are real, separating real and imaginary parts, the previous equation can be written as

    By - Cz = λy,    Cy + Bz = λz.

So the eigenproblem (1) is transformed into the eigenproblem of the real symmetric matrix

    M = [ B  -C ],      M [ y ] = λ [ y ].    (6)
        [ C   B ]         [ z ]       [ z ]

Note that the matrix M in (6) is not only real symmetric but also skew-Hamiltonian. Therefore, in the following discussion, inspired by the algorithms for the eigenproblem of Hamiltonian (skew-Hamiltonian) matrices [2-11], we devise an algorithm that takes full advantage of the special structure of (6).
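As a concrete check of Wilkinson's transformation, the following small Python sketch (pure standard library; the 2-by-2 Hermitian matrix H and its eigenpair below are our own illustrative example, not from the paper) builds M = [B, -C; C, B] from H = B + iC and verifies that M is real symmetric and that an eigenpair of H yields a real eigenpair of M:

```python
# Wilkinson's embedding: H = B + iC (Hermitian) -> M = [[B, -C], [C, B]] (real symmetric).
# Illustrative example: H = [[2, 1-1j], [1+1j, 3]] has eigenpair
# lambda = 4, x = (1 - 1j, 2) = y + iz with y = (1, 2), z = (-1, 0).

H = [[2 + 0j, 1 - 1j],
     [1 + 1j, 3 + 0j]]
n = len(H)

B = [[H[i][j].real for j in range(n)] for i in range(n)]   # real symmetric part
C = [[H[i][j].imag for j in range(n)] for i in range(n)]   # real skew-symmetric part

# Assemble M = [[B, -C], [C, B]]
M = [B[i] + [-c for c in C[i]] for i in range(n)] + \
    [C[i] + B[i] for i in range(n)]

# M is real symmetric
assert all(M[i][j] == M[j][i] for i in range(2 * n) for j in range(2 * n))

lam, y, z = 4.0, [1.0, 2.0], [-1.0, 0.0]
u = y + z                                    # stacked real eigenvector [y; z]
Mu = [sum(M[i][j] * u[j] for j in range(2 * n)) for i in range(2 * n)]
assert all(abs(Mu[i] - lam * u[i]) < 1e-12 for i in range(2 * n))

# Preview of Lemma 6: stacking [-z; y] gives a second eigenvector for the same eigenvalue
v = [-t for t in z] + y
Mv = [sum(M[i][j] * v[j] for j in range(2 * n)) for i in range(2 * n)]
assert all(abs(Mv[i] - lam * v[i]) < 1e-12 for i in range(2 * n))
print("Wilkinson embedding verified")
```

The two assertions at the end confirm that every eigenvalue of H appears (twice) as an eigenvalue of M, with real eigenvectors built from the real and imaginary parts of the complex eigenvector.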

In this paper we first prove that orthogonal symplectic similarity transformations preserve not only the symmetric and skew-Hamiltonian structure of (6) but also the special structure of its eigenvectors; that is, the eigenvector matrix of (6) is itself orthogonal symplectic. Therefore, by these transformations, a real, stable, accurate, and fast method is devised to reduce (6) to a block symmetric tridiagonal matrix, from which we obtain the eigenproblem of the original Hermitian matrix. When implementing the method, although the dimension of the transformed matrix is double that of the original Hermitian matrix, we can make full use of its special structure to avoid superfluous computation and storage.

The remainder of the paper is organized as follows. In Section 2, we present the basic orthogonal symplectic matrix and its algorithm. In Section 3, we present the overall real algorithm for the eigenproblem of a complex Hermitian matrix. Some implementation aspects are discussed in Section 4. Finally, in Section 5, the results of numerical experiments are presented to conclude the paper.

2. Basic Orthogonal Symplectic Matrix and Its Algorithm

It is well known that, by choosing a proper unit vector, a Householder transformation can zero selected entries of a given vector; likewise, we define the Householder symplectic transformation, which has the following form.

Definition 1. A real 2n-by-2n matrix S is a symplectic matrix if S^T J S = J, where S^T is the transpose of S and J is the orthogonal matrix

    J = [ 0     I_n ]
        [ -I_n   0  ].

Here I_n is the n-by-n identity matrix and 0 is the n-by-n zero matrix.

Definition 2. Letting v be a real n-vector, a 2n-by-2n matrix is called a Householder symplectic matrix if it has the form H(v) = diag(P, P), where P = I_n - 2vv^T/(v^T v). Obviously, P is a Householder transformation of order n.

Note that a Householder symplectic matrix is just a direct sum of two "ordinary" n-by-n Householder matrices. So, based on the algorithm for the "ordinary" Householder transformation, we obtain the following algorithm for the Householder symplectic transformation of order 2n.

Algorithm 3 (Householder symplectic transformation [8]). Given and , the following algorithm determines , such that if then : end.

is completely determined by to components of vector ; it makes satisfy for . Note that by interchanging the roles of and , the previous algorithm can be used to determine such that for .

A Householder symplectic transformation can be used to zero large portions of a vector, just as an ordinary Householder transformation can; likewise, we can define the Givens symplectic transformation, which can be used to zero single entries.

Definition 4. Given , a matrix is called Givens symplectic matrix, if it has the form where
is an “ordinary” 2-by-2 Givens rotation that rotates in planes and . The corresponding algorithm is as follows.

Algorithm 5 (Givens symplectic transformation [8]). Given and , the following algorithm determines , such that if then :
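The 2-by-2 rotation at the core of Algorithm 5 can be sketched as follows (a standard Givens computation in Python; in the symplectic version the rotation acts on components k and n+k of a 2n-vector). The function name and the guard for b = 0 are our own:

```python
import math

def givens(a, b):
    """Return (c, s) with [[c, s], [-s, c]] @ [a, b]^T = [r, 0]^T, r = hypot(a, b)."""
    if b == 0.0:
        return 1.0, 0.0
    r = math.hypot(a, b)
    return a / r, b / r

a, b = 3.0, 4.0
c, s = givens(a, b)
rotated = (c * a + s * b, -s * a + c * b)
print(rotated)  # approximately (5.0, 0.0)
```

Unlike a Householder reflection, this zeroes a single targeted entry, which is exactly what the reduction in Section 3 needs to clean up the stray positions left by the Householder symplectic steps.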

3. The Reduction of the Symmetric and Skew-Hamiltonian Matrix

Now we consider some properties of the symmetric and skew-Hamiltonian matrix in (6).

Lemma 6. If (λ, (y^T, z^T)^T) is an eigenpair of the matrix in (6), then (λ, (-z^T, y^T)^T) is also an eigenpair.

Proof. If then we have

According to Lemma 6, there are two different eigenvectors corresponding to the same eigenvalue, so it is easy to prove that where is the set of eigenvalues of (6) and is the matrix of eigenvectors of (6). Note that the matrix in (20) is symplectic orthogonal. The following lemma tells us that the product of symplectic orthogonal matrices is also symplectic orthogonal.

Lemma 7. Suppose that -by- real matrices of the form are both orthogonal and symplectic; then their product is also both orthogonal and symplectic.

Lemma 7 shows that the special structure of the eigenvectors of (6) is preserved by symplectic orthogonal similarity transformations. The following theorem is the foundation of the fast and stable structure-preserving algorithm developed below.
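A quick numerical illustration of Lemma 7 (a Python sketch with small hand-picked matrices of our own): the product of an orthogonal symplectic Householder block diag(P, P) and an orthogonal symplectic Givens rotation remains orthogonal, symplectic, and of the block form [Q1, Q2; -Q2, Q1]:

```python
def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def transpose(A):
    return [list(col) for col in zip(*A)]

# diag(P, P) with P a 2x2 Householder reflection (symmetric, orthogonal, det = -1)
P = [[0.6, 0.8], [0.8, -0.6]]
H = [[P[0][0], P[0][1], 0, 0],
     [P[1][0], P[1][1], 0, 0],
     [0, 0, P[0][0], P[0][1]],
     [0, 0, P[1][0], P[1][1]]]

# Givens symplectic rotation in the (1, 3) plane, c = 3/5, s = 4/5
G = [[0.6, 0, 0.8, 0],
     [0, 1, 0, 0],
     [-0.8, 0, 0.6, 0],
     [0, 0, 0, 1]]

J = [[0, 0, 1, 0], [0, 0, 0, 1], [-1, 0, 0, 0], [0, -1, 0, 0]]
W = matmul(H, G)

def close(A, B, tol=1e-12):
    return all(abs(A[i][j] - B[i][j]) < tol for i in range(4) for j in range(4))

I4 = [[float(i == j) for j in range(4)] for i in range(4)]
assert close(matmul(transpose(W), W), I4)               # W is orthogonal
assert close(matmul(matmul(transpose(W), J), W), J)     # W is symplectic: W^T J W = J
# the block structure [Q1, Q2; -Q2, Q1] is preserved by the product
assert all(abs(W[i][j] - W[i + 2][j + 2]) < 1e-12 and
           abs(W[i][j + 2] + W[i + 2][j]) < 1e-12
           for i in range(2) for j in range(2))
print("product is orthogonal, symplectic, and structure-preserving")
```

This closure property is what allows the reduction below to accumulate many Householder and Givens symplectic factors while the overall transformation stays orthogonal symplectic.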

Theorem 8. Suppose , and is an orthogonal symplectic matrix. If satisfy then

Proof. It is easy to obtain Then
Moreover , hence we have

The previous theorem shows that orthogonal symplectic similarity transformations preserve the symmetric and skew-Hamiltonian structure of the matrix in (6).

This structure-preserving property provides a way to transform the matrix into a special skew-Hamiltonian matrix diag(T, T), where T is a real symmetric tridiagonal matrix. So we have the following result, which provides a theoretical basis for the real fast structure-preserving algorithm for the eigenproblem of a Hermitian matrix.

Theorem 9. Let M be the symmetric and skew-Hamiltonian matrix in (6). Then there exists an orthogonal symplectic matrix Q such that Q^T M Q = diag(T, T), where T is a symmetric tridiagonal matrix. Suppose that T has the eigenvalue decomposition T = U Λ U^T. Then the eigenvalues of M are the diagonal entries of Λ, each of multiplicity two, and the eigenvectors of M are the columns of Q diag(U, U).
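In LaTeX form (with our notation: M for the matrix in (6), Q for the orthogonal symplectic transformation, and T = UΛU^T the spectral decomposition of the tridiagonal block), the factorization asserted by Theorem 9 reads:

```latex
Q^{\mathsf T} M Q =
\begin{bmatrix} T & 0 \\ 0 & T \end{bmatrix},
\qquad T = U \Lambda U^{\mathsf T},
\quad\Longrightarrow\quad
M = \bigl(Q \operatorname{diag}(U, U)\bigr)
\begin{bmatrix} \Lambda & 0 \\ 0 & \Lambda \end{bmatrix}
\bigl(Q \operatorname{diag}(U, U)\bigr)^{\mathsf T}.
```

Thus each eigenvalue of the n-by-n tridiagonal matrix T appears twice in the spectrum of M, and the whole 2n-by-2n eigenproblem reduces to one real symmetric tridiagonal eigenproblem of order n.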

We now consider how an orthogonal symplectic matrix can be determined to reduce () to , where is a symmetric tridiagonal matrix. It is worth illustrating a few steps of the algorithm before presenting it in full generality. We depict an -by- symmetric and skew-Hamiltonian matrix () as follows: The zero diagonals in follow from skew-symmetry. It is easy to verify that the (2, 1) and (1, 2) blocks of this matrix can be zeroed by applying a sequence of Householder and Givens symplectic similarity transformations.

The first step is to zero and using a Householder symplectic . This matrix can be determined by executing Algorithm 3 with , , and . Note that after this transformation, only the rows and columns , , and of , , and are affected, the result being Then the and positions of are zeroed; meanwhile, the same positions of are zeroed because the Householder symplectic similarity transformation preserves the structure of .

The next step is to zero using a Givens similarity transformation , which can be achieved by applying Algorithm 5 with , , and . Notice that only the second row and column of , and are affected. This implies that

Next we compute a Householder symplectic to zero and , which can be achieved by applying Algorithm 3 with , , and . Denoting , the result is

Note that the and positions of are zeroed. This completes the zeroing in the first column of . Likewise we can zero the th column of , with the property that

Notice that only the and positions of are nonzero; they can be zeroed by applying a single Givens symplectic similarity transformation to , and we get

This completes the zeroing of with . A similar method can be applied to the general matrix of order . Overall, we have the following procedure, "Reduction to block symmetric tridiagonal matrix." Given the matrix

The following algorithm updates to the symmetric tridiagonal matrix ; meanwhile, and are reduced to a zero matrix; moreover, we have .

Algorithm 10 (reduction to block symmetric tridiagonal matrix). For do:
if do:
(1)  Apply Algorithm 3 with and to determine . Update is as follows:
(2)  Apply Algorithm 5 with and to determine .  Updates are as follows:
(3)  Apply Algorithm 3 with , to determine . Updates are as follows:
If , do the following.
(4) Apply Algorithm 5 with and to determine . Update is as follows:

After completion, we get the block symmetric tridiagonal matrix , where is a symmetric tridiagonal matrix. The orthogonal matrix is given by

The method of reduction to a block symmetric tridiagonal matrix is given by Algorithm 10. In each of the former steps, we premultiply by one Givens symplectic matrix and two Householder symplectic matrices and postmultiply by their transposes; finally, in the th step, only the Givens symplectic similarity transformation is performed. According to (48), it is clear that is a product of orthogonal matrices, which guarantees backward stability.

Algorithm 11 (the overall algorithm). (1) Transform the Hermitian matrix into the symmetric and skew-Hamiltonian matrix:
(2)  Apply Algorithm 10 to reduce into a block symmetric tridiagonal matrix:
(3)  Compute eigenpairs (, ) of via .
(4)  Setting , then is an eigenvalue of , and is the corresponding eigenvector of .

4. Implementation Aspects

When implementing Algorithm 10, the special structure of the matrix must be fully exploited in order to avoid superfluous computation and storage. This section has two parts: the first considers the computational cost and storage for the reduction to the block symmetric tridiagonal matrix, and the second considers the computational cost and storage of the eigenvectors .

Due to the special structure and , shared by and , only workspace is required for the matrix

Now we consider the computational cost for reducing to the block symmetric tridiagonal form. For the Householder symplectic similarity transformation, let ; then

In the th step, updating to is equivalent to two ordinary Householder similarity transformations: one for the symmetric matrix and one for the skew-symmetric matrix , so we only need to compute , , , and . Likewise, we can update to in the same way.

Algorithm 12 (Householder symplectic similarity). end.
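The saving described above comes from the standard rank-2 form of a Householder similarity update of a symmetric matrix (as in Golub and Van Loan's tridiagonalization); a closely analogous rank-2 form exists for a skew-symmetric block, since the quadratic term vanishes there. The sketch below (our own helper names, pure Python, unnormalized v with beta = 2/v^T v) checks both forms against the explicit product P A P:

```python
n = 3

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)] for i in range(n)]

def explicit_P(v):
    beta = 2.0 / sum(x * x for x in v)
    return [[(1.0 if i == j else 0.0) - beta * v[i] * v[j] for j in range(n)] for i in range(n)]

def sym_update(A, v):
    """P A P for symmetric A via a rank-2 update: A - v w^T - w v^T."""
    beta = 2.0 / sum(x * x for x in v)
    p = [beta * sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]
    mu = beta * sum(p[i] * v[i] for i in range(n)) / 2.0
    w = [p[i] - mu * v[i] for i in range(n)]
    return [[A[i][j] - v[i] * w[j] - w[i] * v[j] for j in range(n)] for i in range(n)]

def skew_update(C, v):
    """P C P for skew-symmetric C: since v^T C v = 0, simply C + v p^T - p v^T."""
    beta = 2.0 / sum(x * x for x in v)
    p = [beta * sum(C[i][j] * v[j] for j in range(n)) for i in range(n)]
    return [[C[i][j] + v[i] * p[j] - p[i] * v[j] for j in range(n)] for i in range(n)]

A = [[2.0, 1.0, 0.0], [1.0, 3.0, 1.0], [0.0, 1.0, 4.0]]      # symmetric example
C = [[0.0, 1.0, 2.0], [-1.0, 0.0, 3.0], [-2.0, -3.0, 0.0]]   # skew-symmetric example
v = [1.0, 1.0, 2.0]
P = explicit_P(v)

for fast, M0 in ((sym_update(A, v), A), (skew_update(C, v), C)):
    full = matmul(matmul(P, M0), P)
    assert all(abs(fast[i][j] - full[i][j]) < 1e-12 for i in range(n) for j in range(n))
print("rank-2 updates match the explicit similarity P A P")
```

Each update needs only one matrix-vector product plus a rank-2 correction, instead of two full matrix-matrix products, which is the cost model assumed in the flop counts of this section.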

Updating by the Givens symplectic similarity transformation is equally simple; setting , then

Denoting , then we have

Note that updating into such that affects the rows (resp. the columns) , of , that is, the rows (columns) , of and . According to (58), we only need to compute , , and because is also a symmetric and skew-Hamiltonian matrix. Therefore, only flops are required in the th step, altogether flops. Below is the algorithm for the th step.

Algorithm 13 (Givens symplectic similarity). Consider

Overall, flops are required for reducing to .

This reduction amounts to four times the computational cost of Householder tridiagonalization of a symmetric matrix, plus flops for the Givens similarity transformations. Since one complex operation costs four times a real one, flops are required for complex Hermitian tridiagonalization.

Now we consider the computation and storage problem for eigenvectors

To obtain , we can premultiply by only matrices by using the special structure of , , and , even though they are -by- matrices. When premultiplying by the Householder symplectic matrices , let a matrix be of the form

So two workspaces are required for , one for and one for . Premultiplying by in the th step, we have Therefore, only , and the real part of are updated to , , and also the imaginary part , is updated to , . The main work is computing the product of the Householder matrix and [] , and the product of and [] , where the first columns of and are zero. So flops are required in each step, altogether flops. Likewise, premultiplying by requires another flops.

Algorithm 14 (computation of ). For end.

For Givens symplectic matrix , set

Premultiplying by and denoting , we have Therefore, updating the th and th rows of in each step requires flops, altogether flops. Below is the algorithm.

Algorithm 15 (computation of ). For end.
Overall, in each step of the procedure for computing the eigenvectors, we only need to premultiply by two Householder matrices and one Givens matrix.

5. Numerical Experiments

All codes were run in MATLAB 7.04 in double precision, and all experiments were performed on a personal computer with a 1.41 GHz central processing unit (AMD Sempron(tm) Processor 2500+), 520 MB memory, and the Windows XP operating system. Machine epsilon .

Example 16. To illustrate Algorithm 11 more clearly, we give a simple example and show the process and results of Algorithm 11 for :
We first transform into the symmetric and skew-Hamiltonian matrix : Then where
Denoting by the eigenpair of , then gives the first half of the eigenvectors of . The computed eigenproblem of is shown in Table 1.
According to Tables 1 and 2, it can be seen that the eigenvectors are different except the one corresponding to 1. In fact, by Table 1, the eigenvector corresponding to 7.4031 is , and by Table 2. It is easy to prove that , that is, . So the normalized complex eigenvector is not unique, which is the biggest difference from the real eigenvector case.
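The phase ambiguity observed above can be illustrated with a few lines of Python (our own 2-by-2 Hermitian example, not the matrix of this experiment): multiplying a normalized complex eigenvector by any unit-modulus scalar e^{iθ} yields another normalized eigenvector of the same eigenvalue:

```python
import cmath

H = [[2 + 0j, 1 - 1j],
     [1 + 1j, 3 + 0j]]          # Hermitian; eigenpair: lambda = 4, x = (1 - 1j, 2)
lam = 4.0
x = [1 - 1j, 2 + 0j]

def residual(H, lam, x):
    return max(abs(sum(H[i][j] * x[j] for j in range(len(x))) - lam * x[i])
               for i in range(len(x)))

def norm(x):
    return sum(abs(t) ** 2 for t in x) ** 0.5

assert residual(H, lam, x) < 1e-12

# e^{i*theta} x is an equally valid eigenvector with the same norm
x2 = [cmath.exp(0.7j) * t for t in x]
assert residual(H, lam, x2) < 1e-12
assert abs(norm(x2) - norm(x)) < 1e-12
print("eigenvector is determined only up to a unimodular factor")
```

This is why two correct algorithms (here, the real fast algorithm and eig) can return different complex eigenvectors for the same eigenvalue: normalization fixes the length but not the phase.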

Example 17. Now we consider a slightly larger matrix ; we depict its eigenproblem in Tables 3 and 4, respectively:
According to the tables, we can see that the eigenvectors are complex, and it is easy to prove that the two eigenvectors corresponding to the same eigenvalue are linearly dependent; therefore the eigenvectors computed by the real fast algorithm (Algorithm 11) span the same space as those of the complex algorithm (eig).

Example 18. In this example we construct complex Hermitian matrices of the form where . In the tables we only give a few eigenvalues of largest modulus and compare them with the complex algorithm (eig). Let , ; that is, the order of the matrix is 20 and 200, respectively. The eigenvalues and errors computed by the real fast algorithm and the complex algorithm (eig) are shown in Tables 5 and 6, respectively. All the eigenvalues of the matrix of order 20 are given, while only the th largest-modulus eigenvalues of the matrix of order 200 are presented. It is easy to prove that eigenvectors corresponding to the same eigenvalue are linearly dependent. According to the tables, we obtain the same results as the complex algorithm (eig), but only real operations are used.

Example 19. Complex Hermitian matrices are constructed from the Hilbert matrix, and the MATLAB code is as follows: When , the 6 largest-modulus eigenvalues are presented, and the 3 eigenvectors corresponding to the 3 largest-modulus eigenvalues are given here:
We prove that for . It is very interesting that the eigenvectors computed by our algorithm are unique, unlike those of the complex algorithm (eig).

Example 20. Finally, we construct some random Hermitian matrices of order ; the MATLAB code is as follows.
Letting , denote
All the relative errors we computed are presented, scaled by a factor of order . The average relative errors over all examples of each dimension, computed for each example in Table 7, are denoted by average(relerr). The max, min, and average errors of are also considered in Table 8. According to the two tables, all the "relerr" and "residual" values are of order and are very small. Hence it can be concluded that eigenproblems computed by the real fast algorithm have high precision, and the method always converges.


This work was supported by the National Natural Science Foundation of China (nos. 11101087 and 10261012) and by the Research Start-up Foundation of Fuzhou University (nos. 022427 and 600629).