#### Abstract

We study the Hermitian positive definite solutions of the nonlinear matrix equation $X + A^{*}X^{-1}A = I$, where $A$ is an $n \times n$ nonsingular matrix. Some necessary and sufficient conditions for the existence of a Hermitian positive definite solution of this equation are given. Moreover, based on these necessary and sufficient conditions, some properties of the coefficient matrix and equivalent forms of the equation are presented when the matrix equation has a Hermitian positive definite solution.

#### 1. Introduction

In this paper, we consider the Hermitian positive definite solutions of the matrix equation
$$X + A^{*}X^{-1}A = I, \tag{1.1}$$
where $A$ is an $n \times n$ nonsingular matrix, $I$ denotes the $n \times n$ identity matrix, and $A^{*}$ denotes the conjugate transpose of $A$.
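As a numerical illustration (the sample matrix $A$ and the solver below are assumptions of this sketch, not taken from the paper), a Hermitian positive definite solution of (1.1) can be computed by the natural fixed-point iteration $X_{k+1} = I - A^{*}X_{k}^{-1}A$, which converges for sufficiently small $\|A\|$:

```python
import numpy as np

# Fixed-point iteration X_{k+1} = I - A^* X_k^{-1} A for (1.1), starting
# from X_0 = I.  The matrix A is an illustrative example with ||A|| < 1/2,
# for which this iteration is known to converge; it is not from the paper.
n = 3
I = np.eye(n)
A = 0.2 * np.array([[1.0, 0.3, 0.0],
                    [0.0, 1.0, 0.3],
                    [0.1, 0.0, 1.0]])   # nonsingular, small spectral norm

X = I.copy()
for _ in range(200):
    X = I - A.conj().T @ np.linalg.solve(X, A)

# Residual of (1.1) and positive definiteness of the computed X.
residual = np.linalg.norm(X + A.conj().T @ np.linalg.solve(X, A) - I)
min_eig = np.linalg.eigvalsh((X + X.conj().T) / 2).min()
print(residual, min_eig)
```

For this particular $A$ the residual drops to machine-precision level and the computed $X$ has strictly positive eigenvalues, so it is a Hermitian positive definite solution of (1.1).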

In many physical applications [1] (such as super-resolution image restoration, an image-processing technique that restores a single image of higher resolution from a degraded sequence of lower-resolution images; such algorithms can be applied whenever the available image resolution is unsatisfactory and an image sequence can be obtained), we must solve a system of linear equations [2] whose positive definite coefficient matrix arises from a finite difference approximation to an elliptic partial differential equation. Solving this system can be reduced to solving equations of the form (1.1), and the nonlinear matrix equation (1.1) appears in many applications, including control theory, ladder networks, dynamic programming, stochastic filtering, statistics, and super-resolution image restoration. For such a decomposition to exist, the matrix must be a positive definite solution of the matrix equation.

The study of the Hermitian positive definite solutions of the matrix equation involves three basic problems: the theoretical question of solvability, the computation of numerical solutions, and the perturbation analysis.

Because of this wide mathematical and physical background, the existence of Hermitian positive definite solutions of the nonlinear matrix equation (1.1) has received wide attention in recent years [3]. In [3], Ivanov et al. constructed iterative methods for obtaining positive definite solutions of the matrix equation (1.1) and gave some sufficient conditions for the existence of a positive definite solution. Moreover, Ferrante and Levy [4] studied the equation $X = Q + NX^{-1}N^{*}$ and presented an algorithm which converges to a positive definite solution under a wide range of conditions. Cheng [5] presented some sufficient conditions and new necessary conditions for the existence of Hermitian positive definite solutions. In [6], Ivanov derived sufficient conditions for the existence of the minimal and of certain special positive definite solutions, together with iterative procedures for computing them. The Hermitian positive definite solutions of the matrix equation have also been extensively studied in [7–12].

Throughout this paper, $\|\cdot\|$ denotes the spectral norm for square matrices unless otherwise noted, that is, $\|A\| = \max_{i} \sqrt{\lambda_i}$, where the $\lambda_i$ are the eigenvalues of $A^{*}A$; $I$ is the identity matrix and $A^{*}$ is the conjugate transpose of $A$. The notation $X > 0$ means that $X$ is a Hermitian positive definite matrix, and $X > Y$ indicates that $X - Y > 0$, that is, that $X - Y$ is Hermitian positive definite.
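A quick NumPy sanity check of this definition of the spectral norm against the built-in matrix 2-norm (the test matrix is an arbitrary example):

```python
import numpy as np

# Check of the stated definition: the spectral norm of A equals the
# square root of the largest eigenvalue of A^* A.  A is an arbitrary
# illustrative matrix, not one from the paper.
A = np.array([[1.0, 2.0],
              [0.5, -1.0]])
via_eigs = np.sqrt(np.linalg.eigvalsh(A.conj().T @ A).max())
via_norm = np.linalg.norm(A, 2)
print(via_eigs, via_norm)
```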

In this paper, motivated by the results mentioned above, we give some necessary and sufficient conditions for the existence of a Hermitian positive definite solution of (1.1); these conditions differ from those in [3, 5]. Based on them, we also present some properties of the coefficient matrix and two equivalent forms of (1.1) when the matrix equation has a Hermitian positive definite solution.

#### 2. Preliminaries and Lemmas

The following basic properties will be useful in this paper.

Lemma 2.1. *The spectral norm is a monotonic norm; that is, if $0 \le X \le Y$, then $\|X\| \le \|Y\|$.*
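A small numerical illustration of the lemma (the random matrices are illustrative; $Y - X$ is positive semidefinite by construction, so $X \le Y$ in the Loewner order):

```python
import numpy as np

# Numerical illustration of Lemma 2.1 (monotonicity of the spectral
# norm): if 0 <= X <= Y in the Loewner order, then ||X|| <= ||Y||.
# Here Y = X + C C^T, so Y - X is positive semidefinite by construction;
# the random matrices are illustrative only.
rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4))
C = rng.standard_normal((4, 4))
X = B @ B.T            # positive semidefinite
Y = X + C @ C.T        # Y >= X since Y - X = C C^T >= 0
norm_X = np.linalg.norm(X, 2)
norm_Y = np.linalg.norm(Y, 2)
print(norm_X, norm_Y)
```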

Lemma 2.2. *If $X$ is a Hermitian positive definite matrix, then there is a unique Hermitian positive definite matrix $Y$ such that $X = Y^{2}$.*
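The construction behind this lemma can be sketched numerically: diagonalize $X = Q\,\mathrm{diag}(w)\,Q^{*}$ and take square roots of the (positive) eigenvalues. The example matrix is illustrative only:

```python
import numpy as np

# Lemma 2.2 in code: the unique Hermitian positive definite square root
# of a Hermitian positive definite X, obtained from its eigendecomposition
# X = Q diag(w) Q^*.  X is an arbitrary positive definite example.
rng = np.random.default_rng(1)
B = rng.standard_normal((4, 4))
X = B @ B.T + 4 * np.eye(4)           # Hermitian positive definite
w, Q = np.linalg.eigh(X)
Y = Q @ np.diag(np.sqrt(w)) @ Q.T     # Hermitian positive definite, Y^2 = X
sqrt_err = np.linalg.norm(Y @ Y - X)
min_eig_Y = np.linalg.eigvalsh(Y).min()
print(sqrt_err, min_eig_Y)
```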

Lemma 2.3 (see [3, Corollary 2.1]). *If $X$ is a positive definite solution of (1.1), then $AA^{*} < X < I - A^{*}A$.*
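Assuming (1.1) is $X + A^{*}X^{-1}A = I$, the bounds $AA^{*} < X < I - A^{*}A$ follow directly: since $A^{*}X^{-1}A > 0$ we get $X < I$, hence $X^{-1} > I$ and $X = I - A^{*}X^{-1}A < I - A^{*}A$; rewriting the equation as $X = A(I - X)^{-1}A^{*}$ with $(I - X)^{-1} > I$ gives $X > AA^{*}$. A numerical sanity check on a solution obtained by fixed-point iteration (the sample $A$ is an assumption of this sketch):

```python
import numpy as np

# Check of the solution bounds A A^* < X < I - A^* A for a solution of
# X + A^* X^{-1} A = I, computed here by fixed-point iteration.  The
# matrix A is a real illustrative example (so A^* = A^T) with small norm.
n = 3
I = np.eye(n)
A = 0.2 * np.array([[1.0, 0.3, 0.0],
                    [0.0, 1.0, 0.3],
                    [0.1, 0.0, 1.0]])

X = I.copy()
for _ in range(200):
    X = I - A.T @ np.linalg.solve(X, A)

lower_gap = np.linalg.eigvalsh(X - A @ A.T).min()      # X - A A^* > 0
upper_gap = np.linalg.eigvalsh(I - A.T @ A - X).min()  # I - A^* A - X > 0
print(lower_gap, upper_gap)
```

Both gaps come out strictly positive, matching the strict bounds (strictness uses the nonsingularity of $A$).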

Lemma 2.4 (see [13, Theorem 4]). *If (1.1) has a solution, then $\rho(A) \le \tfrac{1}{2}$, where $\rho(A)$ denotes the spectral radius of $A$. Moreover, if $A$ has order $n \ge 2$, then $\|A\|$ can take any value in the interval $(0, 1)$ for (1.1) to have a solution.*

#### 3. The Main Results

In this section, we present our main results.

Theorem 3.1. *The matrix equation (1.1) has a Hermitian positive definite solution if and only if $A$ is unitarily equivalent to a matrix of the form below, that is, $A$ can be factorized as
**
where , are the eigenvalues of , , and the columns of
**
are orthonormal. In this case, the corresponding matrix is a Hermitian positive definite solution of (1.1), and all Hermitian positive definite solutions of (1.1) can be obtained in this way. By this result, solving (1.1) can be reduced to solving the equation
**
where is a nonsingular Hermitian matrix, , are the eigenvalues of , .*

*Proof. *If the matrix equation (1.1) has a Hermitian positive definite solution , then there exists a unitary matrix satisfying , where are the eigenvalues of . Let ; then , and we rewrite the matrix equation (1.1) as
so
Since this matrix is unitary, we have
Thus
Let , then and by (3.3), it is easy to see that the columns of
are orthonormal.

Conversely, suppose that has the decomposition (3.1) and , where is a unitary matrix. Since the columns of
are orthonormal, we have
that is,

Then, we have

Hence is a Hermitian positive definite solution of (1.1).

The proof of Theorem 3.1 is complete.

Theorem 3.2. *If the matrix equation (1.1) has a Hermitian positive definite solution , then
**
where are the eigenvalues of .*

*Proof. *By Theorem 3.1, if is a Hermitian positive definite solution of (3.3), then we have
thus
hence , that is,
where .

Without loss of generality, suppose , then . By the proof of Theorem 3.1, we have
Let such that , where . Since , then , and , that is,

Now let
where is an -column vector; then , so it has orthonormal columns and . Let , and, using the CS factorization theorem, extend them to form an orthonormal basis of . Thus is a unitary matrix such that , and hence
So
thus we have
that is,
that is,
The proof of Theorem 3.2 is complete.

Theorem 3.3. *If the matrix equation (1.1) has a Hermitian positive definite solution , let , where are the eigenvalues of , then*(1)*;*(2)*.*

*Proof. *By Theorem 3.1, , where is a unitary matrix, hence .

Let , where are the eigenvalues of .

(1) By Theorem 3.1, is a Hermitian positive definite solution of (3.3), thus
hence

So

Since is a Hermitian positive definite solution of (3.3), is a nonsingular matrix, that is, ; by (1), it follows that
so
thus we have

The proof of Theorem 3.3 is complete.

Theorem 3.4. *Suppose and are two positive definite Hermitian matrices; then there exists a nonsingular matrix satisfying*(1)*
where , are the eigenvalues of ;*(2)*, and the columns of
are orthonormal, .*

*Proof. *(1) Since the matrix is positive definite Hermitian, then there exists a unique positive definite Hermitian matrix such that . We note
It is easy to prove that it is also a Hermitian matrix, so there exists a unitary matrix such that , where , are the eigenvalues of . Hence we have the following equation:
Let , then is nonsingular. Since
so the eigenvalues of are equal to the eigenvalues of .

(2) If (3.31) holds, then we have
Since , is nonsingular, and hence is Hermitian positive definite.

Thus , that is, . It is easy to see that the columns of
are orthonormal.

The proof of Theorem 3.4 is complete.

Theorem 3.5. *The matrix equation (1.1) has a Hermitian positive definite solution if and only if has the following factorization:
**
where are the eigenvalues of and the columns of
**
are orthonormal. In this case, the corresponding matrix is a Hermitian positive definite solution, and all Hermitian positive definite solutions can be obtained in this way.*

*Proof. *If the matrix equation (1.1) has a Hermitian positive definite solution , then by Theorem 3.3, there exists a nonsingular matrix satisfying
hence

So (1.1) is equivalent to
or equivalently
Let , then and (3.43) means that the columns of
are orthonormal.

Conversely, suppose that has the decomposition and , where is a unitary matrix. Since the columns of
are orthonormal, we have
Hence
Then we have
The proof of Theorem 3.5 is complete.

Theorem 3.6. *The matrix equation (1.1) has a Hermitian positive definite solution if and only if has the following factorization:
**
where are the eigenvalues of and the columns of
**
are orthonormal. In this case, the corresponding matrix is a Hermitian positive definite solution, and all Hermitian positive definite solutions can be obtained in this way.*

*Proof. *If the matrix equation (1.1) has a Hermitian positive definite solution , then by Theorem 3.3, there exists a nonsingular matrix satisfying
hence

So (1.1) is equivalent to
or equivalently
Let , then and (3.54) means that the columns of
are orthonormal.

Conversely, suppose that has the decomposition and , where is a unitary matrix. Since the columns of
are orthonormal, we have
Thus
Then we have
The proof of Theorem 3.6 is complete.

Theorem 3.7. *The matrix equation (1.1) has a Hermitian positive definite solution if and only if is unitary, where are the eigenvalues of .*

*Proof. *If the matrix equation (1.1) has a Hermitian positive definite solution , then by Theorem 3.3, there exists a nonsingular matrix satisfying
hence

So (1.1) is equivalent to
that is,
Let , then (3.63) is equivalent to
So
Thus is unitary.

The proof of Theorem 3.7 is complete.

Similarly, we have the following theorem.

Theorem 3.8. *The matrix equation (1.1) has a Hermitian positive definite solution if and only if is unitary, where are the eigenvalues of .*

*Remark 3.9. *By Theorems 3.7 and 3.8, solving (1.1) is transformed into solving
where are the eigenvalues of or is a nonsingular Hermitian matrix.

It is easy to see that every eigenvalue of or satisfies

#### Acknowledgment

This research was supported by KJM of Shandong University of Technology (Grant 2005KJM30).