Research Article | Open Access

Qing-Wen Wang, Juan Yu, "Constrained Solutions of a System of Matrix Equations", *Journal of Applied Mathematics*, vol. 2012, Article ID 471573, 19 pages, 2012. https://doi.org/10.1155/2012/471573

# Constrained Solutions of a System of Matrix Equations

**Academic Editor:** Panayiotis J. Psarrakos

#### Abstract

We derive the necessary and sufficient conditions for the existence of, and the expressions for, the orthogonal solutions, the symmetric orthogonal solutions, and the skew-symmetric orthogonal solutions of the system of matrix equations $AX=B$ and $XC=D$. When the matrix equations are not consistent, the least squares symmetric orthogonal solutions and the least squares skew-symmetric orthogonal solutions are given, respectively. In addition, an algorithm is provided to compute the least squares symmetric orthogonal solutions, and an example is presented to show that it is feasible.

#### 1. Introduction

Throughout this paper, the following notation will be used: $\mathbb{R}^{m\times n}$, $\mathcal{OR}^{n\times n}$, $\mathcal{SR}^{n\times n}$, and $\mathcal{SSR}^{n\times n}$ denote the set of all $m\times n$ real matrices, the set of all $n\times n$ orthogonal matrices, the set of all $n\times n$ symmetric matrices, and the set of all $n\times n$ skew-symmetric matrices, respectively. $I_n$ is the identity matrix of order $n$. $A^{T}$ and $\operatorname{tr}(A)$ represent the transpose and the trace of the real matrix $A$, respectively. $\|\cdot\|$ stands for the Frobenius norm induced by the inner product $\langle A,B\rangle=\operatorname{tr}(B^{T}A)$. The following two definitions will also be used.

*Definition 1.1 (see [1]). *A real matrix $X\in\mathbb{R}^{n\times n}$ is said to be a symmetric orthogonal matrix if $X^{T}=X$ and $X^{T}X=I_n$.

*Definition 1.2 (see [2]). *A real matrix $X\in\mathbb{R}^{n\times n}$ is called a skew-symmetric orthogonal matrix if $X^{T}=-X$ and $X^{T}X=I_n$.
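As a quick numerical illustration of Definitions 1.1 and 1.2 (a sketch; the helper names and the example matrices are ours, not from the paper):

```python
import numpy as np

def is_symmetric_orthogonal(X, tol=1e-10):
    # Definition 1.1: X^T = X and X^T X = I.
    return np.allclose(X, X.T, atol=tol) and np.allclose(X.T @ X, np.eye(X.shape[0]), atol=tol)

def is_skew_symmetric_orthogonal(X, tol=1e-10):
    # Definition 1.2: X^T = -X and X^T X = I.
    return np.allclose(X, -X.T, atol=tol) and np.allclose(X.T @ X, np.eye(X.shape[0]), atol=tol)

# A Householder reflection I - 2*v*v^T (with unit v) is symmetric orthogonal.
v = np.array([1.0, 2.0, 2.0])
v /= np.linalg.norm(v)
H = np.eye(3) - 2.0 * np.outer(v, v)
assert is_symmetric_orthogonal(H)

# The rotation by 90 degrees in the plane is skew-symmetric orthogonal.
J = np.array([[0.0, 1.0], [-1.0, 0.0]])
assert is_skew_symmetric_orthogonal(J)
```

Householder reflections and plane rotations by 90 degrees are the simplest members of these two classes.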

The set of all symmetric orthogonal matrices and the set of all skew-symmetric orthogonal matrices are denoted by $\mathcal{SOR}^{n\times n}$ and $\mathcal{SSOR}^{n\times n}$, respectively. Since linear matrix equations and their optimal approximation problems have wide applications in structural design, biology, control theory, linear optimal control, and so forth (see, e.g., [3–5]), much attention has been paid to them. The well-known system of matrix equations $AX=B$, $XC=D$, denoted by (1.1) in what follows, is one kind of linear matrix equations that has been investigated by many authors, and a series of important and useful results have been obtained. For instance, the system (1.1) with the unknown matrix being bisymmetric, centrosymmetric, bisymmetric nonnegative definite, Hermitian and nonnegative definite, and $(P,Q)$-(skew)symmetric has been investigated by Wang et al. [6, 7], Khatri and Mitra [8], and Zhang and Wang [9], respectively. Of course, if the solvability conditions of the system (1.1) are not satisfied, we may consider its least squares solution. For example, Li et al. [10] presented the least squares mirrorsymmetric solution, and Yuan [11] obtained the least squares solution. Some results concerning the system (1.1) can also be found in [12–18].

Symmetric orthogonal matrices and skew-symmetric orthogonal matrices play important roles in numerical analysis and in the numerical solution of partial differential equations. Papers [1, 2] derived, respectively, the symmetric orthogonal solutions and the skew-symmetric orthogonal solutions of the matrix equation $AX=B$. Motivated by the work mentioned above, in this paper we study, respectively, the orthogonal solutions, the symmetric orthogonal solutions, and the skew-symmetric orthogonal solutions of the system (1.1). Furthermore, if the solvability conditions are not satisfied, the least squares skew-symmetric orthogonal solutions and the least squares symmetric orthogonal solutions of the system (1.1) are also given.

The remainder of this paper is arranged as follows. In Section 2, some lemmas are provided that are needed for the main results of this paper. In Sections 3, 4, and 5, the necessary and sufficient conditions for the existence of, and the expressions for, the orthogonal, the symmetric orthogonal, and the skew-symmetric orthogonal solutions of the system (1.1) are obtained, respectively. In Section 6, the least squares skew-symmetric orthogonal solutions and the least squares symmetric orthogonal solutions of the system (1.1) are presented. In addition, an algorithm is provided to compute the least squares symmetric orthogonal solutions, together with an example showing that it is feasible. Finally, some concluding remarks are given in Section 7.

#### 2. Preliminaries

In this section, we recall some lemmas and the special C-S decomposition that will be used to obtain the main results of this paper.

Lemma 2.1 (see [1, Lemmas 1 and 2]). *Given , . The matrix equation has a solution if and only if . Let the singular value decompositions of and be, respectively,
**
where
**
Then the orthogonal solutions of can be described as
**
where is arbitrary. *

Lemma 2.2 (see [2, Lemmas and ]). *Given , . The matrix equation has a solution if and only if . Let the singular value decompositions of and be, respectively,
**
where
**
Then the orthogonal solutions of can be described as
**
where is arbitrary. *
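Lemmas 2.1 and 2.2 build orthogonal solutions from the SVDs of the two given matrices. When the data are consistent and of full rank, one standard numerical route to an orthogonal solution of an equation of the form $AX=B$ is the polar factor of $A^{T}B$ (the orthogonal Procrustes solution). A minimal sketch, with all matrix names ours:

```python
import numpy as np

def procrustes_orthogonal(A, B):
    # Orthogonal X minimizing ||A X - B||_F: the polar factor of A^T B.
    U, _, Vt = np.linalg.svd(A.T @ B)
    return U @ Vt

rng = np.random.default_rng(1)
A = rng.standard_normal((5, 5))
Q, _ = np.linalg.qr(rng.standard_normal((5, 5)))  # a random orthogonal matrix
B = A @ Q                                         # consistent data: AX = B is solvable
X = procrustes_orthogonal(A, B)
assert np.allclose(X @ X.T, np.eye(5))
assert np.allclose(A @ X, B)                      # an exact orthogonal solution
```

For full-rank consistent data the polar factor is the unique orthogonal solution; the SVD construction of the lemmas additionally parametrizes the solution set in the rank-deficient case.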

Lemma 2.3 (see [2, Theorem 1]). * If
**
then the C-S decomposition of can be expressed as
*

*where ,*

Lemma 2.4 (see [1, Theorem ]). *If
**
then the C-S decomposition of can be described as
*

*where , , ,*

*Remark 2.5. *For the details of the C-S decomposition of an orthogonal matrix with a leading (skew-)symmetric submatrix, one can study the proofs of Theorem 1 in [1] and in [2].

Lemma 2.6. *Given , . Then the matrix equation has a solution if and only if and . When these conditions are satisfied, the general symmetric orthogonal solutions can be expressed as
**
where
**
and is arbitrary. *

*Proof. **The Necessity.* Assume is a solution of the matrix equation , then we have
*The Sufficiency.* Since the equality holds, by Lemma 2.2 the singular value decompositions of and can be expressed as (2.4), respectively. Moreover, the condition means
which can be written as
From Lemma 2.2, the orthogonal solutions of the matrix equation can be described as (2.6). We now determine when the solution in (2.6) is also symmetric. Suppose that it is symmetric; then we have
together with the partitions of the matrices and in Lemma 2.2, we get
By (2.17), we can get (2.19). Now we aim to find the orthogonal solutions of the system of matrix equations (2.20) and (2.21). Firstly, we obtain from (2.20) that ; then, by Lemma 2.2, (2.20) has an orthogonal solution . By (2.17), the leading principal submatrix of the orthogonal matrix is symmetric. Then, from Lemma 2.4, we have
From (2.20), (2.22), and (2.23), the orthogonal solution of (2.20) is
where is arbitrary. Combining (2.21), (2.24), and (2.25) shows that the resulting matrix is a symmetric orthogonal matrix. Denote
then the symmetric orthogonal solutions of the matrix equation can be expressed as
Let the partition matrix be
compatible with the block matrix
Put
then the symmetric orthogonal solutions of the matrix equation can be described as (2.13).

Setting , , and in [2, Theorem 2], and then applying Lemmas 2.1 and 2.3, we have the following result.

Lemma 2.7. *Given , . Then the equation has a solution if and only if and . When these conditions are satisfied, the skew-symmetric orthogonal solutions of the matrix equation can be described as
**
where
**
and is arbitrary. *

#### 3. The Orthogonal Solutions of the System (1.1)

The following theorems give the orthogonal solutions of the system (1.1).

Theorem 3.1. *Given , and , , suppose the singular value decompositions of and are, respectively, as (2.4). Denote
**
where , , and . Let the singular value decompositions of and be, respectively,
**
where , , , is diagonal, whose diagonal elements are the nonzero singular values of or . Then the system (1.1) has orthogonal solutions if and only if
*

*In which case, the orthogonal solutions can be expressed as*

*where*

*and is arbitrary.*

*Proof. * Let the singular value decompositions of and be, respectively, as (2.4). Since the matrix equation has orthogonal solutions if and only if
then by Lemma 2.2, its orthogonal solutions can be expressed as (2.6). Substituting (2.6) and (3.1) into the matrix equation , we have and . By Lemma 2.1, the matrix equation has an orthogonal solution if and only if
Let the singular value decompositions of and be, respectively,
where , , , is diagonal, whose diagonal elements are the nonzero singular values of or . Then the orthogonal solutions can be described as
where is arbitrary. Therefore, the common orthogonal solutions of the system (1.1) can be expressed as
where
and is arbitrary.
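As a numerical sanity check of the orthogonal-solution theory on consistent data (a sketch; it assumes that system (1.1) is $AX=B$, $XC=D$, as the titles of [10, 13] suggest, and uses the polar-factor route rather than the SVD construction of the proof):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 6
X0, _ = np.linalg.qr(rng.standard_normal((n, n)))   # planted orthogonal solution
A, C = rng.standard_normal((n, n)), rng.standard_normal((n, n))
B, D = A @ X0, X0 @ C                               # consistent right-hand sides

# Orthogonal minimizer of ||AX-B||^2 + ||XC-D||^2: polar factor of A^T B + D C^T.
U, _, Vt = np.linalg.svd(A.T @ B + D @ C.T)
X = U @ Vt
assert np.allclose(X @ X.T, np.eye(n))
assert np.allclose(A @ X, B) and np.allclose(X @ C, D)
```

On full-rank consistent data the polar factor recovers the planted solution exactly, which is what the solvability conditions of Theorem 3.1 guarantee in the general case.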

The following theorem can be shown similarly.

Theorem 3.2. *Given , and , , let the singular value decompositions of and be, respectively, as (2.1). Partition
**
where , . Assume the singular value decompositions of and are, respectively,
**
where , , , is diagonal, whose diagonal elements are the nonzero singular values of or . Then the system (1.1) has orthogonal solutions if and only if
*

*In which case, the orthogonal solutions can be expressed as*

*where*

*and is arbitrary.*

#### 4. The Symmetric Orthogonal Solutions of the System (1.1)

We now present the symmetric orthogonal solutions of the system (1.1).

Theorem 4.1. *Given , . Let the symmetric orthogonal solutions of the matrix equation be described as in Lemma 2.6. Partition
**
where , . Then the system (1.1) has symmetric orthogonal solutions if and only if
*

*In which case, the solutions can be expressed as*

*where*

*and is arbitrary.*

*Proof. * From Lemma 2.6, we obtain that the matrix equation has symmetric orthogonal solutions if and only if and . When these conditions are satisfied, the general symmetric orthogonal solutions can be expressed as
where is arbitrary, , . Inserting (4.1) and (4.5) into the matrix equation , we get and . By [1, Theorem 2], the matrix equation has a symmetric orthogonal solution if and only if
In which case, the solutions can be described as
where is arbitrary, , and . Hence the system (1.1) has symmetric orthogonal solutions if and only if all equalities in (4.2) hold. In which case, the solutions can be expressed as
that is, the expression in (4.3).

The following theorem can also be obtained by the method used in the proof of Theorem 4.1.

Theorem 4.2. *Given , . Let the symmetric orthogonal solutions of the matrix equation be described as
**
where , , . Partition
**
Then the system (1.1) has symmetric orthogonal solutions if and only if
*

*In which case, the solutions can be expressed as*

*where*

*and is arbitrary.*

#### 5. The Skew-Symmetric Orthogonal Solutions of the System (1.1)

In this section, we show the skew-symmetric orthogonal solutions of the system (1.1).

Theorem 5.1. *Given , . Suppose the matrix equation has skew-symmetric orthogonal solutions with the form
**
where is arbitrary, , . Partition
**
where , . Then the system (1.1) has skew-symmetric orthogonal solutions if and only if
*

*In which case, the solutions can be expressed as*

*where*

*and is arbitrary.*

*Proof. * By [2, Theorem 2], the matrix equation has the skew-symmetric orthogonal solutions if and only if and . When these conditions are satisfied, the general skew-symmetric orthogonal solutions can be expressed as (5.1). Substituting (5.1) and (5.2) into the matrix equation , we get and . From Lemma 2.7, equation has a skew-symmetric orthogonal solution if and only if
When these conditions are satisfied, the solution can be described as
where is arbitrary, , . Inserting (5.7) into (5.1) yields that the system (1.1) has skew-symmetric orthogonal solutions if and only if all equalities in (5.3) hold. In which case, the solutions can be expressed as (5.4).

Similarly, the following theorem holds.

Theorem 5.2. *Given , . Suppose the matrix equation has skew-symmetric orthogonal solutions with the form
**
where is arbitrary, , . Partition
**
where , . Then the system (1.1) has skew-symmetric orthogonal solutions if and only if
*

*In which case, the solutions can be expressed as*

*where*

*and is arbitrary.*

#### 6. The Least Squares (Skew-) Symmetric Orthogonal Solutions of the System (1.1)

If the solvability conditions of a system of matrix equations are not satisfied, it is natural to consider its least squares solution. In this section, we derive the least squares (skew-)symmetric orthogonal solutions of the system (1.1); that is, we seek a matrix such that (6.1) holds. With the help of the definition of the Frobenius norm and the properties of a skew-symmetric orthogonal matrix, we get (6.2). Let (6.3) be as defined there; then it follows from the skew-symmetry of the unknown matrix that (6.4) holds. Therefore, (6.1) holds if and only if (6.4) reaches its maximum, so we now focus on finding the maximum value of (6.4). Assume the eigenvalue decomposition (6.5), with the eigenvalues arranged as in (6.6). Denote by (6.7) the corresponding matrix, partitioned according to (6.8); then (6.4) takes the form (6.9). Thus, with the choice made in (6.10), where (6.11) holds, equation (6.9) attains its maximum. Since the unknown matrix is skew-symmetric, it follows that (6.1) attains its minimum at the matrices of the form (6.12), where one block is arbitrary. Hence we have the following theorem.

Theorem 6.1. *Given and , denote
**
and let the spectral decomposition of be (6.5). Then the least squares skew-symmetric orthogonal solutions of the system (1.1) can be expressed as (6.12). *
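The construction behind Theorem 6.1 can be sketched numerically. The sketch below assumes (our reading, not stated explicitly in the surviving text) that system (1.1) is $AX=B$, $XC=D$ and that the relevant matrix is $M=A^{T}B+DC^{T}$; it maximizes the trace term over skew-symmetric orthogonal $X$ through the Hermitian matrix $\mathrm{i}N$, where $N$ is the skew part of $M$:

```python
import numpy as np

def lstsq_skew_symmetric_orthogonal(A, B, C, D):
    # Minimize ||A X - B||^2 + ||X C - D||^2 over real skew-symmetric
    # orthogonal X (n must be even).  Only the skew part N of
    # M = A^T B + D C^T matters; i*N is Hermitian with eigenvalues in
    # +/- pairs, and signing each eigenvalue maximizes the trace term.
    M = A.T @ B + D @ C.T
    N = (M - M.T) / 2.0
    lam, V = np.linalg.eigh(1j * N)
    s = np.where(lam >= 0.0, 1.0, -1.0)
    return (-1j * (V * s) @ V.conj().T).real

# Self-check on consistent data: plant a skew-symmetric orthogonal X0.
rng = np.random.default_rng(0)
n = 4                                       # even order is required
J = np.array([[0.0, 1.0], [-1.0, 0.0]])
X0 = np.kron(np.eye(n // 2), J)             # block-diagonal 90-degree rotations
A, C = rng.standard_normal((n, n)), rng.standard_normal((n, n))
B, D = A @ X0, X0 @ C
X = lstsq_skew_symmetric_orthogonal(A, B, C, D)
assert np.allclose(X, -X.T) and np.allclose(X @ X.T, np.eye(n))
assert np.allclose(A @ X, B) and np.allclose(X @ C, D)
```

On consistent data any maximizer of the trace term drives the residual to zero, so the recovered $X$ solves both equations exactly.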

If the unknown matrix in (6.1) is a symmetric orthogonal matrix, then by the definition of the Frobenius norm and the properties of a symmetric orthogonal matrix, (6.2) holds. Let (6.14) be as defined there; then (6.15) follows. Thus (6.15) reaches its minimum if and only if the corresponding quantity attains its maximum, so we now focus on finding that maximum value. Let the spectral decomposition of the symmetric matrix be (6.16), with the eigenvalues arranged as in (6.17). Denote the partitioned matrix (6.18), compatible with (6.19); then (6.20) follows, and the maximum of (6.20) is attained by the choice in (6.21). Since the unknown matrix is a symmetric orthogonal matrix, (6.15) attains its minimum when the matrix has the form (6.22), where one block is arbitrary. Thus we obtain the following theorem.

Theorem 6.2. *Given , , denote
**
and let the eigenvalue decomposition of be (6.16). Then the least squares symmetric orthogonal solutions of the system (1.1) can be described as (6.22). *

*Algorithm 6.3. *Consider the following steps.
(1) Input $A$, $B$, $C$, and $D$.
(2) Compute the matrix given in Theorem 6.2.
(3) Compute the spectral decomposition of this matrix with the form (6.16).
(4) Compute the least squares symmetric orthogonal solutions of (1.1) according to (6.22).
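Algorithm 6.3 can be sketched in code. The sketch assumes (our reading, based on the least squares problems in [10, 13]) that system (1.1) is $AX=B$, $XC=D$ and that the matrix to be decomposed in step (2) is the symmetric part $G$ of $M=A^{T}B+DC^{T}$; the solution is then $Q\,\mathrm{sign}(\Lambda)\,Q^{T}$ from the spectral decomposition $G=Q\Lambda Q^{T}$:

```python
import numpy as np

def lstsq_symmetric_orthogonal(A, B, C, D):
    # Minimize ||A X - B||^2 + ||X C - D||^2 over symmetric orthogonal X.
    # Only the symmetric part G of M = A^T B + D C^T matters; the minimizer
    # is Q sign(Lambda) Q^T, where G = Q Lambda Q^T is a spectral decomposition.
    M = A.T @ B + D @ C.T
    G = (M + M.T) / 2.0
    lam, Q = np.linalg.eigh(G)
    s = np.where(lam >= 0.0, 1.0, -1.0)   # +1 on nonnegative eigenvalues, -1 otherwise
    return (Q * s) @ Q.T                  # symmetric orthogonal maximizer of the trace term

# Self-check on consistent data: plant a symmetric orthogonal X0
# (a Householder reflection) and verify an exact solution is recovered.
rng = np.random.default_rng(0)
n = 4
v = rng.standard_normal(n)
v /= np.linalg.norm(v)
X0 = np.eye(n) - 2.0 * np.outer(v, v)
A, C = rng.standard_normal((n, n)), rng.standard_normal((n, n))
B, D = A @ X0, X0 @ C
X = lstsq_symmetric_orthogonal(A, B, C, D)
assert np.allclose(X, X.T) and np.allclose(X @ X.T, np.eye(n))
assert np.allclose(A @ X, B) and np.allclose(X @ C, D)
```

On inconsistent data the same function returns the least squares symmetric orthogonal solution, which is how Algorithm 6.3 is applied in Example 6.4.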

*Example 6.4. *Assume

It can be verified that the given matrices $A$, $B$, $C$, and $D$ do not satisfy the solvability conditions in Theorem 4.1 or Theorem 4.2, so we derive the least squares symmetric orthogonal solutions of the system (1.1). By Algorithm 6.3, we have the following results: (1) the least squares symmetric orthogonal solution; (2) the corresponding minimum value.

*Remark 6.5. *(1) There exists a unique symmetric orthogonal solution such that (6.1) holds if and only if the matrix
where
is invertible; Example 6.4 illustrates this.

(2) The algorithm about computing the least squares skew-symmetric orthogonal solutions of the system (1.1) can be shown similarly; we omit it here.

#### 7. Conclusions

This paper has derived the solvability conditions for, and the expressions of, the orthogonal solutions, the symmetric orthogonal solutions, and the skew-symmetric orthogonal solutions of the system (1.1), and has also obtained the least squares symmetric orthogonal and skew-symmetric orthogonal solutions of the system (1.1). In addition, an algorithm and an example have been provided to compute and illustrate its least squares symmetric orthogonal solutions.

#### Acknowledgments

This research was supported by the grants from Innovation Program of Shanghai Municipal Education Commission (13ZZ080), the National Natural Science Foundation of China (11171205), the Natural Science Foundation of Shanghai (11ZR1412500), the Discipline Project at the corresponding level of Shanghai (A. 13-0101-12-005), and Shanghai Leading Academic Discipline Project (J50101).

#### References

- C. J. Meng and X. Y. Hu, “The inverse problem of symmetric orthogonal matrices and its optimal approximation,” *Mathematica Numerica Sinica*, vol. 28, pp. 269–280, 2006 (Chinese).
- C. J. Meng, X. Y. Hu, and L. Zhang, “The skew-symmetric orthogonal solutions of the matrix equation $AX=B$,” *Linear Algebra and Its Applications*, vol. 402, no. 1–3, pp. 303–318, 2005.
- D. L. Chu, H. C. Chan, and D. W. C. Ho, “Regularization of singular systems by derivative and proportional output feedback,” *SIAM Journal on Matrix Analysis and Applications*, vol. 19, no. 1, pp. 21–38, 1998.
- K. T. Joseph, “Inverse eigenvalue problem in structural design,” *AIAA Journal*, vol. 30, no. 12, pp. 2890–2896, 1992.
- A. Jameson and E. Kreinder, “Inverse problem of linear optimal control,” *SIAM Journal on Control*, vol. 11, no. 1, pp. 1–19, 1973.
- Q. W. Wang, “Bisymmetric and centrosymmetric solutions to systems of real quaternion matrix equations,” *Computers and Mathematics with Applications*, vol. 49, no. 5-6, pp. 641–650, 2005.
- Q. W. Wang, X. Liu, and S. W. Yu, “The common bisymmetric nonnegative definite solutions with extreme ranks and inertias to a pair of matrix equations,” *Applied Mathematics and Computation*, vol. 218, pp. 2761–2771, 2011.
- C. G. Khatri and S. K. Mitra, “Hermitian and nonnegative definite solutions of linear matrix equations,” *SIAM Journal on Applied Mathematics*, vol. 31, pp. 578–585, 1976.
- Q. Zhang and Q. W. Wang, “The ($P,Q$)-(skew)symmetric extremal rank solutions to a system of quaternion matrix equations,” *Applied Mathematics and Computation*, vol. 217, no. 22, pp. 9286–9296, 2011.
- F. L. Li, X. Y. Hu, and L. Zhang, “Least-squares mirrorsymmetric solution for matrix equations ($AX=B,XC=D$),” *Numerical Mathematics*, vol. 15, pp. 217–226, 2006.
- Y. X. Yuan, “Least-squares solutions to the matrix equations $AX=B$ and $XC=D$,” *Applied Mathematics and Computation*, vol. 216, no. 10, pp. 3120–3125, 2010.
- A. Dajić and J. J. Koliha, “Positive solutions to the equations $AX=C$ and $XB=D$ for Hilbert space operators,” *Journal of Mathematical Analysis and Applications*, vol. 333, no. 2, pp. 567–576, 2007.
- F. L. Li, X. Y. Hu, and L. Zhang, “The generalized reflexive solution for a class of matrix equations ($AX=B,XC=D$),” *Acta Mathematica Scientia*, vol. 28, no. 1, pp. 185–193, 2008.
- S. K. Mitra, “The matrix equations $AX=C,XB=D$,” *Linear Algebra and Its Applications*, vol. 59, pp. 171–181, 1984.
- Y. Qiu, Z. Zhang, and J. Lu, “The matrix equations $AX=B,XC=D$ with $PX=sXP$ constraint,” *Applied Mathematics and Computation*, vol. 189, no. 2, pp. 1428–1434, 2007.
- Y. Y. Qiu and A. D. Wang, “Least squares solutions to the equations $AX=B,XC=B$ with some constraints,” *Applied Mathematics and Computation*, vol. 204, no. 2, pp. 872–880, 2008.
- Q. W. Wang, X. Zhang, and Z. H. He, “On the Hermitian structures of the solution to a pair of matrix equations,” *Linear and Multilinear Algebra*, vol. 61, no. 1, pp. 73–90, 2013.
- Q. X. Xu, “Common Hermitian and positive solutions to the adjointable operator equations $AX=C,XB=D$,” *Linear Algebra and Its Applications*, vol. 429, no. 1, pp. 1–11, 2008.

#### Copyright

Copyright © 2012 Qing-Wen Wang and Juan Yu. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.