Abstract

This paper presents a full rank factorization of a 2 × 2 block matrix without any restriction on its blocks, with a view toward the group inverse. Applying this factorization, we obtain an explicit representation of the group inverse in terms of the four individual blocks of the partitioned matrix, without additional restrictions. We also derive some coincidence theorems, including expressions of the group inverse in Banachiewicz-Schur form.

1. Introduction

Let $\mathbb{C}^{m\times n}$ denote the set of all $m\times n$ complex matrices. We use $\mathcal{R}(A)$, $\mathcal{N}(A)$, and $r(A)$ to denote the range, the null space, and the rank of a matrix $A$, respectively. The Moore-Penrose inverse of a matrix $A\in\mathbb{C}^{m\times n}$ is the matrix $X\in\mathbb{C}^{n\times m}$ which satisfies
$$AXA=A,\qquad XAX=X,\qquad (AX)^{*}=AX,\qquad (XA)^{*}=XA. \qquad (1)$$
The Moore-Penrose inverse of $A$ is unique, and it is denoted by $A^{\dagger}$.
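
As a quick numerical illustration of these four conditions, the following sketch (ours, using NumPy's built-in pseudoinverse on an arbitrary rank-deficient matrix) verifies them directly:

```python
import numpy as np

A = np.array([[1., 2., 3.],
              [2., 4., 6.]])       # rank 1, so A has no ordinary inverse
X = np.linalg.pinv(A)              # Moore-Penrose inverse A^+

# The four Penrose conditions:
assert np.allclose(A @ X @ A, A)
assert np.allclose(X @ A @ X, X)
assert np.allclose((A @ X).conj().T, A @ X)
assert np.allclose((X @ A).conj().T, X @ A)
```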

Recall that the group inverse of $A\in\mathbb{C}^{n\times n}$ is the unique matrix $X$ satisfying
$$AXA=A,\qquad XAX=X,\qquad AX=XA. \qquad (2)$$
The matrix $X$ is called the group inverse of $A$, and it is denoted by $A^{\#}$.
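
As a simple illustration of these two definitions, consider the idempotent matrix $A=\begin{pmatrix}1 & 1\\ 0 & 0\end{pmatrix}$. Since $A^{2}=A$, taking $X=A$ gives $AXA=A$, $XAX=X$, and $AX=XA$, so $A^{\#}=A$. By contrast, the nilpotent matrix $N=\begin{pmatrix}0 & 1\\ 0 & 0\end{pmatrix}$ has no group inverse: $NXN=N$ together with $NX=XN$ would force $N=NXN=XN^{2}=0$, a contradiction. This reflects the general fact that $A^{\#}$ exists if and only if $r(A^{2})=r(A)$, whereas $A^{\dagger}$ always exists.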

Partitioned matrices are very useful in investigating various properties of generalized inverses and hence are widely used in matrix theory and its applications (see [14]). There are various useful ways to write a matrix as the product of two or three other matrices that have special properties. For example, linear algebra texts relate Gaussian elimination to the LU factorization and the Gram-Schmidt process to the QR factorization. In this paper, we consider a factorization based on the full rank factorization of a matrix. Our purpose is to provide an integrated theoretical development of, and a setting for understanding, a number of topics in linear algebra, such as the Moore-Penrose inverse and the group inverse.

A full rank factorization of $A\in\mathbb{C}^{m\times n}$ with $r(A)=r$ is a factorization of the form
$$A=FG, \qquad (3)$$
where $F\in\mathbb{C}^{m\times r}$ is of full column rank and $G\in\mathbb{C}^{r\times n}$ is of full row rank. Although this factorization is not unique, any choice in (3) is acceptable throughout the paper.
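
Since any admissible choice will do, the following minimal NumPy sketch (ours, not a construction from the paper) produces one full rank factorization from the compact singular value decomposition:

```python
import numpy as np

def full_rank_factorization(A, tol=1e-12):
    """One admissible full rank factorization A = F G (via the compact SVD):
    F has full column rank and G has full row rank."""
    U, s, Vh = np.linalg.svd(A)
    r = int(np.sum(s > tol))        # numerical rank of A
    F = U[:, :r] * s[:r]            # m x r, full column rank
    G = Vh[:r, :]                   # r x n, full row rank
    return F, G

A = np.array([[1., 2., 3.],
              [2., 4., 6.],
              [1., 0., 1.]])        # rank 2
F, G = full_rank_factorization(A)
assert np.allclose(F @ G, A)
print(F.shape, G.shape)             # (3, 2) (2, 3)
```

Any other choice, for instance one obtained from the reduced row echelon form, would serve equally well.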

For a complex block matrix of the form
$$M=\begin{pmatrix} A & B\\ C & D\end{pmatrix},\qquad A\in\mathbb{C}^{m\times m},\; D\in\mathbb{C}^{n\times n}, \qquad (4)$$
in the case when $A$ is invertible, the Schur complement of $A$ in $M$ is defined by $D-CA^{-1}B$. Sometimes, we denote the Schur complement of $A$ in $M$ by $M/A$. Similarly, if $D$ is invertible, then the Schur complement of $D$ in $M$ is defined by $A-BD^{-1}C$.

In the case when $A$ is not invertible, the generalized Schur complement of $A$ in $M$ is defined by
$$S=D-CA^{\dagger}B.$$
Similarly, the generalized Schur complement of $D$ in $M$ is defined by
$$T=A-BD^{\dagger}C.$$
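
The following small NumPy sketch (our illustration, with randomly generated blocks that are almost surely invertible) computes the generalized Schur complement and checks the classical fact underlying the Banachiewicz-Schur form, namely that the $(2,2)$ block of $M^{-1}$ is the inverse of the Schur complement of $A$ in $M$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3
A, B, C, D = (rng.standard_normal((n, n)) for _ in range(4))
M = np.block([[A, B], [C, D]])

# Generalized Schur complement of A in M; since a random A is (almost surely)
# invertible, it coincides here with the classical D - C A^{-1} B.
S = D - C @ np.linalg.pinv(A) @ B

# Classical fact: when A and M are invertible, the (2,2) block of M^{-1} is S^{-1}.
M_inv = np.linalg.inv(M)
assert np.allclose(M_inv[n:, n:], np.linalg.inv(S))
```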

The Schur complement and the generalized Schur complement have important applications in matrix theory, statistics, numerical analysis, applied mathematics, and so forth.

There is a great deal of work [5–8] on representations of the generalized inverse of $M$. Various other generalized inverses have also been studied by many researchers, for example, Burns et al. [6], Marsaglia and Styan [8], Benítez and Thome [9], Cvetković-Ilić et al. [10], Miao [11], Chen et al. [12], and so forth, and the references therein. The concept of the group inverse has numerous applications in matrix theory, ranging from Markov chains to generalized inverses and matrix equations. Furthermore, the group inverse of a block matrix has many applications in singular differential equations, Markov chains, iterative methods, and so forth [13–17]. Some results on the group inverse of a block matrix (operator) can be found in [18–30]. Most works in the literature concerning representations for the group inverses of partitioned matrices were carried out under certain restrictions on their blocks. Very recently, Yan [31] obtained an explicit representation of the Moore-Penrose inverse in terms of the four individual blocks of the partitioned matrix by using the full rank factorization, without any restriction. This motivates us to investigate representations of the group inverse without such restrictions.

In this paper, we aim at a new method of giving a representation of the group inverse $M^{\#}$, since there is no known representation of $M^{\#}$ with $A$, $B$, $C$, and $D$ arbitrary. The outline of our paper is as follows. In Section 2, we first present a full rank factorization of $M$ using previous results by Marsaglia and Styan [8]. Inspired by this factorization, we extend the analysis to obtain an explicit representation of the group inverse of $M$ without any restriction. Furthermore, we discuss various special forms with the corresponding consequences, including the Banachiewicz-Schur form and some other extensions as well.

2. Representation of the Group Inverse: General Case

Yan [31] initially considered the representation of the Moore-Penrose inverse of the partitioned matrix $M$ in (4) by using the full rank factorization technique. The following result is borrowed from [31, Theorem 2.2].

For convenience, we first state some notation that will be used throughout the paper. Let $A$, $B$, $C$, $D$ have the full rank factorizations, respectively; then there is a full rank factorization of the block matrix $M$: Now, the Moore-Penrose inverse of $M$ can be expressed as . In particular, when $M$ is group invertible, let ; then the full rank factorization of $M$ is This motivates us to obtain some new results concerning the group inverse by using the full rank factorization related to the group inverse.
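
The explicit block construction from [31] is not reproduced here, but the mechanism behind such representations can be illustrated numerically: for any full rank factorization $M=FG$, the standard identity $M^{\dagger}=G^{*}(GG^{*})^{-1}(F^{*}F)^{-1}F^{*}$ holds. The following sketch (ours, on a randomly generated rank-deficient block matrix) checks this against NumPy's pseudoinverse:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((2, 2)); B = rng.standard_normal((2, 3))
C = rng.standard_normal((3, 2)); D = C @ np.linalg.inv(A) @ B   # forces M to be rank deficient
M = np.block([[A, B], [C, D]])                                   # 5 x 5 matrix of rank 2

# One full rank factorization of M (compact SVD, as in the earlier sketch).
U, s, Vh = np.linalg.svd(M)
r = int(np.sum(s > 1e-12))
F, G = U[:, :r] * s[:r], Vh[:r, :]

# Standard identity: M^+ = G^* (G G^*)^{-1} (F^* F)^{-1} F^*.
M_pinv = G.T @ np.linalg.inv(G @ G.T) @ np.linalg.inv(F.T @ F) @ F.T
assert np.allclose(M_pinv, np.linalg.pinv(M))
```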

Recall that a matrix $A$ is group invertible exactly when $r(A^{2})=r(A)$; equivalently, when $GF$ is invertible for a full rank factorization $A=FG$. In that case, $A^{\#}$ can be expressed in terms of $F$ and $G$; that is,
$$A^{\#}=F(GF)^{-2}G.$$
In particular, applying this to the full rank factorization (11) of $M$, we obtain the corresponding expression (13) for $M^{\#}$.
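
A short numerical sketch of this formula (ours; the SVD-based factorization and the tolerance are implementation choices) computes $A^{\#}$ and verifies the three defining equations:

```python
import numpy as np

def group_inverse(A, tol=1e-12):
    """Group inverse via a full rank factorization A = F G:
    A^# = F (G F)^{-2} G, which exists iff G F is invertible."""
    U, s, Vh = np.linalg.svd(A)
    r = int(np.sum(s > tol))
    F, G = U[:, :r] * s[:r], Vh[:r, :]
    GF = G @ F
    if np.linalg.matrix_rank(GF, tol) < r:
        raise ValueError("A is not group invertible")
    return F @ np.linalg.inv(GF @ GF) @ G

A = np.array([[1., 1.],
              [0., 0.]])            # idempotent, hence A^# = A
X = group_inverse(A)
assert np.allclose(A @ X @ A, A)
assert np.allclose(X @ A @ X, X)
assert np.allclose(A @ X, X @ A)
assert np.allclose(X, A)
```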

The following result follows by using [31, Theorem 3.6] and (13).

Theorem 1. Let $M$ be defined by (4); then the group inverse of $M$ can be expressed as where with

If the $(1,1)$-block $A$ of $M$ is group invertible, we immediately obtain Theorem 2 by using the full rank factorization (11).

Theorem 2. Let $M$ be defined by (4). Suppose $A$ is group invertible; then the group inverse of $M$ can be expressed as where , , and the remaining quantities are the same as those in Theorem 1.

The two representations of $M^{\dagger}$ (which can be found in [31, Theorem 3.1]) will be helpful in the proofs of the following results.

Theorem 3. Let $M$ be defined by (4); then the following statements are true. (a) If is of full column rank and is of full row rank, then (b) If , , then where and .

Proof. (a) If is of full row rank, then , and hence , , , , and . Thus, , , , , defined in Theorem 1 can be simplified to which imply So, (19) is reduced to When is of full row rank, one gets which implies , , , , and . Thus, Simple computations show that Now, possesses the following form according to (18): Since one gets the expression of $M^{\#}$ by using (13):
(b) If , then , , and such that and . Letting , we have By short computations, one gets Hence, If , then , , and such that and . Letting , we have which imply So, (19) is reduced to Then, Therefore, we have

Theorem 4. Let $M$ be defined by (4); then the following statements are true. (a) If , , and , then where . (b) If , , and , then where . (c) If , , and , , , , then

Proof. (a) Since and , by Theorem 3(b), one gets Since , that is, , the equality previously mentioned simplifies to By using , we have
(b) The proof is similar to that of (a).
(c) Since and , by Theorem 3(b), one gets Since , , , , that is, , , , , the previous equality simplifies to Moreover, Therefore,

Theorem 5. Let $M$ be defined by (4), and let $S$ be the generalized Schur complement of $A$ in $M$; then the following statements are true. (a) If $A$ and $S$ are group invertible, , , and , then (b) If $A$ and $S$ are group invertible, , , , then (c) Let $A$ and $S$ be group invertible; then , , , and if and only if

Proof. (a) If and , then , , defined in (8) can be simplified to , ; and then there is a full rank factorization according to (11). Thus, where and . Denote by the Schur complement of in the partitioned matrix . Then, Applying the Banachiewicz-Schur formula, we have
Simple computations give Then,
(b) Since and , similarly to (a), there is a full rank factorization of such that We also have By using , one gets the Schur complement of in : Hence, Short computations show that Therefore,
(c) Since , , , and , according to the proof of (a) and (b), we have Hence,
The conclusion now follows from [9, Theorem 2].

Analogously to Theorem 5, if we consider the generalized Schur complement $T$ of $D$ in $M$, one can obtain the following results.

Theorem 6. Let $M$ be defined by (4), and let $T$ be the generalized Schur complement of $D$ in $M$; then the following statements are true. (a) If $D$ and $T$ are group invertible, , , , then (b) If $D$ and $T$ are group invertible, , , and , then (c) Let $D$ and $T$ be group invertible; then , , , and if and only if

Proof. The proof is similar to the proof of Theorem 5.

Combining Theorems 5 and 6, we have the following results.

Theorem 7. Let $M$ be defined by (4); let $S$, $T$ be the generalized Schur complements of $A$ and $D$ in $M$, respectively. Then the following statements are true. (a) If are group invertible, , , and , then (b) If , , , are group invertible, , , , , , and , then (c) If , , , are group invertible, , , , , , and , then (d) If , , , are group invertible, , , , , , and , then

Theorem 8. Let $M$ be defined by (4); let $S$, $T$ be the generalized Schur complements of $A$ and $D$ in $M$, respectively. Then if and only if one of the following conditions holds.

Proof. (a) Using Theorem 6(c) and Theorem 7(c), we conclude that if and only if Now, we only need to prove that (73) is equivalent to (75). Denote . Then Moreover, we have Thus, . Hence, and . Now, we get and , which means that (73) implies (75). Obviously, (75) implies (73). So, (73) is equivalent to (75).
(b) The proof is similar to (a).

3. Applications to the Solution of a Linear System

In this section, we give an application of the previous results to the solution of a linear system. Using the generalized Schur complement, we can split a large linear system into two smaller systems by the following steps.

Let
$$Mx=b \qquad (79)$$
be a linear system, where $M$ is the block matrix in (4). Applying block Gaussian elimination to the system, we have Hence, we get That is,

Now, the solution of system (79) can be obtained from the two smaller linear systems mentioned previously. In that case, the computation can be significantly simplified. We also note that the Moore-Penrose inverse of $A$ can be replaced by other generalized inverses, such as the group inverse, the Drazin inverse, another generalized inverse of $A$, or even the ordinary inverse $A^{-1}$.
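
A minimal numerical sketch of this splitting (our illustration, with invertible random blocks, so that $A^{\dagger}=A^{-1}$ and both subsystems are uniquely solvable):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 3
A, B, C, D = (rng.standard_normal((n, n)) for _ in range(4))
M = np.block([[A, B], [C, D]])
b1, b2 = rng.standard_normal(n), rng.standard_normal(n)

# Block elimination with the generalized Schur complement S = D - C A^+ B.
A_pinv = np.linalg.pinv(A)
S = D - C @ A_pinv @ B
x2 = np.linalg.solve(S, b2 - C @ A_pinv @ b1)   # small system for x2
x1 = A_pinv @ (b1 - B @ x2)                     # back substitution for x1

# The split solution agrees with solving the full system directly.
x_direct = np.linalg.solve(M, np.concatenate([b1, b2]))
assert np.allclose(np.concatenate([x1, x2]), x_direct)
```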

In the following, we will give the group inverse solutions of the linear system.

Theorem 9. Let $Mx=b$ be a linear system. Suppose $M$ satisfies all the conditions of Theorem 5(c), and partition $x$ and $b$ as $x=\begin{pmatrix} x_{1}\\ x_{2}\end{pmatrix}$ and $b=\begin{pmatrix} b_{1}\\ b_{2}\end{pmatrix}$, which have appropriate sizes, with . If , then the solution of the linear system (79) can be expressed as where .

Proof. Since , we conclude that is the solution of linear system (79). By Theorem 5(c), we can get the following: Now, it is easy to see that the solution can be expressed as which are also the group inverse solutions of the two smaller linear systems in (82), respectively.

Acknowledgments

This work was supported by the National Natural Science Foundation of China (11061005), the Ministry of Education, and Grant HCIC201103 of the Guangxi Key Laboratory of Hybrid Computational and IC Design Analysis Open Fund.