International Journal of Mathematics and Mathematical Sciences

Volume 2009, Article ID 308518, 18 pages

http://dx.doi.org/10.1155/2009/308518

## Properties of Matrix Variate Beta Type 3 Distribution

Arjun K. Gupta^{1} and Daya K. Nagar^{2}

^{1}Department of Mathematics and Statistics, Bowling Green State University, Bowling Green, OH 43403-0221, USA

^{2}Departamento de Matemáticas, Universidad de Antioquia, Calle 67, No. 53-108, Medellín, Colombia

Received 27 September 2008; Accepted 29 May 2009

Academic Editor: Kenneth Berenhaut

Copyright © 2009 Arjun K. Gupta and Daya K. Nagar. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

#### Abstract

We study several properties of the matrix variate beta type 3 distribution. We also derive probability density functions of the product of two independent random matrices when one of them is beta type 3 distributed. These densities are expressed in terms of Appell's first hypergeometric function and Humbert's confluent hypergeometric function of matrix arguments. Further, a bimatrix variate generalization of the beta type 3 distribution is defined and studied.

#### 1. Introduction

The beta type 1, beta type 2, and beta type 3 families of distributions are defined by the density functions

$$\frac{x^{a-1}(1-x)^{b-1}}{B(a,b)}, \quad 0 < x < 1, \tag{1.1}$$

$$\frac{x^{a-1}(1+x)^{-(a+b)}}{B(a,b)}, \quad x > 0, \tag{1.2}$$

and

$$\frac{2^{a}\,x^{a-1}(1-x)^{b-1}(1+x)^{-(a+b)}}{B(a,b)}, \quad 0 < x < 1, \tag{1.4}$$

respectively, where $a > 0$, $b > 0$, and $B(a,b) = \Gamma(a)\Gamma(b)/\Gamma(a+b)$ is the beta function (Johnson et al. [1], Cardeño et al. [2]).
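As a quick numerical sanity check of the scalar type 3 density, the following sketch integrates it over $(0, 1)$; the parameter values $a = 2.5$, $b = 3.0$ are arbitrary illustrative choices, not taken from the paper:

```python
import math

# Scalar (p = 1) beta type 3 density: 2^a x^(a-1) (1-x)^(b-1) (1+x)^(-(a+b)) / B(a,b)
def beta3_pdf(x, a, b):
    B = math.gamma(a) * math.gamma(b) / math.gamma(a + b)
    return 2.0 ** a * x ** (a - 1) * (1 - x) ** (b - 1) * (1 + x) ** (-(a + b)) / B

def midpoint_integral(f, lo, hi, n=100000):
    # Midpoint rule; adequate for a smooth integrand on a bounded interval.
    h = (hi - lo) / n
    return h * sum(f(lo + (i + 0.5) * h) for i in range(n))

total = midpoint_integral(lambda x: beta3_pdf(x, 2.5, 3.0), 0.0, 1.0)
print(round(total, 3))  # the density integrates to 1
```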

#### 2. Preliminary Results

The integral representations of the confluent hypergeometric function ${}_1F_1$ and the Gauss hypergeometric function ${}_2F_1$ of a symmetric $p \times p$ matrix argument $X$ are given by

$${}_1F_1(a; c; X) = \frac{\Gamma_p(c)}{\Gamma_p(a)\Gamma_p(c-a)} \int_{0 < T < I_p} \operatorname{etr}(XT)\, \det(T)^{a-(p+1)/2} \det(I_p - T)^{c-a-(p+1)/2} \, \mathrm{d}T,$$

$${}_2F_1(a, b; c; X) = \frac{\Gamma_p(c)}{\Gamma_p(a)\Gamma_p(c-a)} \int_{0 < T < I_p} \det(T)^{a-(p+1)/2} \det(I_p - T)^{c-a-(p+1)/2} \det(I_p - XT)^{-b} \, \mathrm{d}T, \tag{2.10}$$

where $\operatorname{Re}(a) > (p-1)/2$ and $\operatorname{Re}(c-a) > (p-1)/2$.

where $\phi \in \kappa \cdot \lambda$ signifies that the irreducible representation indexed by $2\phi$ occurs in the decomposition of the Kronecker product of the irreducible representations indexed by $2\kappa$ and $2\lambda$ (Davis [6, 7]). Further,

In expressions (2.15) and (2.16), $\theta_\phi^{\kappa,\lambda}$ is defined by

$$\theta_\phi^{\kappa,\lambda} = \frac{C_\phi^{\kappa,\lambda}(I_p, I_p)}{C_\phi(I_p)}.$$

Note that $\Gamma_p(a, 0) = \Gamma_p(a)$, which is the multivariate gamma function.

The matrix variate generalizations of (1.1), (1.2), and (1.4) are given as follows (Gupta and Nagar [3, 4]).

*Definition 2.1. *A $p \times p$ random symmetric positive definite matrix $U$ is said to have a matrix variate beta type 1 distribution with parameters $(a, b)$, denoted as $U \sim B1(p; a, b)$, if its p.d.f. is given by
$$\frac{\det(U)^{a-(p+1)/2}\,\det(I_p - U)^{b-(p+1)/2}}{B_p(a, b)}, \quad 0 < U < I_p,$$
where $\operatorname{Re}(a) > (p-1)/2$, $\operatorname{Re}(b) > (p-1)/2$, and $B_p(a, b) = \Gamma_p(a)\Gamma_p(b)/\Gamma_p(a+b)$ is the multivariate beta function.

If $U \sim B1(p; a, b)$, then the cumulative distribution function of $U$ is given by
$$P(U < \Omega) = \frac{\Gamma_p\!\left(\frac{p+1}{2}\right)\Gamma_p(a+b)}{\Gamma_p\!\left(a + \frac{p+1}{2}\right)\Gamma_p(b)}\, \det(\Omega)^{a}\; {}_2F_1\!\left(a, \frac{p+1}{2} - b; a + \frac{p+1}{2}; \Omega\right), \quad 0 < \Omega \leq I_p.$$

*Definition 2.2. *A $p \times p$ random symmetric positive definite matrix $V$ is said to have a matrix variate beta type 2 distribution with parameters $(a, b)$, denoted as $V \sim B2(p; a, b)$, if its p.d.f. is given by
$$\frac{\det(V)^{a-(p+1)/2}\,\det(I_p + V)^{-(a+b)}}{B_p(a, b)}, \quad V > 0,$$
where $\operatorname{Re}(a) > (p-1)/2$ and $\operatorname{Re}(b) > (p-1)/2$.

*Definition 2.3. *A $p \times p$ random symmetric positive definite matrix $W$ is said to have a matrix variate beta type 3 distribution with parameters $(a, b)$, denoted as $W \sim B3(p; a, b)$, if its p.d.f. is given by
$$\frac{2^{pa}\,\det(W)^{a-(p+1)/2}\,\det(I_p - W)^{b-(p+1)/2}\,\det(I_p + W)^{-(a+b)}}{B_p(a, b)}, \quad 0 < W < I_p,$$
where $\operatorname{Re}(a) > (p-1)/2$ and $\operatorname{Re}(b) > (p-1)/2$.
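In the scalar case the type 3 density arises from a type 1 variable via $w = u/(2-u)$, equivalently $u = 2w/(1+w)$. The change-of-variables identity $f_W(w) = f_U(2w/(1+w)) \cdot 2/(1+w)^2$ can be checked pointwise; the parameter values below are illustrative:

```python
import math

def beta1_pdf(u, a, b):
    # scalar beta type 1 density
    B = math.gamma(a) * math.gamma(b) / math.gamma(a + b)
    return u ** (a - 1) * (1 - u) ** (b - 1) / B

def beta3_pdf(w, a, b):
    # scalar beta type 3 density
    B = math.gamma(a) * math.gamma(b) / math.gamma(a + b)
    return 2.0 ** a * w ** (a - 1) * (1 - w) ** (b - 1) * (1 + w) ** (-(a + b)) / B

a, b = 1.7, 2.4  # arbitrary illustrative parameters
for w in (0.1, 0.35, 0.6, 0.9):
    u = 2 * w / (1 + w)                          # inverse of w = u / (2 - u)
    lhs = beta3_pdf(w, a, b)
    rhs = beta1_pdf(u, a, b) * 2 / (1 + w) ** 2  # du/dw = 2 / (1 + w)^2
    assert abs(lhs - rhs) < 1e-9
print("ok")
```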

#### 3. Hypergeometric Functions of Two Matrices

In this section we define Appell's first hypergeometric function $F_1$ and Humbert's confluent hypergeometric function $\Phi_1$ of symmetric $p \times p$ matrices $X$ and $Y$ and give their series expansions involving invariant polynomials. Following Prudnikov et al. [10, equations 7.2.4(), ()], $F_1$ and $\Phi_1$ are defined as

$$F_1(a; b, b'; c; X, Y) = \frac{\Gamma_p(c)}{\Gamma_p(a)\Gamma_p(c-a)} \int_{0<T<I_p} \det(T)^{a-(p+1)/2} \det(I_p - T)^{c-a-(p+1)/2} \det(I_p - XT)^{-b} \det(I_p - YT)^{-b'} \, \mathrm{d}T, \tag{3.1}$$

$$\Phi_1(a, b; c; X, Y) = \frac{\Gamma_p(c)}{\Gamma_p(a)\Gamma_p(c-a)} \int_{0<T<I_p} \det(T)^{a-(p+1)/2} \det(I_p - T)^{c-a-(p+1)/2} \det(I_p - XT)^{-b} \operatorname{etr}(YT) \, \mathrm{d}T, \tag{3.2}$$

respectively, where $\operatorname{Re}(a) > (p-1)/2$ and $\operatorname{Re}(c-a) > (p-1)/2$. Note that $F_1(a; b, b'; c; X, 0) = {}_2F_1(a, b; c; X)$ and $\Phi_1(a, b; c; 0, Y) = {}_1F_1(a; c; Y)$, so that $F_1$ and $\Phi_1$ reduce to the ${}_2F_1$ and ${}_1F_1$ functions, respectively.

Expanding $\det(I_p - XT)^{-b}$, $\det(I_p - YT)^{-b'}$, and $\operatorname{etr}(YT)$ using (2.6) and (2.5), and applying (2.14), one can write

$$\det(I_p - XT)^{-b}\,\det(I_p - YT)^{-b'} = \sum_{k,\ell=0}^{\infty} \sum_{\kappa,\lambda} \sum_{\phi \in \kappa\cdot\lambda} \frac{(b)_\kappa\,(b')_\lambda}{k!\,\ell!}\, \theta_\phi^{\kappa,\lambda}\, C_\phi^{\kappa,\lambda}(XT, YT), \tag{3.3}$$

$$\det(I_p - XT)^{-b}\,\operatorname{etr}(YT) = \sum_{k,\ell=0}^{\infty} \sum_{\kappa,\lambda} \sum_{\phi \in \kappa\cdot\lambda} \frac{(b)_\kappa}{k!\,\ell!}\, \theta_\phi^{\kappa,\lambda}\, C_\phi^{\kappa,\lambda}(XT, YT). \tag{3.4}$$

Now, substituting (3.3) and (3.4) in (3.1) and (3.2), respectively, and integrating $T$ using (2.16), the series expansions for $F_1$ and $\Phi_1$ are derived as

$$F_1(a; b, b'; c; X, Y) = \sum_{k,\ell=0}^{\infty} \sum_{\kappa,\lambda} \sum_{\phi \in \kappa\cdot\lambda} \frac{(a)_\phi\,(b)_\kappa\,(b')_\lambda}{(c)_\phi\, k!\,\ell!}\, \theta_\phi^{\kappa,\lambda}\, C_\phi^{\kappa,\lambda}(X, Y),$$

$$\Phi_1(a, b; c; X, Y) = \sum_{k,\ell=0}^{\infty} \sum_{\kappa,\lambda} \sum_{\phi \in \kappa\cdot\lambda} \frac{(a)_\phi\,(b)_\kappa}{(c)_\phi\, k!\,\ell!}\, \theta_\phi^{\kappa,\lambda}\, C_\phi^{\kappa,\lambda}(X, Y).$$
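In the scalar ($p = 1$) case, Appell's $F_1$ admits both a double power series and an Euler-type integral representation; the sketch below cross-checks the two numerically (the chosen numeric values are illustrative, not from the paper):

```python
import math

def poch(x, n):
    # Pochhammer symbol (x)_n
    r = 1.0
    for i in range(n):
        r *= x + i
    return r

def F1_series(a, b1, b2, c, x, y, terms=50):
    # Appell F1 as a double power series; converges for |x|, |y| < 1.
    s = 0.0
    for m in range(terms):
        for n in range(terms):
            s += (poch(a, m + n) * poch(b1, m) * poch(b2, n)
                  / (poch(c, m + n) * math.factorial(m) * math.factorial(n))
                  * x ** m * y ** n)
    return s

def F1_integral(a, b1, b2, c, x, y, n=100000):
    # Euler-type integral representation, midpoint rule; needs Re(c) > Re(a) > 0.
    coef = math.gamma(c) / (math.gamma(a) * math.gamma(c - a))
    h = 1.0 / n
    s = 0.0
    for i in range(n):
        t = (i + 0.5) * h
        s += (t ** (a - 1) * (1 - t) ** (c - a - 1)
              * (1 - x * t) ** (-b1) * (1 - y * t) ** (-b2))
    return coef * s * h

args = (1.5, 0.8, 1.1, 3.2, 0.3, -0.4)  # a, b, b', c, x, y (illustrative)
diff = abs(F1_series(*args) - F1_integral(*args))
print(diff < 1e-4)
```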

#### 4. Properties

In this section we derive several properties of the matrix variate beta type 3 distribution. For the sake of completeness we first state the following results established in Gupta and Nagar [4].

(1) Let $W \sim B3(p; a, b)$ and let $A$ be a $p \times p$ constant nonsingular matrix. Then, the density of $AWA'$ is
$$\frac{2^{pa}\,\det(AA')^{(p+1)/2}\,\det(X)^{a-(p+1)/2}\,\det(AA' - X)^{b-(p+1)/2}}{B_p(a, b)\,\det(AA' + X)^{a+b}}, \quad 0 < X < AA'.$$

(2) Let $W \sim B3(p; a, b)$ and let $H$ be a $p \times p$ orthogonal matrix whose elements are either constants or random variables distributed independently of $W$. Then, the distribution of $W$ is invariant under the transformation $W \to HWH'$, and $HWH'$ is distributed independently of $H$ in the latter case.

(3) Let $W \sim B3(p; a, b)$. Then, the density of $W^{-1}$ is
$$\frac{2^{pa}\,\det(Y - I_p)^{b-(p+1)/2}}{B_p(a, b)\,\det(Y + I_p)^{a+b}}, \quad Y > I_p.$$

(4) If $W \sim B3(p; a, b)$, then $2(I_p + W)^{-1/2} W (I_p + W)^{-1/2} \sim B1(p; a, b)$ and $2(I_p - W)^{-1/2} W (I_p - W)^{-1/2} \sim B2(p; a, b)$.

(5) If $U \sim B1(p; a, b)$, then $(2I_p - U)^{-1/2} U (2I_p - U)^{-1/2} \sim B3(p; a, b)$ and $(I_p + U)^{-1/2} (I_p - U) (I_p + U)^{-1/2} \sim B3(p; b, a)$.

(6) If $V \sim B2(p; a, b)$, then $V^{-1} \sim B2(p; b, a)$, $(I_p + V)^{-1} \sim B1(p; b, a)$, $(2I_p + V)^{-1/2} V (2I_p + V)^{-1/2} \sim B3(p; a, b)$, and $(I_p + 2V)^{-1} \sim B3(p; b, a)$.

(7) Let , . Define and . If , then and .

(8) Let be a constant matrix of rank (). If , then .

(9) Let and , , then . Further, if is a random vector, independent of , and , then it follows that .

From the above results it is straightforward to show that, if $y$ ($p \times 1$) is a nonzero constant vector or a random vector independent of $W$ with , then .

The expectation of $W$ can easily be obtained from the above results. For any fixed orthogonal matrix $H$, result (2) gives $E(W) = E(HWH') = H\,E(W)\,H'$. Hence, $E(W)\,H = H\,E(W)$ for all orthogonal $H$, which implies that $E(W)$ is a scalar multiple of the identity matrix $I_p$.

The matrix variate beta type 3 distribution can be derived by using independent gamma matrices. A $p \times p$ random symmetric positive definite matrix $S$ is said to have a matrix variate gamma distribution with parameters $\Sigma$ ($p \times p$, symmetric positive definite) and $a$ ($\operatorname{Re}(a) > (p-1)/2$), denoted by $S \sim Ga(p, a, \Sigma)$, if its p.d.f. is given by
$$\frac{\det(\Sigma)^{-a}}{\Gamma_p(a)}\,\operatorname{etr}(-\Sigma^{-1} S)\,\det(S)^{a-(p+1)/2}, \quad S > 0.$$
It is well known that if $S_1$ and $S_2$ are independent, $S_1 \sim Ga(p, a, I_p)$, $S_2 \sim Ga(p, b, I_p)$, then (i) $S_1 + S_2$ and $(S_1 + S_2)^{-1/2} S_1 (S_1 + S_2)^{-1/2}$ are independent and (ii) $S_1 + S_2$ and $S_2^{-1/2} S_1 S_2^{-1/2}$ are independent. Further, $(S_1 + S_2)^{-1/2} S_1 (S_1 + S_2)^{-1/2} \sim B1(p; a, b)$, $S_2^{-1/2} S_1 S_2^{-1/2} \sim B2(p; a, b)$, and $S_1 + S_2 \sim Ga(p, a + b, I_p)$. In the following theorem we derive a similar result for the matrix variate beta type 3 distribution.

Theorem 4.1. *Let the random matrices $S_1$ and $S_2$ be independent, $S_1 \sim Ga(p, a, I_p)$, $S_2 \sim Ga(p, b, I_p)$. Then, $(S_1 + 2S_2)^{-1/2}\, S_1\, (S_1 + 2S_2)^{-1/2} \sim B3(p; a, b)$.*

*Proof. *The joint density function of $S_1$ and $S_2$ is given by
$$\frac{\operatorname{etr}(-S_1 - S_2)\,\det(S_1)^{a-(p+1)/2}\,\det(S_2)^{b-(p+1)/2}}{\Gamma_p(a)\,\Gamma_p(b)}, \quad S_1 > 0, \ S_2 > 0.$$
Making the transformation $W = (S_1 + 2S_2)^{-1/2} S_1 (S_1 + 2S_2)^{-1/2}$ and $Z = S_1 + 2S_2$ with the Jacobian $J(S_1, S_2 \to W, Z) = 2^{-p(p+1)/2}\,\det(Z)^{(p+1)/2}$ in the joint density of $S_1$ and $S_2$, we obtain the joint density of $W$ and $Z$ as
$$\frac{2^{-pb}\,\det(W)^{a-(p+1)/2}\,\det(I_p - W)^{b-(p+1)/2}}{\Gamma_p(a)\,\Gamma_p(b)}\;\operatorname{etr}\!\left(-\frac{1}{2}\,Z(I_p + W)\right)\det(Z)^{a+b-(p+1)/2}, \quad 0 < W < I_p, \ Z > 0.$$
Now, the desired result is obtained by integrating $Z$ using (2.1).
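In the scalar ($p = 1$) case this construction reduces to $W = A/(A + 2B)$ with $A$ and $B$ independent gamma variables. The Monte Carlo sketch below (seed, sample size, and shape parameters are arbitrary choices) compares the simulated mean of $W$ with the mean computed from the scalar type 3 density:

```python
import math
import random

random.seed(7)
a, b = 3.0, 2.0  # illustrative shape parameters

# Monte Carlo mean of W = A / (A + 2B), the scalar analogue of the construction.
n = 200000
acc = 0.0
for _ in range(n):
    x = random.gammavariate(a, 1.0)
    y = random.gammavariate(b, 1.0)
    acc += x / (x + 2.0 * y)
mc_mean = acc / n

# Mean computed by integrating w times the scalar type 3 density (midpoint rule).
B = math.gamma(a) * math.gamma(b) / math.gamma(a + b)
m = 100000
h = 1.0 / m
pdf_mean = 0.0
for i in range(m):
    w = (i + 0.5) * h
    pdf_mean += w * 2.0 ** a * w ** (a - 1) * (1 - w) ** (b - 1) * (1 + w) ** (-(a + b)) / B
pdf_mean *= h

print(abs(mc_mean - pdf_mean) < 0.01)
```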

Next, we derive the cumulative distribution function (cdf) and several expected values of functions of beta type 3 matrix.

If $W \sim B3(p; a, b)$, then the cdf of $W$, denoted by $P(W < \Omega)$, is given by
$$P(W < \Omega) = \frac{2^{pa}}{B_p(a, b)} \int_{0 < W < \Omega} \det(W)^{a-(p+1)/2}\,\det(I_p - W)^{b-(p+1)/2}\,\det(I_p + W)^{-(a+b)}\,\mathrm{d}W,$$
where $0 < \Omega \leq I_p$. Now, substituting $W = \Omega^{1/2} X \Omega^{1/2}$ with the Jacobian $\det(\Omega)^{(p+1)/2}$ and using (2.19), the cdf is obtained as
$$P(W < \Omega) = \frac{2^{pa}\,\Gamma_p\!\left(\frac{p+1}{2}\right)\Gamma_p(a+b)}{\Gamma_p\!\left(a + \frac{p+1}{2}\right)\Gamma_p(b)}\,\det(\Omega)^{a}\; F_1\!\left(a; \frac{p+1}{2} - b,\, a + b;\, a + \frac{p+1}{2};\, \Omega, -\Omega\right).$$
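In the scalar case the cdf reduces to an incomplete beta integral: under $u = 2w/(1+w)$, $P(W \leq w_0)$ equals the type 1 cdf evaluated at $2w_0/(1+w_0)$. A numerical check with illustrative parameters:

```python
import math

a, b = 2.2, 3.5  # illustrative parameters

def beta1_pdf(u):
    B = math.gamma(a) * math.gamma(b) / math.gamma(a + b)
    return u ** (a - 1) * (1 - u) ** (b - 1) / B

def beta3_pdf(w):
    B = math.gamma(a) * math.gamma(b) / math.gamma(a + b)
    return 2.0 ** a * w ** (a - 1) * (1 - w) ** (b - 1) * (1 + w) ** (-(a + b)) / B

def cdf(pdf, hi, n=100000):
    # midpoint-rule integral of pdf from 0 to hi
    h = hi / n
    return h * sum(pdf((i + 0.5) * h) for i in range(n))

for w0 in (0.2, 0.5, 0.8):
    assert abs(cdf(beta3_pdf, w0) - cdf(beta1_pdf, 2 * w0 / (1 + w0))) < 1e-4
print("ok")
```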

Theorem 4.2. *Let $W \sim B3(p; a, b)$, then
$$E\!\left[\det(W)^{h}\,\det(I_p - W)^{r}\right] = \frac{2^{-pb}\,B_p(a+h, b+r)}{B_p(a, b)}\; {}_2F_1\!\left(b + r,\, a + b;\, a + b + h + r;\, \frac{1}{2} I_p\right),$$
where $\operatorname{Re}(a + h) > (p-1)/2$ and $\operatorname{Re}(b + r) > (p-1)/2$.*

*Proof. *By definition,
$$E\!\left[\det(W)^{h}\,\det(I_p - W)^{r}\right] = \frac{2^{pa}}{B_p(a, b)} \int_{0 < W < I_p} \det(W)^{a+h-(p+1)/2}\,\det(I_p - W)^{b+r-(p+1)/2}\,\det(I_p + W)^{-(a+b)}\,\mathrm{d}W.$$
Writing
$$\det(I_p + W)^{-(a+b)} = 2^{-p(a+b)}\,\det\!\left(I_p - \frac{1}{2}(I_p - W)\right)^{-(a+b)}$$
and substituting $X = I_p - W$, we have
$$E\!\left[\det(W)^{h}\,\det(I_p - W)^{r}\right] = \frac{2^{-pb}}{B_p(a, b)} \int_{0 < X < I_p} \det(X)^{b+r-(p+1)/2}\,\det(I_p - X)^{a+h-(p+1)/2}\,\det\!\left(I_p - \frac{X}{2}\right)^{-(a+b)} \mathrm{d}X = \frac{2^{-pb}\,B_p(a+h, b+r)}{B_p(a, b)}\; {}_2F_1\!\left(b + r, a + b; a + b + h + r; \frac{1}{2} I_p\right),$$
where the integral has been evaluated using the integral representation of the Gauss hypergeometric function given in (2.10).

Corollary 4.3. *Let $W \sim B3(p; a, b)$, then for $\operatorname{Re}(a + h) > (p-1)/2$, one has
$$E\!\left[\det(W)^{h}\right] = \frac{2^{-pb}\,B_p(a+h, b)}{B_p(a, b)}\; {}_2F_1\!\left(b,\, a + b;\, a + b + h;\, \frac{1}{2} I_p\right).$$
Further, for $\operatorname{Re}(b + r) > (p-1)/2$,
$$E\!\left[\det(I_p - W)^{r}\right] = \frac{2^{-pb}\,B_p(a, b+r)}{B_p(a, b)}\; {}_2F_1\!\left(b + r,\, a + b;\, a + b + r;\, \frac{1}{2} I_p\right).$$*
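Assuming the scalar ($p = 1$) form of the first moment formula, $E[W^h] = 2^{-b}\,B(a+h, b)\,{}_2F_1(b, a+b; a+b+h; 1/2)/B(a, b)$, the sketch below cross-checks it against direct numerical integration (parameters and $h$ are illustrative):

```python
import math

def gauss_2f1(p_, q_, c_, z, terms=200):
    # Gauss hypergeometric series via the term recurrence; converges for |z| < 1.
    s, term = 0.0, 1.0
    for k in range(terms):
        s += term
        term *= (p_ + k) * (q_ + k) / ((c_ + k) * (k + 1)) * z
    return s

def B(x, y):
    return math.gamma(x) * math.gamma(y) / math.gamma(x + y)

a, b, h = 2.5, 3.0, 2.0  # illustrative values

# Closed form: E[W^h] = 2^{-b} B(a+h, b)/B(a, b) * 2F1(b, a+b; a+b+h; 1/2)
closed = 2.0 ** (-b) * B(a + h, b) / B(a, b) * gauss_2f1(b, a + b, a + b + h, 0.5)

# Direct numerical integration of w^h times the scalar type 3 density.
n = 100000
step = 1.0 / n
direct = 0.0
for i in range(n):
    w = (i + 0.5) * step
    direct += w ** h * 2.0 ** a * w ** (a - 1) * (1 - w) ** (b - 1) * (1 + w) ** (-(a + b)) / B(a, b)
direct *= step

print(abs(closed - direct) < 1e-6)
```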

From the density of , we have Now, expanding in series involving zonal polynomials using (2.6), the above expression is rewritten as Further, writing and integrating using (2.15), we get

#### 5. Distributions of Random Quadratic Forms

In this section we obtain distributional results for the product of two independent random matrices involving the beta type 3 distribution.

Theorem 5.1. *Let and be independent. Then, the p.d.f. of is
*

*Proof. *Using the independence, the joint p.d.f. of and is given by
where , and
Transforming , with the Jacobian we obtain the joint p.d.f. of and as
where . To find the marginal p.d.f. of , we integrate (5.4) with respect to to get
In (5.5) change of variable with the Jacobian yields
where the last step has been obtained by using the definition of . Finally, substituting for we obtain the desired result.

Corollary 5.2. *Let and be independent random matrices, and . If , then the p.d.f. of is given by
*

Theorem 5.3. *Let and be independent random matrices, and . Then, the p.d.f. of is given by
*

*Proof. *Since and are independent, their joint p.d.f. is given by
where , , and
Now consider the transformation and whose Jacobian is . Thus, we obtain the joint p.d.f. of and as
where and . Finally, integrating using (3.1) and substituting for , we obtain the desired result.

In the next theorem we derive the density of , where the random matrices and are independent, , and the distribution of is matrix variate gamma.

Theorem 5.4. *Let the random matrices and be independent, and . Then, the p.d.f. of is given by
**
where *

*Proof. *The joint p.d.f. of and is given by
where and . Now, transforming and , with the Jacobian , we obtain the joint p.d.f. of and as
where and . Now, integrating using (3.2), we get the marginal density of .

#### 6. Bimatrix Beta Type 3 Distribution

The bimatrix generalization of the beta type 1 density is defined by
$$\frac{\det(U_1)^{a-(p+1)/2}\,\det(U_2)^{b-(p+1)/2}\,\det(I_p - U_1 - U_2)^{c-(p+1)/2}}{B_p(a, b, c)}, \tag{6.1}$$
where $U_1 > 0$, $U_2 > 0$, $I_p - U_1 - U_2 > 0$, $\operatorname{Re}(a) > (p-1)/2$, $\operatorname{Re}(b) > (p-1)/2$, $\operatorname{Re}(c) > (p-1)/2$, and
$$B_p(a, b, c) = \frac{\Gamma_p(a)\,\Gamma_p(b)\,\Gamma_p(c)}{\Gamma_p(a + b + c)}.$$
This distribution, denoted by $(U_1, U_2) \sim B1(p; a, b; c)$, is a special case of the matrix variate Dirichlet type 1 distribution. The random symmetric positive definite matrices $V_1$ and $V_2$ are said to have a bimatrix variate generalization of the beta type 2 distribution, denoted as $(V_1, V_2) \sim B2(p; a, b; c)$, if their joint p.d.f. is given by
$$\frac{\det(V_1)^{a-(p+1)/2}\,\det(V_2)^{b-(p+1)/2}\,\det(I_p + V_1 + V_2)^{-(a+b+c)}}{B_p(a, b, c)}, \tag{6.3}$$
where $V_1 > 0$, $V_2 > 0$, $\operatorname{Re}(a) > (p-1)/2$, $\operatorname{Re}(b) > (p-1)/2$, and $\operatorname{Re}(c) > (p-1)/2$.
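The type 1 (Dirichlet) case can be simulated directly from independent gamma variables; a scalar ($p = 1$) Monte Carlo sketch checking the classical moment $E[U_1] = a/(a+b+c)$ (seed and parameters are arbitrary choices):

```python
import random

random.seed(1)
a, b, c = 2.0, 3.0, 4.0  # illustrative parameters
n = 200000
acc = 0.0
for _ in range(n):
    x1 = random.gammavariate(a, 1.0)
    x2 = random.gammavariate(b, 1.0)
    x3 = random.gammavariate(c, 1.0)
    acc += x1 / (x1 + x2 + x3)   # U1 coordinate of a Dirichlet(a, b, c) vector
print(abs(acc / n - a / (a + b + c)) < 0.005)
```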

A natural bimatrix generalization of the beta type 3 distribution can be given as follows.

*Definition 6.1. *The symmetric positive definite random matrices $W_1$ and $W_2$ are said to have a bimatrix beta type 3 distribution, denoted as $(W_1, W_2) \sim B3(p; a, b; c)$, if their joint p.d.f. is given by
$$\frac{2^{p(a+b)}\,\det(W_1)^{a-(p+1)/2}\,\det(W_2)^{b-(p+1)/2}\,\det(I_p - W_1 - W_2)^{c-(p+1)/2}}{B_p(a, b, c)\,\det(I_p + W_1 + W_2)^{a+b+c}}, \tag{6.4}$$
where $W_1 > 0$, $W_2 > 0$, $I_p - W_1 - W_2 > 0$, $\operatorname{Re}(a) > (p-1)/2$, $\operatorname{Re}(b) > (p-1)/2$, and $\operatorname{Re}(c) > (p-1)/2$.

The bimatrix beta type 3 distribution belongs to the Liouville family of distributions and can be obtained using independent gamma matrices as shown in the following theorem.

Theorem 6.2. *Let $S_1$, $S_2$, and $S_3$ be independent, $S_1 \sim Ga(p, a, I_p)$, $S_2 \sim Ga(p, b, I_p)$, $S_3 \sim Ga(p, c, I_p)$. Define $W_i = (S_1 + S_2 + 2S_3)^{-1/2}\, S_i\, (S_1 + S_2 + 2S_3)^{-1/2}$, $i = 1, 2$. Then, $(W_1, W_2) \sim B3(p; a, b; c)$.*

*Proof. *Similar to the proof of Theorem 4.1.

The next two theorems derive the bimatrix beta type 3 distribution from the bimatrix beta type 1 and type 2 distributions.

Theorem 6.3. *Let $(U_1, U_2) \sim B1(p; a, b; c)$ and define
$$W_i = (2I_p - U_1 - U_2)^{-1/2}\, U_i\, (2I_p - U_1 - U_2)^{-1/2}, \quad i = 1, 2. \tag{6.5}$$
Then, $(W_1, W_2) \sim B3(p; a, b; c)$.*

*Proof. *Let $S = W_1 + W_2$ and $R = U_1 + U_2$. Then, $U_i = 2(I_p + S)^{-1/2}\, W_i\, (I_p + S)^{-1/2}$, $i = 1, 2$. The Jacobian of the transformation (6.5) is given by
$$J(U_1, U_2 \to W_1, W_2) = 2^{p(p+1)}\,\det(I_p + W_1 + W_2)^{-3(p+1)/2}.$$
Now, substituting $U_1$, $U_2$, and the Jacobian in the joint density of $U_1$ and $U_2$ given in (6.1), we get the desired result.

Theorem 6.4. *Let $(V_1, V_2) \sim B2(p; a, b; c)$ and define
$$W_i = (2I_p + V_1 + V_2)^{-1/2}\, V_i\, (2I_p + V_1 + V_2)^{-1/2}, \quad i = 1, 2. \tag{6.7}$$
Then, $(W_1, W_2) \sim B3(p; a, b; c)$.*

*Proof. *Let $S = W_1 + W_2$ and $R = V_1 + V_2$. Then, $V_i = 2(I_p - S)^{-1/2}\, W_i\, (I_p - S)^{-1/2}$, $i = 1, 2$. The Jacobian of the transformation (6.7) is given by
$$J(V_1, V_2 \to W_1, W_2) = 2^{p(p+1)}\,\det(I_p - W_1 - W_2)^{-3(p+1)/2}.$$
Now, substitution of $V_1$, $V_2$, along with the Jacobian, in the joint density of $V_1$ and $V_2$ given in (6.3) yields the desired result.

The marginal distribution of , when the random matrices and follow a bimatrix beta type 3 distribution, is given next.

Theorem 6.5. *Let $(W_1, W_2) \sim B3(p; a, b; c)$. Then, the marginal p.d.f. of $W_1$ is given by
$$\frac{2^{p(a+b)}\,\det(W_1)^{a-(p+1)/2}\,\det(I_p - W_1)^{b+c-(p+1)/2}}{B_p(a, b+c)\,\det(I_p + W_1)^{a+b+c}}\; {}_2F_1\!\left(b,\, a+b+c;\, b+c;\, -(I_p - W_1)(I_p + W_1)^{-1}\right), \tag{6.9}$$
where $0 < W_1 < I_p$. Further, the marginal p.d.f. of $W_2$ is obtained from (6.9) by interchanging $a$ and $b$ and replacing $W_1$ by $W_2$.*

*Proof. *Substituting $W_2 = (I_p - W_1)^{1/2}\, X\, (I_p - W_1)^{1/2}$ with the Jacobian $\det(I_p - W_1)^{(p+1)/2}$ in (6.4), the joint density of $W_1$ and $X$ is derived as
$$\frac{2^{p(a+b)}\,\det(W_1)^{a-(p+1)/2}\,\det(I_p - W_1)^{b+c-(p+1)/2}\,\det(X)^{b-(p+1)/2}\,\det(I_p - X)^{c-(p+1)/2}}{B_p(a, b, c)\,\det\!\left(I_p + W_1 + (I_p - W_1)^{1/2} X (I_p - W_1)^{1/2}\right)^{a+b+c}}, \tag{6.10}$$
where $0 < W_1 < I_p$ and $0 < X < I_p$. Now, integration of the above expression with respect to $X$ yields the marginal density of $W_1$. Writing
$$\det\!\left(I_p + W_1 + (I_p - W_1)^{1/2} X (I_p - W_1)^{1/2}\right)^{-(a+b+c)} = \det(I_p + W_1)^{-(a+b+c)}\,\det\!\left(I_p + (I_p - W_1)^{1/2}(I_p + W_1)^{-1}(I_p - W_1)^{1/2}\, X\right)^{-(a+b+c)},$$
the integral over $X$ is evaluated using the integral representation of the Gauss hypergeometric function given in (2.10) as
$$\int_{0 < X < I_p} \det(X)^{b-(p+1)/2}\,\det(I_p - X)^{c-(p+1)/2}\,\det\!\left(I_p + (I_p - W_1)^{1/2}(I_p + W_1)^{-1}(I_p - W_1)^{1/2}\, X\right)^{-(a+b+c)} \mathrm{d}X = \frac{\Gamma_p(b)\,\Gamma_p(c)}{\Gamma_p(b+c)}\; {}_2F_1\!\left(b, a+b+c; b+c; -(I_p - W_1)(I_p + W_1)^{-1}\right).$$
Finally, substituting this in the joint density and simplifying the resulting expression, we obtain the desired result.

Using the Pfaff transformation
$${}_2F_1(\alpha, \beta; \gamma; X) = \det(I_p - X)^{-\beta}\; {}_2F_1\!\left(\gamma - \alpha,\, \beta;\, \gamma;\, -X(I_p - X)^{-1}\right),$$
the Gauss hypergeometric function given in (6.9) can be rewritten as
$${}_2F_1\!\left(b, a+b+c; b+c; -(I_p - W_1)(I_p + W_1)^{-1}\right) = 2^{-p(a+b+c)}\,\det(I_p + W_1)^{a+b+c}\; {}_2F_1\!\left(c,\, a+b+c;\, b+c;\, \frac{1}{2}(I_p - W_1)\right).$$
Hence, the density of $W_1$ can also be written as
$$\frac{2^{-pc}\,\det(W_1)^{a-(p+1)/2}\,\det(I_p - W_1)^{b+c-(p+1)/2}}{B_p(a, b+c)}\; {}_2F_1\!\left(c,\, a+b+c;\, b+c;\, \frac{1}{2}(I_p - W_1)\right), \quad 0 < W_1 < I_p.$$
It can clearly be observed that the p.d.f. in (6.9) is not a beta type 3 density; it differs by a factor involving the Gauss hypergeometric function. In the next theorem we give the distribution of the sum of random matrices distributed jointly as bimatrix beta type 3.

Theorem 6.6. *Let $(W_1, W_2) \sim B3(p; a, b; c)$. Define $Z = W_1 + W_2$ and $X = (W_1 + W_2)^{-1/2}\, W_1\, (W_1 + W_2)^{-1/2}$. Then, (i) $Z$ and $X$ are independently distributed, (ii) $Z \sim B3(p; a+b, c)$, and (iii) $X \sim B1(p; a, b)$.*

*Proof. *Making the transformation $Z = W_1 + W_2$ and $X = (W_1 + W_2)^{-1/2} W_1 (W_1 + W_2)^{-1/2}$ with the Jacobian $J(W_1, W_2 \to X, Z) = \det(Z)^{(p+1)/2}$ in the joint density of $(W_1, W_2)$ given by (6.4), we get the joint density of $X$ and $Z$ as
$$\left[\frac{\det(X)^{a-(p+1)/2}\,\det(I_p - X)^{b-(p+1)/2}}{B_p(a, b)}\right] \left[\frac{2^{p(a+b)}\,\det(Z)^{a+b-(p+1)/2}\,\det(I_p - Z)^{c-(p+1)/2}}{B_p(a+b, c)\,\det(I_p + Z)^{a+b+c}}\right],$$
where $0 < X < I_p$ and $0 < Z < I_p$. From the above factorization, it is easy to see that $X$ and $Z$ are independently distributed. Further, $X \sim B1(p; a, b)$ and $Z \sim B3(p; a+b, c)$.
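A scalar ($p = 1$) Monte Carlo check of part (ii): with independent gamma variables $X_1, X_2, X_3$, the scalar analogue of the construction sets $W_i = X_i/(X_1 + X_2 + 2X_3)$, so $W_1 + W_2 = (X_1 + X_2)/((X_1 + X_2) + 2X_3)$ should behave like a scalar type 3 variable with parameters $(a+b, c)$. The sketch compares means (seed and parameters are arbitrary illustrative choices):

```python
import math
import random

random.seed(3)
a, b, c = 1.5, 2.0, 2.5  # illustrative parameters

# Monte Carlo mean of W1 + W2 under the gamma construction.
n = 200000
acc = 0.0
for _ in range(n):
    x1 = random.gammavariate(a, 1.0)
    x2 = random.gammavariate(b, 1.0)
    x3 = random.gammavariate(c, 1.0)
    acc += (x1 + x2) / (x1 + x2 + 2.0 * x3)
mc_mean = acc / n

# Mean of the scalar beta type 3 density with parameters (a + b, c), midpoint rule.
al, be = a + b, c
B = math.gamma(al) * math.gamma(be) / math.gamma(al + be)
m = 100000
h = 1.0 / m
pdf_mean = 0.0
for i in range(m):
    w = (i + 0.5) * h
    pdf_mean += w * 2.0 ** al * w ** (al - 1) * (1 - w) ** (be - 1) * (1 + w) ** (-(al + be)) / B
pdf_mean *= h

print(abs(mc_mean - pdf_mean) < 0.01)
```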

Using Theorem 6.6, the joint moments of $\det(W_1)$ and $\det(W_2)$ are given by
$$E\!\left[\det(W_1)^{h}\,\det(W_2)^{r}\right] = E\!\left[\det(Z)^{h+r}\right] E\!\left[\det(X)^{h}\,\det(I_p - X)^{r}\right],$$
where $Z \sim B3(p; a+b, c)$ and $X \sim B1(p; a, b)$. Now, computing $E[\det(Z)^{h+r}]$ and $E[\det(X)^{h}\det(I_p - X)^{r}]$ using Corollary 4.3 and (2.20), and simplifying the resulting expression, we obtain
$$E\!\left[\det(W_1)^{h}\,\det(W_2)^{r}\right] = \frac{2^{-pc}\,B_p(a+h, b+r)\,B_p(a+b+h+r, c)}{B_p(a, b)\,B_p(a+b, c)}\; {}_2F_1\!\left(c,\, a+b+c;\, a+b+c+h+r;\, \frac{1}{2} I_p\right).$$

#### Acknowledgment

The research work of D. K. Nagar was supported by the Comité para el Desarrollo de la Investigación, Universidad de Antioquia research Grant no. IN550CE.

#### References

1. N. L. Johnson, S. Kotz, and N. Balakrishnan, *Continuous Univariate Distributions. Vol. 2*, Wiley Series in Probability and Mathematical Statistics: Applied Probability and Statistics, John Wiley & Sons, New York, NY, USA, 2nd edition, 1995.
2. L. Cardeño, D. K. Nagar, and L. E. Sánchez, "Beta type 3 distribution and its multivariate generalization," *Tamsui Oxford Journal of Mathematical Sciences*, vol. 21, no. 2, pp. 225–241, 2005.
3. A. K. Gupta and D. K. Nagar, *Matrix Variate Distributions*, vol. 104 of *Chapman & Hall/CRC Monographs and Surveys in Pure and Applied Mathematics*, Chapman & Hall/CRC, Boca Raton, Fla, USA, 2000.
4. A. K. Gupta and D. K. Nagar, "Matrix-variate beta distribution," *International Journal of Mathematics and Mathematical Sciences*, vol. 24, no. 7, pp. 449–459, 2000.
5. A. G. Constantine, "Some non-central distribution problems in multivariate analysis," *Annals of Mathematical Statistics*, vol. 34, pp. 1270–1285, 1963.
6. A. W. Davis, "Invariant polynomials with two matrix arguments extending the zonal polynomials: applications to multivariate distribution theory," *Annals of the Institute of Statistical Mathematics*, vol. 31, no. 3, pp. 465–485, 1979.
7. A. W. Davis, "Invariant polynomials with two matrix arguments, extending the zonal polynomials," in *Multivariate Analysis V (Proc. Fifth Internat. Sympos., Univ. Pittsburgh, Pittsburgh, Pa., 1978)*, P. R. Krishnaiah, Ed., pp. 287–299, North-Holland, Amsterdam, The Netherlands, 1980.
8. Y. Chikuse, "Distributions of some matrix variates and latent roots in multivariate Behrens-Fisher discriminant analysis," *The Annals of Statistics*, vol. 9, no. 2, pp. 401–407, 1981.
9. D. K. Nagar and A. K. Gupta, "Matrix-variate Kummer-beta distribution," *Journal of the Australian Mathematical Society*, vol. 73, no. 1, pp. 11–25, 2002.
10. A. P. Prudnikov, Yu. A. Brychkov, and O. I. Marichev, *Integrals and Series. Vol. 3. More Special Functions*, Gordon and Breach Science, New York, NY, USA, 1990, translated from the Russian by G. G. Gould.