Abstract

We study several properties of the matrix variate beta type 3 distribution. We also derive the probability density functions of the product of two independent random matrices when one of them is beta type 3 distributed. These densities are expressed in terms of Appell's first hypergeometric function and Humbert's confluent hypergeometric function of matrix arguments. Further, a bimatrix variate generalization of the beta type 3 distribution is defined and studied.

1. Introduction

The beta type 1, beta type 2, and beta type 3 families of distributions are defined by the density functions

$$\{B(a,b)\}^{-1}\, u^{a-1}(1-u)^{b-1}, \quad 0 < u < 1, \qquad (1.1)$$

$$\{B(a,b)\}^{-1}\, v^{a-1}(1+v)^{-(a+b)}, \quad v > 0, \qquad (1.2)$$

$$2^{a}\{B(a,b)\}^{-1}\, w^{a-1}(1-w)^{b-1}(1+w)^{-(a+b)}, \quad 0 < w < 1, \qquad (1.3)$$

respectively, where $a > 0$, $b > 0$, and $B(a,b) = \Gamma(a)\Gamma(b)/\Gamma(a+b)$ denotes the beta function.

2. Preliminary Definitions and Results

The integral representations of the confluent hypergeometric function ${}_1F_1$ and the Gauss hypergeometric function ${}_2F_1$ of $p \times p$ symmetric matrix argument are given by

$${}_1F_1(a; c; X) = \frac{\Gamma_p(c)}{\Gamma_p(a)\Gamma_p(c-a)} \int_{0<R<I_p} \det(R)^{a-(p+1)/2} \det(I_p - R)^{c-a-(p+1)/2} \operatorname{etr}(XR)\, dR, \qquad (2.9)$$

$${}_2F_1(a, b; c; X) = \frac{\Gamma_p(c)}{\Gamma_p(a)\Gamma_p(c-a)} \int_{0<R<I_p} \det(R)^{a-(p+1)/2} \det(I_p - R)^{c-a-(p+1)/2} \det(I_p - XR)^{-b}\, dR, \qquad (2.10)$$

where $\operatorname{Re}(a) > (p-1)/2$ and $\operatorname{Re}(c-a) > (p-1)/2$.
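
For $p = 1$ these reduce to the classical Euler integral representations. The following is a quick numerical sanity check of the scalar case (a sketch only; the parameter values are arbitrary and the snippet is our illustration, not part of the original text):

```python
# Numerical check of the Euler integral representation of 2F1 for p = 1.
from scipy.integrate import quad
from scipy.special import beta, hyp2f1

a, b, c, x = 1.5, 2.0, 4.0, 0.3  # need Re(a) > 0, Re(c - a) > 0, |x| < 1

# (1/B(a, c-a)) * int_0^1 t^(a-1) (1-t)^(c-a-1) (1-x*t)^(-b) dt
integral, _ = quad(lambda t: t**(a - 1) * (1 - t)**(c - a - 1) * (1 - x*t)**(-b), 0, 1)
lhs = hyp2f1(b, a, c, x)          # 2F1 is symmetric in its first two parameters
rhs = integral / beta(a, c - a)

print(lhs, rhs)                   # the two values agree to quadrature accuracy
```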

where $\phi \in \kappa \cdot \lambda$ signifies that the irreducible representation of the general linear group $Gl(p, \mathbb{R})$ indexed by $2\phi$ occurs in the decomposition of the Kronecker product $2\kappa \otimes 2\lambda$ of the irreducible representations indexed by $2\kappa$ and $2\lambda$. Further,

$$C_\phi^{\kappa,\lambda}(X, X) = \theta_\phi^{\kappa,\lambda}\, C_\phi(X), \quad \text{where } \theta_\phi^{\kappa,\lambda} = \frac{C_\phi^{\kappa,\lambda}(I_p, I_p)}{C_\phi(I_p)}. \qquad (2.14)$$

In expressions (2.15) and (2.16), $\Gamma_p(a, \kappa)$ is defined by

$$\Gamma_p(a, \kappa) = (a)_\kappa\, \Gamma_p(a) = \pi^{p(p-1)/4} \prod_{i=1}^{p} \Gamma\left(a + k_i - \frac{i-1}{2}\right), \quad \operatorname{Re}(a) > \frac{p-1}{2} - k_p,$$

where $\kappa = (k_1, \ldots, k_p)$, $k_1 \geq \cdots \geq k_p \geq 0$. Note that $\Gamma_p(a, 0) = \Gamma_p(a)$, which is the multivariate gamma function.
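
The product form of $\Gamma_p(a)$ is straightforward to compute; the small helper below (hypothetical, written only for this illustration) also confirms that $\Gamma_1(a) = \Gamma(a)$:

```python
# Multivariate gamma function via its product formula:
# Gamma_p(a) = pi^{p(p-1)/4} * prod_{i=1}^{p} Gamma(a - (i-1)/2)
import math

def multivariate_gamma(a: float, p: int) -> float:
    """Gamma_p(a), valid for a > (p-1)/2."""
    const = math.pi ** (p * (p - 1) / 4)
    return const * math.prod(math.gamma(a - (i - 1) / 2) for i in range(1, p + 1))

print(multivariate_gamma(2.5, 1), math.gamma(2.5))   # p = 1 recovers Gamma(a)
print(multivariate_gamma(2.5, 3))                     # Gamma_3(2.5)
```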

The matrix variate generalizations of (1.1), (1.2), and (1.3) are given as follows (Gupta and Nagar [3, 4]).

Definition 2.1. A $p \times p$ random symmetric positive definite matrix $U$ is said to have a matrix variate beta type 1 distribution with parameters $(a, b)$, denoted as $U \sim B1(p, a, b)$, if its p.d.f. is given by

$$\{B_p(a, b)\}^{-1} \det(U)^{a-(p+1)/2} \det(I_p - U)^{b-(p+1)/2}, \quad 0 < U < I_p,$$

where $\operatorname{Re}(a) > (p-1)/2$ and $\operatorname{Re}(b) > (p-1)/2$, and $B_p(a, b) = \Gamma_p(a)\Gamma_p(b)/\Gamma_p(a+b)$ is the multivariate beta function.

If $U \sim B1(p, a, b)$, then the cumulative distribution function is given by

$$P(U < T) = \frac{\Gamma_p(a+b)\, \Gamma_p\left(\frac{p+1}{2}\right)}{\Gamma_p(b)\, \Gamma_p\left(a + \frac{p+1}{2}\right)} \det(T)^{a}\, {}_2F_1\left(a, \frac{p+1}{2} - b; a + \frac{p+1}{2}; T\right), \quad 0 < T < I_p.$$

Definition 2.2. A $p \times p$ random symmetric positive definite matrix $V$ is said to have a matrix variate beta type 2 distribution with parameters $(a, b)$, denoted as $V \sim B2(p, a, b)$, if its p.d.f. is given by

$$\{B_p(a, b)\}^{-1} \det(V)^{a-(p+1)/2} \det(I_p + V)^{-(a+b)}, \quad V > 0,$$

where $\operatorname{Re}(a) > (p-1)/2$ and $\operatorname{Re}(b) > (p-1)/2$.

Definition 2.3. A $p \times p$ random symmetric positive definite matrix $W$ is said to have a matrix variate beta type 3 distribution with parameters $(a, b)$, denoted as $W \sim B3(p, a, b)$, if its p.d.f. is given by

$$\frac{2^{pa}}{B_p(a, b)} \det(W)^{a-(p+1)/2} \det(I_p - W)^{b-(p+1)/2} \det(I_p + W)^{-(a+b)}, \quad 0 < W < I_p,$$

where $\operatorname{Re}(a) > (p-1)/2$ and $\operatorname{Re}(b) > (p-1)/2$.
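
For $p = 1$, Definition 2.3 reduces to the density $2^a w^{a-1}(1-w)^{b-1}\{B(a,b)(1+w)^{a+b}\}^{-1}$ on $(0, 1)$. A minimal quadrature check that this density integrates to one (illustrative code, not from the original):

```python
# Check that the scalar (p = 1) beta type 3 density integrates to one.
from scipy.integrate import quad
from scipy.special import beta

a, b = 2.0, 3.5   # arbitrary shape parameters, a, b > 0

def beta3_pdf(w: float) -> float:
    return 2**a * w**(a - 1) * (1 - w)**(b - 1) / (beta(a, b) * (1 + w)**(a + b))

total, _ = quad(beta3_pdf, 0, 1)
print(total)      # ~1.0
```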

3. Hypergeometric Functions of Two Matrices

In this section we define Appell's first hypergeometric function $F_1$ and Humbert's confluent hypergeometric function $\Phi_1$ of symmetric $p \times p$ matrices $X$ and $Y$ and give their series expansions involving invariant polynomials. Following Prudnikov et al. [10, equations 7.2.4(), ()], $F_1$ and $\Phi_1$ are defined as

$$F_1(a; b_1, b_2; c; X, Y) = \frac{\Gamma_p(c)}{\Gamma_p(a)\Gamma_p(c-a)} \int_{0<R<I_p} \det(R)^{a-(p+1)/2} \det(I_p - R)^{c-a-(p+1)/2} \det(I_p - XR)^{-b_1} \det(I_p - YR)^{-b_2}\, dR, \qquad (3.1)$$

$$\Phi_1(a; b; c; X, Y) = \frac{\Gamma_p(c)}{\Gamma_p(a)\Gamma_p(c-a)} \int_{0<R<I_p} \det(R)^{a-(p+1)/2} \det(I_p - R)^{c-a-(p+1)/2} \det(I_p - XR)^{-b} \operatorname{etr}(YR)\, dR, \qquad (3.2)$$

respectively, where $\operatorname{Re}(a) > (p-1)/2$ and $\operatorname{Re}(c-a) > (p-1)/2$. Note that for $p = 1$, (3.1) and (3.2) reduce to the classical $F_1$ and $\Phi_1$ functions, respectively. Expanding $\det(I_p - XR)^{-b_1}$, $\det(I_p - YR)^{-b_2}$, and $\operatorname{etr}(YR)$ in series of zonal polynomials using (2.6) and (2.5), and applying (2.14), one can write the integrands in terms of invariant polynomials, giving (3.3) and (3.4). Now, substituting (3.3) and (3.4) in (3.1) and (3.2), respectively, and integrating term by term using (2.16), the series expansions for $F_1$ and $\Phi_1$ follow.
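
For $p = 1$, (3.1) is Picard's classical integral representation of Appell's $F_1$. The sketch below (our illustration; `appell_f1` is a hypothetical helper) evaluates it by quadrature and confirms the reduction to ${}_2F_1$ when $Y = 0$:

```python
# Scalar (p = 1) Appell F1 via Picard's one-dimensional Euler integral,
# and its reduction to Gauss 2F1 when the second argument is zero.
from scipy.integrate import quad
from scipy.special import beta, hyp2f1

def appell_f1(a, b1, b2, c, x, y):
    """F1(a; b1, b2; c; x, y) for Re(a) > 0, Re(c - a) > 0."""
    integrand = lambda t: (t**(a - 1) * (1 - t)**(c - a - 1)
                           * (1 - x*t)**(-b1) * (1 - y*t)**(-b2))
    val, _ = quad(integrand, 0, 1)
    return val / beta(a, c - a)

print(appell_f1(1.5, 2.0, 0.5, 4.0, 0.3, 0.6))
# With y = 0 the value must equal 2F1(b1, a; c; x):
print(appell_f1(1.5, 2.0, 0.5, 4.0, 0.3, 0.0), hyp2f1(2.0, 1.5, 4.0, 0.3))
```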

4. Properties

In this section we derive several properties of the matrix variate beta type 3 distribution. For the sake of completeness we first state the following results established in Gupta and Nagar [4].

(1) Let $W \sim B3(p, a, b)$ and let $A$ be a $p \times p$ constant nonsingular matrix. Then, the density of $Z = AWA'$ is

$$\frac{2^{pa} \det(AA')^{(p+1)/2}}{B_p(a, b)} \det(Z)^{a-(p+1)/2} \det(AA' - Z)^{b-(p+1)/2} \det(AA' + Z)^{-(a+b)}, \quad 0 < Z < AA'.$$

(2) Let $W \sim B3(p, a, b)$ and let $H$ be a $p \times p$ orthogonal matrix, whose elements are either constants or random variables distributed independently of $W$. Then, the distribution of $W$ is invariant under the transformation $W \to HWH'$, and $HWH'$ is distributed independently of $H$ in the latter case.

(3) Let $W \sim B3(p, a, b)$. Then, the density of $Z = W^{-1}$ is

$$\frac{2^{pa}}{B_p(a, b)} \det(Z - I_p)^{b-(p+1)/2} \det(I_p + Z)^{-(a+b)}, \quad Z > I_p.$$

(4) If $U \sim B1(p, a, b)$, then $U(2I_p - U)^{-1} \sim B3(p, a, b)$ and $(I_p - U)(I_p + U)^{-1} \sim B3(p, b, a)$.

(5) If $V \sim B2(p, a, b)$, then $V(2I_p + V)^{-1} \sim B3(p, a, b)$ and $(I_p + 2V)^{-1} \sim B3(p, b, a)$.

(6) If $W \sim B3(p, a, b)$, then $2W(I_p + W)^{-1} \sim B1(p, a, b)$, $(I_p - W)(I_p + W)^{-1} \sim B1(p, b, a)$, $2W(I_p - W)^{-1} \sim B2(p, a, b)$, and $(I_p - W)(2W)^{-1} \sim B2(p, b, a)$.

(7) Let $W \sim B3(p, a, b)$ be partitioned as $W = \begin{pmatrix} W_{11} & W_{12} \\ W_{21} & W_{22} \end{pmatrix}$, where $W_{11}$ is $q \times q$. Define $W_{11\cdot2} = W_{11} - W_{12}W_{22}^{-1}W_{21}$ and $W_{22\cdot1} = W_{22} - W_{21}W_{11}^{-1}W_{12}$. If $W \sim B3(p, a, b)$, then and .

(8) Let $A$ be a $q \times p$ constant matrix of rank $q$ ($q \leq p$). If $W \sim B3(p, a, b)$, then $(AA')^{1/2}(AW^{-1}A')^{-1}(AA')^{1/2} \sim B3(q, a - (p-q)/2, b)$.

(9) Let $W \sim B3(p, a, b)$ and let $\ell$ ($\neq 0$) be a fixed $p \times 1$ vector; then $\ell'\ell/\ell' W^{-1}\ell \sim B3(1, a - (p-1)/2, b)$. Further, if $\ell$ is a random vector, independent of $W$, with $P(\ell = 0) = 0$, then it follows that $\ell'\ell/\ell' W^{-1}\ell \sim B3(1, a - (p-1)/2, b)$.

From the above results it is straightforward to show that, if $\ell$ ($\neq 0$) is a constant vector or a random vector distributed independently of $W$ with $P(\ell = 0) = 0$, then $\ell'\ell/\ell' W^{-1}\ell \sim B3(1, a - (p-1)/2, b)$. The expectation of $W$, $E(W)$, can easily be obtained from the above results. For any fixed $H \in O(p)$, $E(W) = E(HWH') = H\, E(W)\, H'$, where $O(p)$ denotes the group of $p \times p$ orthogonal matrices. Hence, $H\, E(W) = E(W)\, H$ for all $H \in O(p)$, which implies that

$$E(W) = \delta I_p$$

for some scalar $\delta$. The matrix variate beta type 3 distribution can be derived by using independent gamma matrices. A $p \times p$ random symmetric positive definite matrix $X$ is said to have a matrix variate gamma distribution with parameters $\Sigma$ ($> 0$) and $\alpha$ ($> (p-1)/2$), denoted by $X \sim Ga(p, \alpha, \Sigma)$, if its p.d.f. is given by

$$\{\Gamma_p(\alpha) \det(\Sigma)^{\alpha}\}^{-1} \det(X)^{\alpha-(p+1)/2} \operatorname{etr}(-\Sigma^{-1} X), \quad X > 0.$$

It is well known that if $X$ and $Y$ are independent, $X \sim Ga(p, a, \Sigma)$, $Y \sim Ga(p, b, \Sigma)$, then (i) $X + Y$ and $(X+Y)^{-1/2} X (X+Y)^{-1/2}$ are independent and (ii) $X + Y$ and $Y^{-1/2} X Y^{-1/2}$ are independent. Further, $X + Y \sim Ga(p, a+b, \Sigma)$, $(X+Y)^{-1/2} X (X+Y)^{-1/2} \sim B1(p, a, b)$, and $Y^{-1/2} X Y^{-1/2} \sim B2(p, a, b)$. In the following theorem we derive a similar result for the matrix variate beta type 3 distribution.
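
Assuming the parametrization above, matrix variate gamma draws can be obtained from a Wishart sampler, since the density of $Ga(p, \alpha, \Sigma)$ matches that of $W_p(2\alpha, \Sigma/2)$ term by term. A minimal sketch (the helper name is ours), which is also convenient for checking the next theorem numerically:

```python
# Sampling the matrix variate gamma Ga_p(alpha, Sigma) via scipy's Wishart:
# the density above coincides with that of W_p(2*alpha, Sigma/2).
import numpy as np
from scipy.stats import wishart

def sample_matrix_gamma(alpha, Sigma, rng):
    p = Sigma.shape[0]
    assert 2 * alpha > p - 1, "need alpha > (p-1)/2"
    return wishart.rvs(df=2 * alpha, scale=Sigma / 2, random_state=rng)

rng = np.random.default_rng(0)
Sigma = np.eye(2)
X = sample_matrix_gamma(alpha=3.0, Sigma=Sigma, rng=rng)
print(X)                       # one 2x2 positive definite draw
```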

Theorem 4.1. Let the random matrices $X$ and $Y$ be independent, $X \sim Ga(p, a, I_p)$, $Y \sim Ga(p, b, I_p)$. Then, $(X + 2Y)^{-1/2} X (X + 2Y)^{-1/2} \sim B3(p, a, b)$.

Proof. The joint density function of $X$ and $Y$ is given by

$$\{\Gamma_p(a)\Gamma_p(b)\}^{-1} \det(X)^{a-(p+1)/2} \det(Y)^{b-(p+1)/2} \operatorname{etr}\{-(X+Y)\}, \quad X > 0,\ Y > 0.$$

Making the transformation $W = (X + 2Y)^{-1/2} X (X + 2Y)^{-1/2}$ and $S = X + 2Y$ with the Jacobian $J(X, Y \to W, S) = 2^{-p(p+1)/2} \det(S)^{(p+1)/2}$ in the joint density of $X$ and $Y$, we obtain the joint density of $W$ and $S$ as

$$\frac{2^{-pb}}{\Gamma_p(a)\Gamma_p(b)} \det(W)^{a-(p+1)/2} \det(I_p - W)^{b-(p+1)/2} \det(S)^{a+b-(p+1)/2} \operatorname{etr}\left\{-\frac{1}{2}(I_p + W)S\right\},$$

where $0 < W < I_p$ and $S > 0$. Now, the desired result is obtained by integrating $S$ using (2.1).
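
A Monte Carlo illustration of Theorem 4.1 in the scalar case $p = 1$, where the construction reduces to $W = X/(X + 2Y)$ (a sketch under our reconstruction of the statement; the parameter values are arbitrary):

```python
# Monte Carlo check of Theorem 4.1 for p = 1: if X ~ Gamma(a), Y ~ Gamma(b)
# independent, W = X/(X + 2Y) should follow the density
# 2^a w^(a-1)(1-w)^(b-1) / (B(a,b)(1+w)^(a+b)) on (0, 1).
import numpy as np
from scipy.integrate import quad
from scipy.special import beta

a, b, n = 2.0, 3.0, 200_000
rng = np.random.default_rng(1)
x, y = rng.gamma(a, size=n), rng.gamma(b, size=n)
w = x / (x + 2 * y)

pdf = lambda t: 2**a * t**(a - 1) * (1 - t)**(b - 1) / (beta(a, b) * (1 + t)**(a + b))
mean_theory, _ = quad(lambda t: t * pdf(t), 0, 1)
print(w.mean(), mean_theory)   # the two means should agree closely
```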

Next, we derive the cumulative distribution function (cdf) and several expected values of functions of a beta type 3 matrix.

If $W \sim B3(p, a, b)$, then the cdf of $W$, denoted by $F(T) = P(W < T)$, $0 < T < I_p$, is given by

$$F(T) = \frac{2^{pa}}{B_p(a, b)} \int_{0<W<T} \det(W)^{a-(p+1)/2} \det(I_p - W)^{b-(p+1)/2} \det(I_p + W)^{-(a+b)}\, dW.$$

Now, expanding the integrand and integrating using (2.19), the cdf is obtained in explicit series form.

Theorem 4.2. Let $W \sim B3(p, a, b)$, then

$$E[\det(W)^{h} \det(I_p - W)^{h'}] = \frac{\Gamma_p(a+b)\, \Gamma_p(a+h)\, \Gamma_p(b+h')}{2^{pb}\, \Gamma_p(a)\, \Gamma_p(b)\, \Gamma_p(a+b+h+h')}\, {}_2F_1\left(a + b, b + h'; a + b + h + h'; \frac{1}{2} I_p\right),$$

where $\operatorname{Re}(a + h) > (p-1)/2$ and $\operatorname{Re}(b + h') > (p-1)/2$.

Proof. By definition,

$$E[\det(W)^{h} \det(I_p - W)^{h'}] = \frac{2^{pa}}{B_p(a, b)} \int_{0<W<I_p} \det(W)^{a+h-(p+1)/2} \det(I_p - W)^{b+h'-(p+1)/2} \det(I_p + W)^{-(a+b)}\, dW.$$

Writing $\det(I_p + W)^{-(a+b)} = 2^{-p(a+b)} \det\left(I_p - \frac{1}{2}(I_p - W)\right)^{-(a+b)}$ and substituting $T = I_p - W$, we have

$$E[\det(W)^{h} \det(I_p - W)^{h'}] = \frac{2^{-pb}}{B_p(a, b)} \int_{0<T<I_p} \det(T)^{b+h'-(p+1)/2} \det(I_p - T)^{a+h-(p+1)/2} \det\left(I_p - \frac{T}{2}\right)^{-(a+b)} dT = \frac{\Gamma_p(a+b)\, \Gamma_p(a+h)\, \Gamma_p(b+h')}{2^{pb}\, \Gamma_p(a)\, \Gamma_p(b)\, \Gamma_p(a+b+h+h')}\, {}_2F_1\left(a+b, b+h'; a+b+h+h'; \frac{1}{2} I_p\right),$$

where the integral has been evaluated using the integral representation of the Gauss hypergeometric function given in (2.10).
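
For $p = 1$ the formula of Theorem 4.2 can be checked directly by quadrature (an illustrative sketch under our reconstruction of the statement):

```python
# Scalar (p = 1) check of the moment formula of Theorem 4.2:
# quadrature of E[W^h (1-W)^h'] against the 2F1 expression.
from scipy.integrate import quad
from scipy.special import beta, gamma as G, hyp2f1

a, b, h, hp = 2.0, 3.5, 1.7, 0.6
pdf = lambda w: 2**a * w**(a-1) * (1-w)**(b-1) / (beta(a, b) * (1+w)**(a+b))

direct, _ = quad(lambda w: w**h * (1-w)**hp * pdf(w), 0, 1)
formula = (G(a+b) * G(a+h) * G(b+hp)
           / (2**b * G(a) * G(b) * G(a+b+h+hp))
           * hyp2f1(a+b, b+hp, a+b+h+hp, 0.5))
print(direct, formula)   # should agree to quadrature accuracy
```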

Corollary 4.3. Let $W \sim B3(p, a, b)$, then for $\operatorname{Re}(a + h) > (p-1)/2$, one has

$$E[\det(W)^{h}] = \frac{\Gamma_p(a+b)\, \Gamma_p(a+h)}{2^{pb}\, \Gamma_p(a)\, \Gamma_p(a+b+h)}\, {}_2F_1\left(a+b, b; a+b+h; \frac{1}{2} I_p\right).$$

Further, for $\operatorname{Re}(b + h) > (p-1)/2$,

$$E[\det(I_p - W)^{h}] = \frac{\Gamma_p(a+b)\, \Gamma_p(b+h)}{2^{pb}\, \Gamma_p(b)\, \Gamma_p(a+b+h)}\, {}_2F_1\left(a+b, b+h; a+b+h; \frac{1}{2} I_p\right).$$

From the density of $W$, the expected value of the zonal polynomial $C_\kappa(W)$ can be written as an integral over $0 < W < I_p$. Now, expanding $\det(I_p + W)^{-(a+b)}$ in a series involving zonal polynomials using (2.6), the above expression is rewritten as a double series in zonal polynomials. Further, writing $C_\kappa(-W) = (-1)^k C_\kappa(W)$ and integrating using (2.15), we get the desired expectation.

5. Distributions of Random Quadratic Forms

In this section we obtain distributional results for the product of two independent random matrices when one of them is distributed as matrix variate beta type 3.

Theorem 5.1. Let and be independent. Then, the p.d.f. of is

Proof. Using the independence, the joint p.d.f. of and is given by where , and Transforming , with the Jacobian we obtain the joint p.d.f. of and as where . To find the marginal p.d.f. of , we integrate (5.4) with respect to to get In (5.5) change of variable with the Jacobian yields where the last step has been obtained by using the definition of . Finally, substituting for we obtain the desired result.
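
The exact form of the density in Theorem 5.1 involves hypergeometric functions; independently of that closed form, the scalar case can be sanity-checked by comparing Monte Carlo draws of the product with the generic product-density integral. The pairing of distributions below ($W$ beta type 3, $U$ beta type 1) is our reading of the theorem's hypotheses and should be treated as an assumption:

```python
# Scalar (p = 1) sanity check for a product Z = U * W with
# W ~ beta type 3(a, b) and U ~ beta type 1(c, d) independent:
# Monte Carlo mean of Z vs. mean from f_Z(z) = int_z^1 f_U(u) f_W(z/u) du / u.
import numpy as np
from scipy.integrate import quad
from scipy.special import beta

a, b, c, d = 2.0, 3.0, 2.5, 1.5
f_w = lambda w: 2**a * w**(a-1) * (1-w)**(b-1) / (beta(a, b) * (1+w)**(a+b))
f_u = lambda u: u**(c-1) * (1-u)**(d-1) / beta(c, d)

rng = np.random.default_rng(2)
x, y = rng.gamma(a, size=100_000), rng.gamma(b, size=100_000)
w = x / (x + 2*y)                      # beta type 3 draws (Theorem 4.1)
u = rng.beta(c, d, size=100_000)       # beta type 1 draws
z = u * w

f_z = lambda z0: quad(lambda t: f_u(t) * f_w(z0/t) / t, z0, 1)[0]
mean_theory, _ = quad(lambda z0: z0 * f_z(z0), 0, 1)
print(z.mean(), mean_theory)
```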

Corollary 5.2. Let and be independent random matrices, and . If , then the p.d.f. of is given by

Theorem 5.3. Let and be independent random matrices, and . Then, the p.d.f. of is given by

Proof. Since and are independent, their joint p.d.f. is given by where , , and Now consider the transformation and whose Jacobian is . Thus, we obtain the joint p.d.f. of and as where and . Finally, integrating using (3.1) and substituting for , we obtain the desired result.

In the next theorem we derive the density of the product of two independent random matrices, one distributed as matrix variate beta type 3 and the other as matrix variate gamma.

Theorem 5.4. Let the random matrices and be independent, and . Then, the p.d.f. of is given by where

Proof. The joint p.d.f. of and is given by where and . Now, transforming and , with the Jacobian , we obtain the joint p.d.f. of and as where and . Now, integrating using (3.2), we get the marginal density of .

6. Bimatrix Beta Type 3 Distribution

The bimatrix generalization of the beta type 1 density is defined by

$$C(a_1, a_2, b)\, \det(U_1)^{a_1-(p+1)/2} \det(U_2)^{a_2-(p+1)/2} \det(I_p - U_1 - U_2)^{b-(p+1)/2}, \qquad (6.1)$$

where $U_1 > 0$, $U_2 > 0$, $I_p - U_1 - U_2 > 0$, $\operatorname{Re}(a_1) > (p-1)/2$, $\operatorname{Re}(a_2) > (p-1)/2$, $\operatorname{Re}(b) > (p-1)/2$, and

$$C(a_1, a_2, b) = \frac{\Gamma_p(a_1 + a_2 + b)}{\Gamma_p(a_1)\, \Gamma_p(a_2)\, \Gamma_p(b)}. \qquad (6.2)$$

This distribution, denoted by $(U_1, U_2) \sim B1(p; a_1, a_2; b)$, is a special case of the matrix variate Dirichlet type 1 distribution. The random symmetric positive definite matrices $V_1$ and $V_2$ are said to have a bimatrix variate generalization of the beta type 2 distribution, denoted as $(V_1, V_2) \sim B2(p; a_1, a_2; b)$, if their joint p.d.f. is given by

$$C(a_1, a_2, b)\, \det(V_1)^{a_1-(p+1)/2} \det(V_2)^{a_2-(p+1)/2} \det(I_p + V_1 + V_2)^{-(a_1+a_2+b)}, \qquad (6.3)$$

where $V_1 > 0$, $V_2 > 0$, $\operatorname{Re}(a_1) > (p-1)/2$, $\operatorname{Re}(a_2) > (p-1)/2$, and $\operatorname{Re}(b) > (p-1)/2$.

A natural bimatrix generalization of the beta type 3 distribution can be given as follows.

Definition 6.1. The $p \times p$ symmetric positive definite random matrices $W_1$ and $W_2$ are said to have a bimatrix beta type 3 distribution, denoted as $(W_1, W_2) \sim B3(p; a_1, a_2; b)$, if their joint p.d.f. is given by

$$2^{p(a_1+a_2)}\, C(a_1, a_2, b)\, \det(W_1)^{a_1-(p+1)/2} \det(W_2)^{a_2-(p+1)/2} \det(I_p - W_1 - W_2)^{b-(p+1)/2} \det(I_p + W_1 + W_2)^{-(a_1+a_2+b)}, \qquad (6.4)$$

where $W_1 > 0$, $W_2 > 0$, $I_p - W_1 - W_2 > 0$, $\operatorname{Re}(a_1) > (p-1)/2$, $\operatorname{Re}(a_2) > (p-1)/2$, and $\operatorname{Re}(b) > (p-1)/2$.

The bimatrix beta type 3 distribution belongs to the Liouville family of distributions and can be obtained using independent gamma matrices as shown in the following theorem.

Theorem 6.2. Let $X_1$, $X_2$, and $X_3$ be independent, $X_i \sim Ga(p, a_i, I_p)$, $i = 1, 2$, and $X_3 \sim Ga(p, b, I_p)$. Define

$$W_i = (X_1 + X_2 + 2X_3)^{-1/2}\, X_i\, (X_1 + X_2 + 2X_3)^{-1/2}, \quad i = 1, 2.$$

Then, $(W_1, W_2) \sim B3(p; a_1, a_2; b)$.

Proof. Similar to the proof of Theorem 4.1.
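
A scalar ($p = 1$) Monte Carlo illustration of Theorem 6.2, comparing a simulated joint moment with the same moment computed from the density (6.4) (a sketch under our reconstruction; the parameters are arbitrary):

```python
# Scalar check of Theorem 6.2: with X1 ~ Gamma(a1), X2 ~ Gamma(a2),
# X3 ~ Gamma(b) independent and Wi = Xi/(X1 + X2 + 2*X3), the pair
# (W1, W2) should follow the bivariate beta type 3 density (6.4).
import numpy as np
from scipy.integrate import dblquad
from scipy.special import gamma as G

a1, a2, b, n = 2.0, 1.5, 3.0, 200_000
rng = np.random.default_rng(3)
x1, x2, x3 = rng.gamma(a1, size=n), rng.gamma(a2, size=n), rng.gamma(b, size=n)
s = x1 + x2 + 2*x3
w1, w2 = x1/s, x2/s

c = 2**(a1 + a2) * G(a1 + a2 + b) / (G(a1) * G(a2) * G(b))
pdf = lambda u, v: c * u**(a1-1) * v**(a2-1) * (1-u-v)**(b-1) * (1+u+v)**(-(a1+a2+b))

# E(W1*W2) by simulation vs. by integrating over the triangle u + v < 1
m_mc = np.mean(w1 * w2)
m_th, _ = dblquad(lambda v, u: u*v*pdf(u, v), 0, 1, 0, lambda u: 1 - u)
print(m_mc, m_th)
```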

The next two theorems derive the bimatrix beta type 3 distribution from the bimatrix beta type 1 and type 2 distributions.

Theorem 6.3. Let $(U_1, U_2) \sim B1(p; a_1, a_2; b)$ and define

$$W_i = (2I_p - U_1 - U_2)^{-1/2}\, U_i\, (2I_p - U_1 - U_2)^{-1/2}, \quad i = 1, 2. \qquad (6.5)$$

Then, $(W_1, W_2) \sim B3(p; a_1, a_2; b)$.

Proof. Let $S = U_1 + U_2$ and $R = W_1 + W_2$. Then, $R = (2I_p - S)^{-1/2} S (2I_p - S)^{-1/2}$, so that $2I_p - S = 2(I_p + R)^{-1}$ and the inverse of (6.5) is $U_i = 2(I_p + R)^{-1/2} W_i (I_p + R)^{-1/2}$, $i = 1, 2$. The Jacobian of the transformation (6.5) is given by

$$J(U_1, U_2 \to W_1, W_2) = 2^{p(p+1)} \det(I_p + R)^{-3(p+1)/2}.$$

Now, substituting $U_1$, $U_2$, and the Jacobian in the joint density of $U_1$ and $U_2$ given in (6.1), we get the desired result.

Theorem 6.4. Let $(V_1, V_2) \sim B2(p; a_1, a_2; b)$ and define

$$W_i = (2I_p + V_1 + V_2)^{-1/2}\, V_i\, (2I_p + V_1 + V_2)^{-1/2}, \quad i = 1, 2. \qquad (6.7)$$

Then, $(W_1, W_2) \sim B3(p; a_1, a_2; b)$.

Proof. Let $S = V_1 + V_2$ and $R = W_1 + W_2$. Then, $R = (2I_p + S)^{-1/2} S (2I_p + S)^{-1/2}$, so that $2I_p + S = 2(I_p - R)^{-1}$ and the inverse of (6.7) is $V_i = 2(I_p - R)^{-1/2} W_i (I_p - R)^{-1/2}$, $i = 1, 2$. The Jacobian of the transformation (6.7) is given by

$$J(V_1, V_2 \to W_1, W_2) = 2^{p(p+1)} \det(I_p - R)^{-3(p+1)/2}.$$

Now, substitution of $V_1$, $V_2$, along with the Jacobian, in the joint density of $V_1$ and $V_2$ given in (6.3) yields the desired result.
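
The transformations of Theorems 6.3 and 6.4 are easy to exercise numerically in the scalar case: starting from Dirichlet (bimatrix beta type 1 with $p = 1$) draws, the mapping $W_i = U_i/(2 - U_1 - U_2)$ should reproduce the gamma-based construction of Theorem 6.2 (an illustrative sketch):

```python
# Scalar check of Theorem 6.3: Dirichlet draws mapped by
# Wi = Ui/(2 - U1 - U2) vs. the direct gamma construction of Theorem 6.2.
import numpy as np

rng = np.random.default_rng(4)
a1, a2, b, n = 2.0, 1.5, 3.0, 200_000

# Route 1: Dirichlet draws, then the Theorem 6.3 mapping.
d = rng.dirichlet([a1, a2, b], size=n)
u1, u2 = d[:, 0], d[:, 1]
w1, w2 = u1/(2 - u1 - u2), u2/(2 - u1 - u2)

# Route 2: direct gamma construction of Theorem 6.2.
x1, x2, x3 = rng.gamma(a1, size=n), rng.gamma(a2, size=n), rng.gamma(b, size=n)
s = x1 + x2 + 2*x3
v1, v2 = x1/s, x2/s

print(np.mean(w1), np.mean(v1))       # matching first moments of W1
print(np.mean(w1*w2), np.mean(v1*v2)) # matching joint moments
```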

The marginal distribution of $W_1$, when the random matrices $W_1$ and $W_2$ follow a bimatrix beta type 3 distribution, is given next.

Theorem 6.5. Let $(W_1, W_2) \sim B3(p; a_1, a_2; b)$. Then, the marginal p.d.f. of $W_1$ is given by

$$\frac{2^{p(a_1+a_2)}\, \Gamma_p(a_1 + a_2 + b)}{\Gamma_p(a_1)\, \Gamma_p(a_2 + b)} \det(W_1)^{a_1-(p+1)/2} \det(I_p - W_1)^{a_2+b-(p+1)/2} \det(I_p + W_1)^{-(a_1+a_2+b)}\, {}_2F_1\left(a_2, a_1 + a_2 + b; a_2 + b; -(I_p - W_1)^{1/2}(I_p + W_1)^{-1}(I_p - W_1)^{1/2}\right), \qquad (6.9)$$

where $0 < W_1 < I_p$. Further, the marginal p.d.f. of $W_2$ follows by interchanging $a_1$ and $a_2$.

Proof. Substituting $W_2 = (I_p - W_1)^{1/2} T (I_p - W_1)^{1/2}$ with the Jacobian $J(W_2 \to T) = \det(I_p - W_1)^{(p+1)/2}$ in (6.4), the joint density of $W_1$ and $T$ is derived as

$$2^{p(a_1+a_2)}\, C(a_1, a_2, b)\, \det(W_1)^{a_1-(p+1)/2} \det(I_p - W_1)^{a_2+b-(p+1)/2} \det(T)^{a_2-(p+1)/2} \det(I_p - T)^{b-(p+1)/2} \det\left(I_p + W_1 + (I_p - W_1)^{1/2} T (I_p - W_1)^{1/2}\right)^{-(a_1+a_2+b)}, \qquad (6.10)$$

where $0 < W_1 < I_p$ and $0 < T < I_p$. Now, integration of the above expression with respect to $T$ yields the marginal density of $W_1$ as

$$2^{p(a_1+a_2)}\, C(a_1, a_2, b)\, \det(W_1)^{a_1-(p+1)/2} \det(I_p - W_1)^{a_2+b-(p+1)/2} \det(I_p + W_1)^{-(a_1+a_2+b)} \int_{0<T<I_p} \det(T)^{a_2-(p+1)/2} \det(I_p - T)^{b-(p+1)/2} \det(I_p + \Delta T)^{-(a_1+a_2+b)}\, dT, \qquad (6.11)$$

where $\Delta = (I_p - W_1)^{1/2}(I_p + W_1)^{-1}(I_p - W_1)^{1/2}$. Now, by evaluating the above integral using the integral representation (2.10) of the Gauss hypergeometric function, we obtain

$$\int_{0<T<I_p} \det(T)^{a_2-(p+1)/2} \det(I_p - T)^{b-(p+1)/2} \det(I_p + \Delta T)^{-(a_1+a_2+b)}\, dT = \frac{\Gamma_p(a_2)\, \Gamma_p(b)}{\Gamma_p(a_2 + b)}\, {}_2F_1\left(a_2, a_1 + a_2 + b; a_2 + b; -\Delta\right). \qquad (6.12)$$

Finally, substituting (6.12) in (6.11) and simplifying the resulting expression, we obtain the desired result.

Using the result

$${}_2F_1(\alpha, \beta; \gamma; Z) = \det(I_p - Z)^{-\beta}\, {}_2F_1\left(\gamma - \alpha, \beta; \gamma; -Z(I_p - Z)^{-1}\right),$$

the Gauss hypergeometric function given in (6.9) can be rewritten as

$${}_2F_1\left(a_2, a_1 + a_2 + b; a_2 + b; -\Delta\right) = 2^{-p(a_1+a_2+b)} \det(I_p + W_1)^{a_1+a_2+b}\, {}_2F_1\left(b, a_1 + a_2 + b; a_2 + b; \frac{1}{2}(I_p - W_1)\right),$$

where $\Delta = (I_p - W_1)^{1/2}(I_p + W_1)^{-1}(I_p - W_1)^{1/2}$. Hence, the density of $W_1$ can also be written as

$$\frac{2^{-pb}\, \Gamma_p(a_1 + a_2 + b)}{\Gamma_p(a_1)\, \Gamma_p(a_2 + b)} \det(W_1)^{a_1-(p+1)/2} \det(I_p - W_1)^{a_2+b-(p+1)/2}\, {}_2F_1\left(b, a_1 + a_2 + b; a_2 + b; \frac{1}{2}(I_p - W_1)\right).$$

It can clearly be observed that the p.d.f. in (6.9) is not a beta type 3 density; it differs by a factor involving the Gauss hypergeometric function. In the next theorem we give the distribution of the sum of random matrices distributed jointly as bimatrix beta type 3.
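
Before turning to that result, here is a scalar ($p = 1$) numerical check of Theorem 6.5 and the rewritten form above (a sketch under our reconstruction): the marginal obtained by integrating $W_2$ out of (6.4) is compared with the closed form involving ${}_2F_1$.

```python
# Scalar (p = 1) check of Theorem 6.5: numeric marginal of W1 from the
# joint density (6.4) vs. the closed form with 2F1 derived above.
from scipy.integrate import quad
from scipy.special import gamma as G, hyp2f1

a1, a2, b = 2.0, 1.5, 3.0
c = 2**(a1 + a2) * G(a1 + a2 + b) / (G(a1) * G(a2) * G(b))
joint = lambda u, v: c * u**(a1-1) * v**(a2-1) * (1-u-v)**(b-1) * (1+u+v)**(-(a1+a2+b))

def marg_numeric(u):
    return quad(lambda v: joint(u, v), 0, 1 - u)[0]

def marg_formula(u):
    return (2**(-b) * G(a1+a2+b) / (G(a1) * G(a2+b))
            * u**(a1-1) * (1-u)**(a2+b-1)
            * hyp2f1(b, a1+a2+b, a2+b, (1-u)/2))

for u in (0.2, 0.5, 0.8):
    print(u, marg_numeric(u), marg_formula(u))   # columns 2 and 3 agree
```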

Theorem 6.6. Let $(W_1, W_2) \sim B3(p; a_1, a_2; b)$. Define $R = W_1 + W_2$ and $T = R^{-1/2} W_1 R^{-1/2}$. Then, (i) $R$ and $T$ are independently distributed, (ii) $R \sim B3(p, a_1 + a_2, b)$, and (iii) $T \sim B1(p, a_1, a_2)$.

Proof. Making the transformation $R = W_1 + W_2$ and $T = R^{-1/2} W_1 R^{-1/2}$ with the Jacobian $J(W_1, W_2 \to R, T) = \det(R)^{(p+1)/2}$ in the joint density of $(W_1, W_2)$ given by (6.4), we get the joint density of $R$ and $T$ as

$$\frac{2^{p(a_1+a_2)}\, \Gamma_p(a_1 + a_2 + b)}{\Gamma_p(a_1 + a_2)\, \Gamma_p(b)} \det(R)^{a_1+a_2-(p+1)/2} \det(I_p - R)^{b-(p+1)/2} \det(I_p + R)^{-(a_1+a_2+b)} \times \frac{\Gamma_p(a_1 + a_2)}{\Gamma_p(a_1)\, \Gamma_p(a_2)} \det(T)^{a_1-(p+1)/2} \det(I_p - T)^{a_2-(p+1)/2},$$

where $0 < R < I_p$ and $0 < T < I_p$. From the above factorization, it is easy to see that $R$ and $T$ are independently distributed. Further, $R \sim B3(p, a_1 + a_2, b)$ and $T \sim B1(p, a_1, a_2)$.
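
A scalar ($p = 1$) Monte Carlo illustration of Theorem 6.6, using the gamma construction of Theorem 6.2 (a sketch; a near-zero correlation is consistent with, though not a proof of, independence):

```python
# Scalar check of Theorem 6.6: with (W1, W2) bivariate beta type 3,
# R = W1 + W2 and T = W1/(W1 + W2) should be independent, with
# R ~ beta type 3(a1 + a2, b) and T ~ beta type 1(a1, a2).
import numpy as np

a1, a2, b, n = 2.0, 1.5, 3.0, 200_000
rng = np.random.default_rng(5)
x1, x2, x3 = rng.gamma(a1, size=n), rng.gamma(a2, size=n), rng.gamma(b, size=n)
s = x1 + x2 + 2*x3
w1, w2 = x1/s, x2/s

r, t = w1 + w2, w1/(w1 + w2)
print(np.corrcoef(r, t)[0, 1])    # near zero, consistent with independence
print(t.mean(), a1/(a1 + a2))     # beta type 1 mean a1/(a1 + a2)
```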

Using Theorem 6.6, the joint moments of $\det(W_1)$ and $\det(W_2)$ are given by

$$E[\det(W_1)^{h_1} \det(W_2)^{h_2}] = E[\det(R)^{h_1+h_2}]\, E[\det(T)^{h_1} \det(I_p - T)^{h_2}],$$

where $R \sim B3(p, a_1 + a_2, b)$ and $T \sim B1(p, a_1, a_2)$. Now, computing $E[\det(R)^{h_1+h_2}]$ and $E[\det(T)^{h_1} \det(I_p - T)^{h_2}]$ using Corollary 4.3 and (2.20), and simplifying the resulting expression, we obtain

$$E[\det(W_1)^{h_1} \det(W_2)^{h_2}] = \frac{\Gamma_p(a_1 + a_2 + b)\, \Gamma_p(a_1 + h_1)\, \Gamma_p(a_2 + h_2)}{2^{pb}\, \Gamma_p(a_1)\, \Gamma_p(a_2)\, \Gamma_p(a_1 + a_2 + b + h_1 + h_2)}\, {}_2F_1\left(a_1 + a_2 + b, b; a_1 + a_2 + b + h_1 + h_2; \frac{1}{2} I_p\right).$$

Acknowledgment

The research work of D. K. Nagar was supported by the Comité para el Desarrollo de la Investigación, Universidad de Antioquia, research grant no. IN550CE.