Journal of Applied Mathematics

Volume 2014 (2014), Article ID 173836, 6 pages

http://dx.doi.org/10.1155/2014/173836

## On the Stochastic Restricted $r$-$k$ Class Estimator and Stochastic Restricted $r$-$d$ Class Estimator in Linear Regression Model

Jibo Wu^{1,2}

^{1}School of Mathematics and Finances, Chongqing University of Arts and Sciences, Chongqing 402160, China

^{2}Department of Mathematics and KLDAIP, Chongqing University of Arts and Sciences, Chongqing 402160, China

Received 26 November 2013; Accepted 13 December 2013; Published 2 January 2014

Academic Editor: Ram N. Mohapatra

Copyright © 2014 Jibo Wu. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

#### Abstract

The stochastic restricted $r$-$k$ class estimator and the stochastic restricted $r$-$d$ class estimator are proposed for the vector of parameters in a multiple linear regression model with stochastic linear restrictions. The mean squared error matrices of the proposed estimators are derived and compared, and some properties of the proposed estimators are also discussed. Finally, a numerical example is given to illustrate some of the theoretical results.

#### 1. Introduction

The problem of multicollinearity, or an ill-conditioned design matrix, in the linear regression model is very well known in statistics. In order to overcome this problem, different remedies have been introduced. One of the most important estimation methods is to consider biased estimators, such as the principal component regression (PCR) estimator [1], the ordinary ridge estimator (ORE) by Hoerl and Kennard [2], the $r$-$k$ class estimator [3], the Liu estimator (LE) by Liu [4], the $r$-$d$ class estimator [5], the modified Liu-type estimator [6], and the principal component Liu-type estimator [7].

An alternative method to deal with the multicollinearity problem is to consider parameter estimation with some restrictions on the unknown parameters, which may be exact or stochastic restrictions [8]. When additional stochastic restrictions on the parameter vector are supposed to hold, Durbin [9], Theil and Goldberger [10], and Theil [11] proposed the ordinary mixed estimator (OME). By grafting the ordinary ridge estimator and the LE into the mixed estimation procedure, Li and Yang [12] and Hubert and Wijekoon [13] introduced the stochastic restricted ridge estimator (SRRE) and the stochastic restricted Liu estimator (SRLE), respectively, and Liu et al. [14] proposed the weighted mixed almost unbiased ridge estimator in the linear regression model.

In this paper, in order to overcome multicollinearity, we introduce a stochastic restricted $r$-$k$ class estimator and a stochastic restricted $r$-$d$ class estimator for the vector of parameters in a linear regression model when additional stochastic linear restrictions are assumed to hold. The performance of the proposed estimators with respect to the mean squared error matrix (MSEM) criterion is discussed.

The rest of the paper is organized as follows. The model specifications and the new estimators are introduced in Section 2. The superiority of the proposed estimators is then discussed in Section 3, and a numerical example is given to illustrate the behavior of the estimators in Section 4. Finally, some concluding remarks are given in Section 5.

#### 2. Model Specifications and the Estimators

Consider the linear regression model
$$y = X\beta + \varepsilon, \quad (1)$$
where $y$ is an $n \times 1$ vector of observations, $X$ is an $n \times p$ known design matrix of rank $p$, $\beta$ is a $p \times 1$ vector of unknown parameters, and $\varepsilon$ is an $n \times 1$ vector of disturbances with expectation $E(\varepsilon) = 0$ and variance-covariance matrix $\operatorname{Cov}(\varepsilon) = \sigma^2 I_n$.

For the unrestricted model given by (1), the ORE proposed by Hoerl and Kennard [2] and the LE presented by Liu [4] are defined as follows:
$$\hat{\beta}(k) = (S + kI_p)^{-1}X'y, \quad k > 0, \qquad \hat{\beta}(d) = (S + I_p)^{-1}(X'y + d\hat{\beta}), \quad 0 < d < 1,$$
where $S = X'X$ and $\hat{\beta} = S^{-1}X'y$ is the ordinary least squares (OLS) estimator of $\beta$.
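As a concrete illustration of these standard definitions, the OLS, ridge, and Liu estimators can be sketched in a few lines of NumPy. The data-generating step below is an invented toy example with nearly collinear columns; only the estimator formulas come from the definitions above.

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented toy data: three nearly collinear predictors.
n, p = 50, 3
z = rng.normal(size=(n, 1))
X = np.hstack([z + 0.01 * rng.normal(size=(n, 1)) for _ in range(p)])
beta_true = np.array([1.0, 2.0, 3.0])
y = X @ beta_true + rng.normal(scale=0.5, size=n)

S = X.T @ X                               # S = X'X
beta_ols = np.linalg.solve(S, X.T @ y)    # OLS estimator of beta

def ridge(k):
    # Ordinary ridge estimator (ORE): (S + k I)^{-1} X'y, k > 0
    return np.linalg.solve(S + k * np.eye(p), X.T @ y)

def liu(d):
    # Liu estimator (LE): (S + I)^{-1} (X'y + d * beta_ols), 0 < d < 1
    return np.linalg.solve(S + np.eye(p), X.T @ y + d * beta_ols)

print(ridge(0.1))
print(liu(0.5))
```

Setting $k = 0$ or $d = 1$ recovers the OLS estimator, which gives a quick sanity check on the implementation.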

Now let us consider the spectral decomposition of the matrix $X'X$ given as
$$X'X = (T_r, T_{p-r}) \begin{pmatrix} \Lambda_r & 0 \\ 0 & \Lambda_{p-r} \end{pmatrix} (T_r, T_{p-r})',$$
where $\Lambda_r$ and $\Lambda_{p-r}$ are diagonal matrices such that the main diagonal elements of $\Lambda_r$ are the $r$ largest eigenvalues of $X'X$, while those of $\Lambda_{p-r}$ are the $p-r$ remaining eigenvalues. The matrix $T = (T_r, T_{p-r})$ is orthogonal, with $T_r$ consisting of its first $r$ columns and $T_{p-r}$ consisting of the remaining $p-r$ columns. The PCR estimator for $\beta$ can be written as
$$\hat{\beta}_r = T_r(T_r'X'XT_r)^{-1}T_r'X'y.$$

The $r$-$k$ class estimator proposed by Baye and Parker [3] and the $r$-$d$ class estimator proposed by Kaçıranlar and Sakallıoğlu [5] are defined as
$$\hat{\beta}_r(k) = T_r(T_r'X'XT_r + kI_r)^{-1}T_r'X'y, \qquad \hat{\beta}_r(d) = T_r(T_r'X'XT_r + I_r)^{-1}(T_r'X'y + dT_r'\hat{\beta}_r).$$

Following Xu and Yang [15], the $r$-$k$ class estimator and the $r$-$d$ class estimator can be rewritten as follows:
$$\hat{\beta}_r(k) = T_rT_r'\hat{\beta}(k), \qquad \hat{\beta}_r(d) = T_rT_r'\hat{\beta}(d).$$
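The spectral-decomposition construction and the Xu–Yang rewriting can be probed numerically. The sketch below uses an arbitrary simulated design (the sizes $n$, $p$, $r$ and the value of $k$ are illustrative choices, not taken from the paper) and builds the PCR, $r$-$k$, and $r$-$d$ class estimators from the eigendecomposition of $X'X$:

```python
import numpy as np

rng = np.random.default_rng(1)
n, p, r = 40, 4, 2
X = rng.normal(size=(n, p))
y = rng.normal(size=n)
S = X.T @ X

# Spectral decomposition S = T diag(lam) T', eigenvalues in decreasing order.
lam, T = np.linalg.eigh(S)
order = np.argsort(lam)[::-1]
lam, T = lam[order], T[:, order]
Tr = T[:, :r]                     # T_r: first r eigenvectors

# PCR estimator: T_r (T_r' S T_r)^{-1} T_r' X'y
beta_pcr = Tr @ np.linalg.solve(Tr.T @ S @ Tr, Tr.T @ X.T @ y)

def rk_class(k):
    # r-k class estimator (Baye and Parker)
    return Tr @ np.linalg.solve(Tr.T @ S @ Tr + k * np.eye(r), Tr.T @ X.T @ y)

def rd_class(d):
    # r-d class estimator (Kaciranlar and Sakallioglu), via the PCR estimator
    return Tr @ np.linalg.solve(Tr.T @ S @ Tr + np.eye(r),
                                Tr.T @ X.T @ y + d * (Tr.T @ beta_pcr))

# Xu-Yang-type rewriting: the r-k class estimator is T_r T_r' applied to the ORE.
k = 0.3
ore = np.linalg.solve(S + k * np.eye(p), X.T @ y)
print(np.allclose(rk_class(k), Tr @ Tr.T @ ore))
```

Since $T_r'X'XT_r = \Lambda_r$ and $(X'X + kI)^{-1} = T(\Lambda + kI)^{-1}T'$, the projection $T_rT_r'$ of the ridge estimator collapses exactly to the $r$-$k$ form, which the final check confirms in floating point.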

In addition to the model (1), let us be given some prior information about $\beta$ in the form of a set of $q$ independent stochastic linear restrictions as follows:
$$r = R\beta + e, \quad (8)$$
where $R$ is a $q \times p$ known matrix of rank $q$, $e$ is a $q \times 1$ vector of disturbances with mean $0$ and dispersion matrix $\sigma^2 W$, $W$ is supposed to be known and positive definite, and the vector $r$ can be interpreted as a random variable with expectation $E(r) = R\beta$. Therefore the restriction (8) does not hold exactly but in the mean, and we suppose $r$ to be known, that is, to be the realized value of the random vector, so that all the expectations are conditional on $r$ [8]. In the following discussions, we do not mention this separately. Furthermore, it is also supposed that the random vector $e$ is stochastically independent of $\varepsilon$.

For the restricted model specified by (1) and (8), the stochastic restricted ridge estimator (SRRE) proposed by Li and Yang [12] and the stochastic restricted Liu estimator (SRLE) proposed by Hubert and Wijekoon [13] are defined as
$$\hat{\beta}_{\mathrm{SRRE}}(k) = (S + kI_p)^{-1}S\hat{\beta}_m, \qquad \hat{\beta}_{\mathrm{SRLE}}(d) = (S + I_p)^{-1}(S + dI_p)\hat{\beta}_m,$$
where $\hat{\beta}_m = (S + R'W^{-1}R)^{-1}(X'y + R'W^{-1}r)$ is the well-known ordinary mixed estimator (OME) of $\beta$.
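The OME admits a simple plug-in computation. The sketch below assumes the standard Theil–Goldberger expression $\hat{\beta}_m = (X'X + R'W^{-1}R)^{-1}(X'y + R'W^{-1}r)$; the restriction data ($R$, $W$, $r$) and the design are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)
n, p, q = 30, 3, 2
X = rng.normal(size=(n, p))
beta_true = np.array([1.0, -1.0, 0.5])
y = X @ beta_true + rng.normal(scale=0.2, size=n)

# Invented stochastic restrictions r = R beta + e, e ~ (0, sigma^2 W).
R = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, -1.0]])
W = np.eye(q)                                  # known positive definite
r = R @ beta_true + rng.normal(scale=0.1, size=q)

S = X.T @ X
Winv = np.linalg.inv(W)

# Ordinary mixed estimator (Theil-Goldberger form):
# beta_m = (X'X + R'W^{-1}R)^{-1} (X'y + R'W^{-1}r)
beta_m = np.linalg.solve(S + R.T @ Winv @ R, X.T @ y + R.T @ Winv @ r)
print(beta_m)
```

With $W = I$, the OME coincides with least squares on the model augmented by the restriction rows, which gives an independent way to verify the formula.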

We are now ready to propose a new stochastic restricted $r$-$k$ class estimator, defined by combining the OME and the $r$-$k$ class estimator, and a new stochastic restricted $r$-$d$ class estimator, defined by combining the OME and the $r$-$d$ class estimator, as follows:

From (11), we can see that (1) when , we may conclude that and ; (2) when , we may conclude that and .

At the end of this section, we will list some lemmas which are needed in the following proofs.

Lemma 1 (see [8]). *Assume that the square matrices $A$ and $D$ are nonsingular and that $B$ and $C$ are matrices of proper orders; then $(A + BDC)^{-1} = A^{-1} - A^{-1}B(D^{-1} + CA^{-1}B)^{-1}CA^{-1}$.*

Lemma 2 (see [16]). *Let $A$ be an $n \times n$ nonnegative definite matrix, namely, $A \ge 0$, and let $a$ be an $n \times 1$ vector; then $A - aa' \ge 0$ if and only if $a \in \mathcal{R}(A)$, $a'A^{-}a \le 1$, where $A^{-}$ is any generalized inverse of $A$ and $\mathcal{R}(A)$ denotes the column space of $A$.*
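Rank-one ordering results of the kind cited from [16] are easy to probe numerically. The following sketch (stated in the standard form of the lemma, with a random positive definite $A$, so that $a \in \mathcal{R}(A)$ holds automatically) checks that nonnegative definiteness of $A - aa'$ agrees with the condition $a'A^{-1}a \le 1$:

```python
import numpy as np

rng = np.random.default_rng(5)

def nnd(M, tol=1e-10):
    # Nonnegative definiteness, judged by the smallest eigenvalue
    # of the symmetrized matrix.
    return np.min(np.linalg.eigvalsh((M + M.T) / 2)) >= -tol

# Random positive definite A; a = A x lies in the column space of A.
B = rng.normal(size=(4, 4))
A = B @ B.T + np.eye(4)
a = A @ rng.normal(size=4)

lhs = nnd(A - np.outer(a, a))             # A - aa' >= 0 ?
rhs = a @ np.linalg.solve(A, a) <= 1      # a'A^{-1}a <= 1 ?
print(lhs, rhs)
```

The two booleans agree, in either direction, for any positive definite $A$ and any vector $a$.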

Lemma 3 (see [17]). *Let the matrices $A > 0$ and $B \ge 0$ be of the same order; then $A - B \ge 0$ if and only if $\lambda_1(BA^{-1}) \le 1$, where $\lambda_1(BA^{-1})$ is the largest eigenvalue of the matrix $BA^{-1}$.*

In the next section, we will compare the biased estimators under the MSEM criterion.

#### 3. The Superiority of the Proposed Estimators

The mean squared error matrix (MSEM) of an estimator $\tilde{\beta}$ of $\beta$ is defined as
$$\mathrm{MSEM}(\tilde{\beta}) = D(\tilde{\beta}) + b(\tilde{\beta})b(\tilde{\beta})',$$
where $D(\tilde{\beta})$ is the dispersion matrix and $b(\tilde{\beta}) = E(\tilde{\beta}) - \beta$ is the bias vector. For two given estimators $\tilde{\beta}_1$ and $\tilde{\beta}_2$ of $\beta$, the estimator $\tilde{\beta}_2$ is said to be superior to the estimator $\tilde{\beta}_1$ in the matrix MSE criterion if and only if
$$\mathrm{MSEM}(\tilde{\beta}_1) - \mathrm{MSEM}(\tilde{\beta}_2) \ge 0.$$

Note that since the MSEM criterion is stronger than the scalar mean squared error (MSE) criterion, we only consider the MSEM comparisons among the estimators.
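For a linear estimator $\tilde{\beta} = Ay$ with $\operatorname{Cov}(y) = \sigma^2 I$, the MSEM reduces to $\sigma^2 AA' + bb'$ with $b = (AX - I)\beta$. The sketch below (toy data; $\beta$ and $\sigma^2$ are treated as known only for illustration) computes MSEM matrices and the matrix difference used in such comparisons:

```python
import numpy as np

rng = np.random.default_rng(3)
n, p = 25, 3
X = rng.normal(size=(n, p))
beta = np.array([0.5, 1.0, -0.5])   # treated as known, for illustration only
sigma2 = 1.0
S = X.T @ X

def msem_linear(A):
    # For a linear estimator beta_tilde = A y with Cov(y) = sigma^2 I:
    # dispersion D = sigma^2 A A', bias b = (A X - I) beta, MSEM = D + b b'.
    b = (A @ X - np.eye(p)) @ beta
    return sigma2 * (A @ A.T) + np.outer(b, b)

A_ols = np.linalg.solve(S, X.T)                      # OLS (unbiased)
A_ore = np.linalg.solve(S + 0.5 * np.eye(p), X.T)    # ORE with k = 0.5

# The comparison criterion: the second estimator is superior iff this
# difference is a nonnegative definite matrix.
diff = msem_linear(A_ols) - msem_linear(A_ore)
print(np.linalg.eigvalsh(diff))
```

For the unbiased OLS estimator the bias term vanishes and the MSEM is simply the dispersion matrix $\sigma^2(X'X)^{-1}$, which gives a convenient consistency check.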

##### 3.1. MSEM Comparisons of the $r$-$k$ Class Estimator and the Stochastic Restricted (SR*rk*) Class Estimator

In this subsection, we consider the MSEM comparison between the $r$-$k$ class estimator and the stochastic restricted (SR*rk*) class estimator.

Firstly, we can compute the bias vector and the dispersion matrix of the stochastic restricted (SR*rk*) class estimator as follows:
where .

Therefore, the MSEM of the stochastic restricted $r$-$k$ class estimator is given by

From (6), we can compute the bias vector and the dispersion matrix of the $r$-$k$ class estimator as follows:

Therefore, the MSEM of the $r$-$k$ class estimator is given by

Now we consider the following difference of the MSEMs:

Theorem 4. *The stochastic restricted $r$-$k$ class estimator always dominates the $r$-$k$ class estimator in the MSEM criterion.*

*Proof.* By Lemma 1, we can obtain ; then we obtain . Therefore, from (18), we may conclude that ; that is to say, the stochastic restricted $r$-$k$ class estimator always dominates the $r$-$k$ class estimator in the MSEM criterion.

##### 3.2. MSEM Comparisons of the Stochastic Restricted Ridge Estimator (SRRE) and the Stochastic Restricted (SR*rk*) Class Estimator

In this subsection, we consider the MSEM comparison between the stochastic restricted ridge estimator (SRRE) and the stochastic restricted (SR*rk*) class estimator.

Firstly, from (9), we can compute the bias vector and the variance of stochastic restricted ridge estimator (SRRE) as follows: Then, we can obtain the MSEM of the SRRE as follows: Now let us consider the following difference: where , and .

Theorem 5. *When and , the stochastic restricted $r$-$k$ class estimator is superior to the stochastic restricted ridge estimator in the MSEM criterion if and only if , .*

*Proof.* For , it is easy to see that and . So we can conclude that .

On the other hand, can be rewritten as
From (22), it is easy to see that . Thus, by Lemma 2, we can obtain that when . So from (21) and applying Lemma 3, we have if and only if , . This theorem is proved.

##### 3.3. MSEM Comparisons of the $r$-$d$ Class Estimator and the Stochastic Restricted (SR*rd*) Class Estimator

In this subsection, we consider the MSEM comparison between the $r$-$d$ class estimator and the stochastic restricted (SR*rd*) class estimator.

Firstly, we can compute the bias vector and the dispersion matrix of the stochastic restricted (SR*rd*) class estimator as follows:

Therefore, we can obtain the MSEM of the stochastic restricted $r$-$d$ class estimator as follows:

From (6), we can compute the bias vector and the dispersion matrix of the $r$-$d$ class estimator as follows: Then, we can obtain the MSEM of the $r$-$d$ class estimator as follows: Now we consider the following difference:

Theorem 6. *The stochastic restricted $r$-$d$ class estimator always dominates the $r$-$d$ class estimator in the MSEM criterion.*

*Proof.* By Lemma 1, we can obtain ; then we obtain . Therefore, from (27), we may conclude that ; that is to say, the stochastic restricted $r$-$d$ class estimator always dominates the $r$-$d$ class estimator in the MSEM criterion.

##### 3.4. MSEM Comparisons of the Stochastic Restricted Liu Estimator (SRLE) and the Stochastic Restricted (SR*rd*) Class Estimator

In this subsection, we consider the MSEM comparison between the stochastic restricted Liu estimator (SRLE) and the stochastic restricted (SR*rd*) class estimator.

Firstly, from (10), we can compute the bias vector and the variance of stochastic restricted Liu estimator (SRLE) as follows:

Therefore, we can obtain the MSEM of the stochastic restricted Liu estimator (SRLE) as follows:

Now we consider the following difference: where , and .

Theorem 7. *When and , the stochastic restricted $r$-$d$ class estimator is superior to the stochastic restricted Liu estimator in the MSEM criterion if and only if , .*

*Proof.* For , it is easy to see that and . So we can conclude that .

On the other hand, can be rewritten as
From (31), it is easy to see that . Thus, by Lemma 2, we can obtain that when . So from (30) and applying Lemma 3, we have if and only if , . This theorem is proved.

*Remark*. For practical use, we may replace the unknown parameters in the theorems with their appropriate estimators.

#### 4. Numerical Example

In order to illustrate our theoretical results, we consider in this section the data set on Total National Research and Development Expenditure as a Percent of Gross National Product, originally due to Gruber [18] and later considered by Akdeniz and Erol [19]. We use the same data, which are presented in Table 1.

Firstly, we obtain the ordinary least squares estimator of : with and . Consider the following stochastic linear restrictions:

For the $r$-$k$ class estimator (*rk*), the stochastic restricted ridge estimator (SRRE), and the stochastic restricted $r$-$k$ class estimator (SR*rk*), the estimated mean squared error values are given in Table 2. For the $r$-$d$ class estimator (*rd*), the stochastic restricted Liu estimator (SRLE), and the stochastic restricted $r$-$d$ class estimator (SR*rd*), the estimated mean squared error values are given in Table 3. These estimated values are obtained by replacing all unknown model parameters in the corresponding theoretical MSE expressions by their OLS estimators.
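The plug-in procedure just described can be sketched as follows. The design matrix here is a synthetic stand-in (Gruber's data in Table 1 are not reproduced in this excerpt), and the scalar MSE is taken as the trace of the plug-in MSEM with $\beta$ and $\sigma^2$ replaced by their OLS estimates:

```python
import numpy as np

rng = np.random.default_rng(4)
n, p = 10, 4
X = rng.normal(size=(n, p))   # synthetic stand-in for the Table 1 data
y = rng.normal(size=n)
S = X.T @ X

beta_hat = np.linalg.solve(S, X.T @ y)
resid = y - X @ beta_hat
sigma2_hat = resid @ resid / (n - p)      # plug-in estimate of sigma^2

def est_mse(A):
    # Estimated scalar MSE of beta_tilde = A y: the trace of the plug-in
    # MSEM, with beta and sigma^2 replaced by their OLS estimates.
    b = (A @ X - np.eye(p)) @ beta_hat
    return float(sigma2_hat * np.trace(A @ A.T) + b @ b)

for k in (0.0, 0.1, 0.5, 1.0):
    A = np.linalg.solve(S + k * np.eye(p), X.T)   # ridge-type estimator
    print(k, est_mse(A))
```

Tabulating such values over a grid of $k$ (or $d$) for each competing estimator is how tables in the style of Tables 2 and 3 are produced.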

From Table 2, we can see that the stochastic restricted $r$-$k$ class estimator is always better than the $r$-$k$ class estimator. In fact, the stochastic restricted $r$-$k$ class estimator uses more information about the unknown parameter vector, so it is better than the $r$-$k$ class estimator. When $k$ is small, the stochastic restricted $r$-$k$ class estimator is superior to the stochastic restricted ridge estimator; however, when $k$ becomes large, the stochastic restricted ridge estimator is superior to the stochastic restricted $r$-$k$ class estimator.

From Table 3, we can find that the stochastic restricted $r$-$d$ class estimator is always better than the $r$-$d$ class estimator. When $d$ is large, the stochastic restricted $r$-$d$ class estimator is superior to the stochastic restricted Liu estimator; however, when $d$ becomes small, the stochastic restricted Liu estimator is superior to the stochastic restricted $r$-$d$ class estimator.

#### 5. Conclusion

In this paper, the stochastic restricted $r$-$k$ class estimator (SR*rk*) and the stochastic restricted $r$-$d$ class estimator (SR*rd*) have been proposed, and some properties of the two estimators have also been discussed. In particular, the SR*rk* estimator is shown to dominate the *rk* class estimator, and the SR*rd* estimator to dominate the *rd* class estimator, in the MSEM criterion, while they dominate the SRRE and the SRLE, respectively, under certain conditions.

#### Conflict of Interests

The author declares that there is no conflict of interests regarding the publication of this paper.

#### Acknowledgments

The author is grateful to the editor and the anonymous referees for their valuable comments which improved the quality of the paper. This work was supported by the Scientific Research Foundation of Chongqing University of Arts and Sciences (Grant no. R2013SC12), the National Natural Science Foundation of China (no. 11201505), and the Program for Innovation Team Building at Institutions of Higher Education in Chongqing (Grant no. KJTD201321).

#### References

- W. F. Massy, “Principal components regression in exploratory statistical research,” *Journal of the American Statistical Association*, vol. 60, no. 309, pp. 234–266, 1965.
- A. E. Hoerl and R. W. Kennard, “Ridge regression: biased estimation for nonorthogonal problems,” *Technometrics*, vol. 12, no. 1, pp. 55–67, 1970.
- M. R. Baye and D. F. Parker, “Combining ridge and principal component regression: a money demand illustration,” *Communications in Statistics A*, vol. 13, no. 2, pp. 197–205, 1984.
- K. J. Liu, “A new class of biased estimate in linear regression,” *Communications in Statistics—Theory and Methods*, vol. 22, no. 2, pp. 393–402, 1993.
- S. Kaçıranlar and S. Sakallıoğlu, “Combining the Liu estimator and the principal component regression estimator,” *Communications in Statistics—Theory and Methods*, vol. 30, no. 12, pp. 2699–2705, 2001.
- M. I. Alheety and B. M. G. Kibria, “Modified Liu-type estimator based on ($r$-$k$) class estimator,” *Communications in Statistics—Theory and Methods*, vol. 42, no. 2, pp. 304–319, 2013.
- J. B. Wu, “On the performance of principal component Liu-type estimator under the mean square error criterion,” *Journal of Applied Mathematics*, vol. 2013, Article ID 858794, 7 pages, 2013.
- C. R. Rao and H. Toutenburg, *Linear Models: Least Squares and Alternatives*, Springer Series in Statistics, Springer, New York, NY, USA, 1995.
- J. Durbin, “A note on regression when there is extraneous information about one of the coefficients,” *Journal of the American Statistical Association*, vol. 48, no. 264, pp. 799–808, 1953.
- H. Theil and A. S. Goldberger, “On pure and mixed statistical estimation in economics,” *International Economic Review*, vol. 2, no. 1, pp. 65–78, 1961.
- H. Theil, “On the use of incomplete prior information in regression analysis,” *Journal of the American Statistical Association*, vol. 58, pp. 401–414, 1963.
- Y. L. Li and H. Yang, “A new stochastic mixed ridge estimator in linear regression model,” *Statistical Papers*, vol. 51, no. 2, pp. 315–323, 2010.
- M. H. Hubert and P. Wijekoon, “Improvement of the Liu estimator in linear regression model,” *Statistical Papers*, vol. 47, no. 3, pp. 471–479, 2006.
- C. L. Liu, H. Yang, and J. B. Wu, “On the weighted mixed almost unbiased ridge estimator in stochastic restricted linear regression,” *Journal of Applied Mathematics*, vol. 2013, Article ID 902715, 10 pages, 2013.
- J. W. Xu and H. Yang, “On the restricted $r$-$k$ class estimator and the restricted $r$-$d$ class estimator in linear regression,” *Journal of Statistical Computation and Simulation*, vol. 81, no. 6, pp. 679–691, 2011.
- J. K. Baksalary and R. Kala, “Partial orderings between matrices one of which is of rank one,” *Bulletin of the Polish Academy of Sciences: Mathematics*, vol. 31, no. 1-2, pp. 5–7, 1983.
- S. G. Wang, M. X. Wu, and Z. Z. Jia, *The Inequalities of Matrices*, The Education of Anhui Press, Hefei, China, 2006.
- M. H. J. Gruber, *Improving Efficiency by Shrinkage: The James-Stein and Ridge Regression Estimators*, vol. 156 of *Statistics: Textbooks and Monographs*, Marcel Dekker, New York, NY, USA, 1998.
- F. Akdeniz and H. Erol, “Mean squared error matrix comparisons of some biased estimators in linear regression,” *Communications in Statistics—Theory and Methods*, vol. 32, no. 12, pp. 2389–2413, 2003.