Journal of Applied Mathematics
Volume 2014 (2014), Article ID 654949, 6 pages
http://dx.doi.org/10.1155/2014/654949
Research Article

Comparison of Some Estimators under the Pitman’s Closeness Criterion in Linear Regression Model

Jibo Wu

Department of Mathematics & KLDAIP, Chongqing University of Arts and Sciences, Chongqing 402160, China

Received 16 February 2014; Accepted 6 April 2014; Published 23 April 2014

Academic Editor: Renat Zhdanov

Copyright © 2014 Jibo Wu. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

Batah et al. (2009) combined the unbiased ridge estimator and the principal components regression estimator and introduced the modified $r$-$k$ class estimator. They also showed that the modified $r$-$k$ class estimator is superior to the ordinary least squares estimator and the principal components regression estimator in the mean squared error matrix sense. In this paper, we first give a new method to obtain the modified $r$-$k$ class estimator; we then discuss its properties in some detail, comparing the modified $r$-$k$ class estimator with the ordinary least squares estimator and the principal components regression estimator under the Pitman closeness criterion. A numerical example and a simulation study are given to illustrate our findings.

1. Introduction

Consider the following multiple linear regression model:

$$y = X\beta + e, \quad (1)$$

where $y$ is an $n \times 1$ vector of observations, $X$ is an $n \times p$ known matrix of rank $p$, $\beta$ is a $p \times 1$ vector of unknown parameters, and $e$ is an $n \times 1$ vector of disturbances with expectation $E(e) = 0$ and variance-covariance matrix $\operatorname{Cov}(e) = \sigma^2 I_n$.

The ordinary least squares estimator (OLSE) of $\beta$ is given as follows:

$$\hat{\beta} = (X'X)^{-1}X'y. \quad (2)$$

The OLSE is no longer a good estimator in the presence of multicollinearity, so, in order to reduce the multicollinearity, many remedial actions have been proposed. One popular method is to consider biased estimators. The best known biased estimator is the ridge estimator introduced by Hoerl and Kennard [1]:

$$\hat{\beta}(k) = (X'X + kI_p)^{-1}X'y, \quad k > 0. \quad (3)$$

As is well known, $\hat{\beta}(k)$ approaches $0$ as $k \to \infty$; it is a stable but biased estimator of $\beta$.
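To make these estimators concrete, here is a minimal Python sketch (not part of the original paper) that computes the OLSE and the ridge estimator on simulated data; the sample size, the design matrix, and the ridge parameter $k$ are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Arbitrary simulated data: n observations, p regressors.
n, p = 50, 4
X = rng.standard_normal((n, p))
beta_true = np.array([1.0, 2.0, -1.0, 0.5])
y = X @ beta_true + rng.standard_normal(n)

XtX = X.T @ X

# OLSE: (X'X)^{-1} X'y, computed via a linear solve for numerical stability.
beta_ols = np.linalg.solve(XtX, X.T @ y)

# Ridge estimator of Hoerl and Kennard: (X'X + kI)^{-1} X'y.
k = 0.5  # arbitrary illustrative ridge parameter
beta_ridge = np.linalg.solve(XtX + k * np.eye(p), X.T @ y)
```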

Crouse et al. [2] proposed the unbiased ridge estimator as a convex combination of prior information with the OLSE, which is given as follows:

$$\hat{\beta}(k, J) = (X'X + kI_p)^{-1}(X'y + kJ), \quad (4)$$

where $J$ is a random vector with $E(J) = \beta$ and $\operatorname{Cov}(J) = (\sigma^2/k)I_p$. Özkale and Kaçiranlar [3] used two different ways to derive the unbiased ridge estimator, and they also compared the unbiased ridge estimator with the OLSE, the principal components regression estimator, the ridge estimator, and the $r$-$k$ class estimator under the mean squared error matrix criterion.
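Continuing the sketch above, the unbiased ridge estimator can be formed once a prior random vector $J$ with $E(J) = \beta$ and $\operatorname{Cov}(J) = (\sigma^2/k)I$ is available; in this hypothetical illustration $J$ is drawn around the true $\beta$, which is only possible in a simulation.

```python
# Unbiased ridge estimator of Crouse et al.: (X'X + kI)^{-1} (X'y + kJ),
# where J has mean beta and covariance (sigma^2 / k) I.
sigma2 = 1.0  # variance of the simulated disturbances
J = beta_true + rng.standard_normal(p) * np.sqrt(sigma2 / k)
beta_ure = np.linalg.solve(XtX + k * np.eye(p), X.T @ y + k * J)
```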

Another popular way to combat multicollinearity is the principal components regression (PCR) estimator [4]. For this, let us consider the spectral decomposition of the matrix $X'X$ given as

$$X'X = T\Lambda T' = (T_r, T_{p-r}) \begin{pmatrix} \Lambda_r & 0 \\ 0 & \Lambda_{p-r} \end{pmatrix} (T_r, T_{p-r})', \quad (5)$$

where $\Lambda_r$ and $\Lambda_{p-r}$ are diagonal matrices such that the main diagonal elements of the matrix $\Lambda_r$ are the $r$ largest eigenvalues of $X'X$, while $\Lambda_{p-r}$ contains the remaining $p - r$ eigenvalues. The matrix $T = (T_r, T_{p-r})$ is orthogonal, with $T_r$ consisting of its first $r$ columns and $T_{p-r}$ consisting of the remaining $p - r$ columns of the matrix $T$. The PCR estimator for $\beta$ can be written as

$$\hat{\beta}_r = T_r(T_r'X'XT_r)^{-1}T_r'X'y = T_r\Lambda_r^{-1}T_r'X'y. \quad (6)$$
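A minimal sketch of the PCR estimator under the notation above: the eigendecomposition of $X'X$ supplies $T_r$ and $\Lambda_r$, and the number of retained components $r$ is an arbitrary illustrative choice.

```python
# Spectral decomposition X'X = T Lambda T'.  numpy returns eigenvalues in
# ascending order, so reverse to put the largest eigenvalues first.
eigvals, T = np.linalg.eigh(XtX)
order = np.argsort(eigvals)[::-1]
eigvals, T = eigvals[order], T[:, order]

r = 2                 # arbitrary number of retained components
T_r = T[:, :r]        # first r eigenvectors
Lambda_r = np.diag(eigvals[:r])

# PCR estimator: T_r (T_r' X'X T_r)^{-1} T_r' X'y = T_r Lambda_r^{-1} T_r' X'y.
beta_pcr = T_r @ np.linalg.solve(T_r.T @ XtX @ T_r, T_r.T @ X.T @ y)
```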

Baye and Parker [5] introduced the $r$-$k$ class estimator, which is given as follows:

$$\hat{\beta}_r(k) = T_r(T_r'X'XT_r + kI_r)^{-1}T_r'X'y, \quad k \ge 0. \quad (7)$$

Batah et al. [6] combined the PCR estimator and the unbiased ridge estimator and proposed the modified $r$-$k$ class estimator:

$$\hat{\beta}_r(k, J) = T_r(T_r'X'XT_r + kI_r)^{-1}(T_r'X'y + kJ), \quad (8)$$

where $J$ is a random vector with $E(J) = T_r'\beta$ and $\operatorname{Cov}(J) = (\sigma^2/k)I_r$. The modified $r$-$k$ class estimator has the following properties:

$$E\big(\hat{\beta}_r(k, J)\big) = T_rT_r'\beta, \quad (9)$$

$$\operatorname{Cov}\big(\hat{\beta}_r(k, J)\big) = \sigma^2 T_r(T_r'X'XT_r + kI_r)^{-1}T_r'. \quad (10)$$
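Under the same notation as the sketches above, the $r$-$k$ class estimator and the modified $r$-$k$ class estimator can be computed as follows; the prior vector J_r plays the role of $J$ in (8), with mean $T_r'\beta$, and is again drawn around the truth for illustration only.

```python
# r-k class estimator of Baye and Parker:
# T_r (T_r' X'X T_r + k I_r)^{-1} T_r' X'y.
A = T_r.T @ XtX @ T_r + k * np.eye(r)
beta_rk = T_r @ np.linalg.solve(A, T_r.T @ X.T @ y)

# Modified r-k class estimator of Batah et al.:
# T_r (T_r' X'X T_r + k I_r)^{-1} (T_r' X'y + k J_r),
# with E(J_r) = T_r' beta and Cov(J_r) = (sigma^2 / k) I_r.
J_r = T_r.T @ beta_true + rng.standard_normal(r) * np.sqrt(sigma2 / k)
beta_mrk = T_r @ np.linalg.solve(A, T_r.T @ X.T @ y + k * J_r)
```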

Batah et al. [6] also compared the modified $r$-$k$ class estimator with the OLSE, the PCR estimator, and the $r$-$k$ class estimator in the sense of the mean squared error matrix and obtained necessary and sufficient conditions for the modified $r$-$k$ class estimator to be superior to the OLSE and the PCR estimator.

Though the mean squared error matrix has been regarded as the primary criterion for comparing different estimators, the Pitman [7] closeness (PC) criterion has received a great deal of attention in recent years. Rao [8] discussed the similarities and differences of the mean squared error and PC criteria and aroused great interest in PC. The monograph by Keating et al. [9] provides an illuminating account of PC and a long list of publications on comparisons of estimators of scalar functions of univariate parameters [10]. Since then, many authors have used PC to compare estimators: Wencheko [11] compared some estimators under the PC criterion in the linear regression model, Yang et al. [10] compared two linear estimators under the PC criterion, Ahmadi and Balakrishnan [12, 13] compared some order statistics under the PC criterion, and Jozani [14] studied PC under the balanced loss function.

Since, in many cases, the PC criterion is more suitable for comparing estimators, in this paper we first give a new method to obtain the modified $r$-$k$ class estimator; we then compare the modified $r$-$k$ class estimator with the OLSE and the PCR estimator and obtain conditions under which the modified $r$-$k$ class estimator is superior to the OLSE and the PCR estimator in the PC criterion.

The rest of the paper is organized as follows. In Section 2, we give a new method to obtain the modified $r$-$k$ class estimator, and the comparison results are given in Section 3. In Sections 4 and 5 we give a numerical example and a simulation study, respectively, to illustrate the behaviour of the estimators. Finally, some concluding remarks are given in Section 6.

2. The Modified $r$-$k$ Class Estimator

The handling of multicollinearity by means of PCR corresponds to the transition from the model (1) to the reduced model $y = XT_r\alpha_r + e$, where $\alpha_r = T_r'\beta$, obtained by omitting the term $XT_{p-r}T_{p-r}'\beta$.

We suppose that there are stochastic linear restrictions on the parameter $\beta$ of the form

$$d = R\beta + e_0, \quad (11)$$

where $R$ is a $q \times p$ matrix of rank $q$, $d$ is a $q \times 1$ vector, and $e_0$ is a $q \times 1$ vector of disturbances with mean $0$ and variance-covariance matrix $\sigma^2 W$; $W$ is assumed to be known and positive definite. Furthermore, it is also supposed that the random vector $e_0$ is stochastically independent of $e$.

Now, let us consider the restriction (11) under the idea of the PCR: the original restriction (11) becomes

$$d = R_rT_r'\beta + e_0, \quad (12)$$

where $R_r = RT_r$. Then, Wu and Yang [15] introduced the following estimator:

$$\hat{\beta}_r(R) = T_r(T_r'X'XT_r + R_r'W^{-1}R_r)^{-1}(T_r'X'y + R_r'W^{-1}d). \quad (13)$$

Let $J = T_r'\beta + e_0$ be a random vector. The expectation and covariance of $J$ are given as

$$E(J) = T_r'\beta, \qquad \operatorname{Cov}(J) = \sigma^2W. \quad (14)$$

Now, we let $R = T_r'$, $d = J$, and $W = (1/k)I_r$, so that $R_r = RT_r = I_r$ and $\operatorname{Cov}(J) = (\sigma^2/k)I_r$; then (13) equals the modified $r$-$k$ class estimator; that is,

$$\hat{\beta}_r(R) = T_r(T_r'X'XT_r + kI_r)^{-1}(T_r'X'y + kJ) = \hat{\beta}_r(k, J). \quad (15)$$
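As a quick numerical sanity check of this derivation (a sketch under the notation of the earlier code, not the paper's own code), one can verify that the stochastic restricted PCR estimator (13) with $R = T_r'$, $d = J_r$, and $W = (1/k)I_r$ coincides with the modified $r$-$k$ class estimator (8):

```python
# Stochastic restricted PCR estimator (13) with R = T_r', d = J_r,
# W = (1/k) I_r, so that R_r = R T_r = I_r and W^{-1} = k I_r.
R_r = np.eye(r)
W_inv = k * np.eye(r)
beta_srpcr = T_r @ np.linalg.solve(
    T_r.T @ XtX @ T_r + R_r.T @ W_inv @ R_r,
    T_r.T @ X.T @ y + R_r.T @ W_inv @ J_r,
)

# Agrees (up to floating point) with the modified r-k class estimator above.
assert np.allclose(beta_srpcr, beta_mrk)
```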

In the next section, we will compare the modified $r$-$k$ class estimator with the OLSE and the PCR estimator under the PC criterion.

3. Superiority of the Modified $r$-$k$ Class Estimator under the PC Criterion

First, we give the definitions of the PC and the PC criterion.

Definition 1. Let $\hat{\theta}_1$ and $\hat{\theta}_2$ be two estimators of the unknown $p$-dimensional vector $\theta$. The PC of $\hat{\theta}_1$ relative to $\hat{\theta}_2$ in estimating $\theta$ under a loss function $L$ is defined as $\operatorname{PC}(\hat{\theta}_1, \hat{\theta}_2, \theta) = P\{L(\hat{\theta}_1, \theta) < L(\hat{\theta}_2, \theta)\}$, where $L(\hat{\theta}_i, \theta)$, $i = 1, 2$, is the loss incurred when $\theta$ is estimated by $\hat{\theta}_i$.
In this paper, we consider the quadratic loss function $L(\hat{\theta}, \theta) = (\hat{\theta} - \theta)'G(\hat{\theta} - \theta)$ for a given nonnegative definite matrix $G$.

Definition 2. $\hat{\theta}_1$ is said to dominate $\hat{\theta}_2$ in PC (under the loss function $L$, for some parameter space $\Theta$) if $\operatorname{PC}(\hat{\theta}_1, \hat{\theta}_2, \theta) \ge 1/2$ for all $\theta \in \Theta$.
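Since PC is a probability, it can be approximated by Monte Carlo simulation when no closed form is at hand. The sketch below is an illustration under assumed settings (it is not the paper's procedure): it estimates the PC of the ridge estimator relative to the OLSE under the quadratic loss with $G = I$.

```python
import numpy as np

def pitman_closeness(n_rep=5000, n=50, p=4, k=0.5, seed=1):
    """Monte Carlo estimate of PC(ridge, OLSE) under quadratic loss, G = I."""
    rng = np.random.default_rng(seed)
    X = rng.standard_normal((n, p))          # fixed design across replications
    beta = np.array([1.0, 2.0, -1.0, 0.5])   # arbitrary true parameter
    XtX = X.T @ X
    wins = 0
    for _ in range(n_rep):
        y = X @ beta + rng.standard_normal(n)
        b_ols = np.linalg.solve(XtX, X.T @ y)
        b_ridge = np.linalg.solve(XtX + k * np.eye(p), X.T @ y)
        # Quadratic losses (beta_hat - beta)'(beta_hat - beta).
        if np.sum((b_ridge - beta) ** 2) < np.sum((b_ols - beta) ** 2):
            wins += 1
    return wins / n_rep

print(pitman_closeness())  # a value above 0.5 indicates PC dominance here
```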

3.1. Comparison of the Modified $r$-$k$ Class Estimator and the OLSE under the PC Criterion

Now, we give the comparison of the modified $r$-$k$ class estimator and the OLSE under the PC criterion.

Theorem 3. Let the modified $r$-$k$ class estimator be given in (8) and let the OLSE be given in (2). Then, if $k$ satisfies ..., where $m$ denotes the median of the corresponding central distribution, the modified $r$-$k$ class estimator is superior to the OLSE under the PC criterion.

Proof. By the definition of the PC criterion, consider $P\{L(\hat{\beta}_r(k, J), \beta) < L(\hat{\beta}, \beta)\}$. Define ... and ...; then we obtain .... Since ..., it follows that ....
On the other hand, ...; then, we obtain ....
Now, we let ... and ...; thus ... and ..., and (20) becomes ..., where ....
Since ... and, on the other hand, ..., we have .... Since ..., then ..., so (24) can be written as .... By the definition of the unbiased ridge estimator [2], we have ..., which is independent of ...; so we get .... By Chen (1981), letting ..., the condition of the theorem follows, where $m$ denotes the median of the corresponding central distribution. This completes the proof.

3.2. Comparison of the Modified $r$-$k$ Class Estimator and the PCR Estimator under the PC Criterion

Now we give the comparison of the modified $r$-$k$ class estimator and the PCR estimator under the PC criterion.

Theorem 4. For ..., the Pitman measure of closeness (PMC) of the modified $r$-$k$ class estimator relative to the PCR estimator is given as follows: ....

Proof. In this proof, we choose $G = \ldots$. Then, we have .... Next, we denote ...; since ..., it is easy to compute that .... Thus, using ..., ..., and ..., (28) can be written as .... For the modified $r$-$k$ class estimator, we have .... Now, we denote ...; by ..., we get .... Then, we may rewrite (30) as ...; then, by the definition of the PC criterion, the result follows.

Remark 5. It is difficult to compute the values of the PMC in Theorem 4 analytically, so, in the next section, we use a numerical example and a simulation study to compare the modified $r$-$k$ class estimator with the PCR estimator.

4. Numerical Example

To illustrate our theoretical results, we now consider in this section the data set on total national research and development expenditure as a percentage of gross national product, originally due to Gruber [16] and later considered by Akdeniz and Erol [17]. In this paper, we use the same data and try to show that the modified $r$-$k$ class estimator is superior to the OLSE and the PCR estimator. First, we assemble the data as follows: .... We can then compute ... with ....

Denote by PC1 the PC of the modified $r$-$k$ class estimator relative to the OLSE and by PC2 the PC of the modified $r$-$k$ class estimator relative to the PCR estimator.

The values of PC1 and PC2 are plotted in Figures 1 and 2, respectively.

Figure 1: The PC of the modified $r$-$k$ class estimator relative to the OLSE.
Figure 2: The PC of the modified $r$-$k$ class estimator relative to the PCR estimator.

From Figure 1, we can see that the values of PC1 are not always bigger than 0.5; that is to say, the modified $r$-$k$ class estimator is not always superior to the OLSE, which agrees with our Theorem 3. From Figure 2, we see that the values of PC2 are always bigger than 0.5; that is to say, the modified $r$-$k$ class estimator is always superior to the PCR estimator.

5. Simulation Results

In order to further illustrate the behaviour of the modified $r$-$k$ class estimator, we now consider a Monte Carlo simulation using different levels of multicollinearity. The explanatory variables are generated by the following equation [18]:

$$x_{ij} = (1 - \gamma^2)^{1/2}z_{ij} + \gamma z_{i,p+1}, \quad i = 1, \ldots, n, \ j = 1, \ldots, p,$$

where $z_{ij}$ are independent standard normal pseudorandom numbers and $\gamma$ is specified so that the correlation between any two explanatory variables is given by $\gamma^2$. The observations on the dependent variable are then generated by

$$y_i = \beta_1x_{i1} + \beta_2x_{i2} + \cdots + \beta_px_{ip} + e_i,$$

where $e_i$ are independent normal pseudorandom numbers with mean zero and variance $\sigma^2$. In this simulation study, we choose ..., ..., and .... The simulation results are given in Table 1.
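A brief Python sketch of this data-generating process; the particular values of $n$, $p$, $\gamma$, $\sigma$, and $\beta$ below are arbitrary illustrative choices, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative settings (the paper's exact choices may differ).
n, p = 50, 3
gamma = 0.99   # controls multicollinearity: corr(x_i, x_j) = gamma^2
sigma = 1.0
beta = np.ones(p)

# x_ij = (1 - gamma^2)^{1/2} z_ij + gamma z_{i,p+1},
# with z independent standard normal pseudorandom numbers.
Z = rng.standard_normal((n, p + 1))
X = np.sqrt(1 - gamma ** 2) * Z[:, :p] + gamma * Z[:, [p]]

# y_i = x_i' beta + e_i, with e_i ~ N(0, sigma^2).
y = X @ beta + sigma * rng.standard_normal(n)

print(np.corrcoef(X, rowvar=False))  # off-diagonal entries close to gamma^2
```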

Table 1: The values of PC1 and PC2 for different values of $\gamma$ and $k$.

From the simulation results in Table 1, we see that, in most cases, the modified $r$-$k$ class estimator performs better than the OLSE, which agrees with our theoretical results, and that the modified $r$-$k$ class estimator is always better than the PCR estimator. Thus, both the numerical example and the simulation study show that the modified $r$-$k$ class estimator is better than the PCR estimator.

6. Concluding Remarks

In this paper, we first give a new method to obtain the modified $r$-$k$ class estimator. We then compare the modified $r$-$k$ class estimator with the OLSE and the PCR estimator under the PC criterion. The comparison results show that, under certain conditions, the modified $r$-$k$ class estimator is superior to the OLSE. Finally, a numerical example and a simulation study are given to illustrate the theoretical results.

Conflict of Interests

The author declares that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

The author wishes to thank the referee and Editor for helpful suggestions and comments which helped in improving the quality of this paper. This work was supported by the Scientific Research Foundation of Chongqing University of Arts and Sciences (Grant no. R2013SC12), the National Natural Science Foundation of China (Grant no. 11201505), and the Program for Innovation Team Building at Institutions of Higher Education in Chongqing (Grant no. KJTD201321).

References

  1. A. E. Hoerl and R. W. Kennard, “Ridge regression: biased estimation for nonorthogonal problems,” Technometrics, vol. 12, no. 1, pp. 55–67, 1970.
  2. R. H. Crouse, C. Jin, and R. C. Hanumara, “Unbiased ridge estimation with prior information and ridge trace,” Communications in Statistics: Theory and Methods, vol. 24, no. 9, pp. 2341–2354, 1995.
  3. M. R. Özkale and S. Kaçiranlar, “The restricted and unrestricted two-parameter estimators,” Communications in Statistics: Theory and Methods, vol. 36, no. 13–16, pp. 2707–2725, 2007.
  4. W. F. Massy, “Principal components regression in exploratory statistical research,” Journal of the American Statistical Association, vol. 60, no. 309, pp. 234–256, 1965.
  5. M. Baye and D. Parker, “Combining ridge and principal component regression: a money demand illustration,” Communications in Statistics: Theory and Methods, vol. 13, no. 2, pp. 197–205, 1984.
  6. F. S. M. Batah, M. R. Özkale, and S. D. Gore, “Combining unbiased ridge and principal component regression estimators,” Communications in Statistics: Theory and Methods, vol. 38, no. 13–15, pp. 2201–2209, 2009.
  7. E. J. G. Pitman, “The closest estimates of statistical parameters,” Mathematical Proceedings of the Cambridge Philosophical Society, vol. 33, no. 2, pp. 212–222, 1937.
  8. C. R. Rao, “Some comments on the minimum mean square error as a criterion of estimation,” in Statistics and Related Topics, pp. 123–143, North-Holland, Amsterdam, The Netherlands, 1981.
  9. J. P. Keating, R. L. Mason, and P. K. Sen, Pitman's Measure of Closeness: Comparison of Statistical Estimators, SIAM, Philadelphia, Pa, USA, 1993.
  10. H. Yang, W. X. Li, and J. W. Xu, “Comparison of two estimators of parameters under Pitman nearness criterion,” Communications in Statistics: Theory and Methods, vol. 39, no. 17, pp. 3081–3094, 2010.
  11. E. Wencheko, “Comparison of regression estimators using Pitman measures of nearness,” Statistical Papers, vol. 42, no. 3, pp. 375–386, 2001.
  12. J. Ahmadi and N. Balakrishnan, “Pitman closeness of record values to population quantiles,” Statistics & Probability Letters, vol. 79, no. 19, pp. 2037–2044, 2009.
  13. J. Ahmadi and N. Balakrishnan, “Pitman closeness of current records for location-scale families,” Statistics & Probability Letters, vol. 80, no. 21-22, pp. 1577–1583, 2010.
  14. M. J. Jozani, “A note on Pitman’s measure of closeness with balanced loss function,” Statistics, 2012.
  15. J. B. Wu and H. Yang, “Two stochastic restricted principal components regression estimator in linear regression,” Communications in Statistics: Theory and Methods, vol. 42, no. 20, pp. 3793–3804, 2013.
  16. M. H. J. Gruber, Improving Efficiency by Shrinkage: The James-Stein and Ridge Regression Estimators, Marcel Dekker, New York, NY, USA, 1998.
  17. F. Akdeniz and H. Erol, “Mean squared error matrix comparisons of some biased estimators in linear regression,” Communications in Statistics: Theory and Methods, vol. 32, no. 12, pp. 2389–2413, 2003.
  18. K. Liu, “Using Liu-type estimator to combat collinearity,” Communications in Statistics: Theory and Methods, vol. 32, no. 5, pp. 1009–1020, 2003.