Comparison of Some Estimators under the Pitman’s Closeness Criterion in Linear Regression Model
Batah et al. (2009) combined the unbiased ridge estimator with the principal components regression estimator and introduced the modified r-k class estimator. They also showed that the modified r-k class estimator is superior to the ordinary least squares estimator and the principal components regression estimator in the mean squared error matrix sense. In this paper, we first give a new method to obtain the modified r-k class estimator; we then discuss its properties in some detail, comparing the modified r-k class estimator with the ordinary least squares estimator and the principal components regression estimator under the Pitman closeness criterion. A numerical example and a simulation study are given to illustrate our findings.
Consider the following multiple linear regression model: $y = X\beta + \varepsilon$, (1) where $y$ is an $n \times 1$ vector of observations, $X$ is an $n \times p$ known matrix of rank $p$, $\beta$ is a $p \times 1$ vector of unknown parameters, and $\varepsilon$ is an $n \times 1$ vector of disturbances with expectation $E(\varepsilon) = 0$ and variance-covariance matrix $\operatorname{Cov}(\varepsilon) = \sigma^2 I_n$.
The ordinary least squares estimator (OLSE) of $\beta$ is given as follows: $\hat{\beta} = (X'X)^{-1}X'y$. (2) The OLSE is no longer a reliable estimator in the presence of multicollinearity, so many remedial actions have been proposed to reduce its effects. One popular method is to consider a biased estimator. The best-known biased estimator is the ridge estimator introduced by Hoerl and Kennard: $\hat{\beta}(k) = (X'X + kI_p)^{-1}X'y$, $k > 0$. (3) As is well known, $\hat{\beta}(k)$ is a stable but biased estimator of $\beta$, and it approaches $0$ as $k \to \infty$.
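To make the two estimators above concrete, here is a short Python sketch; the design matrix, the true coefficients, and the noise level are all invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical design: 30 observations, 3 predictors, two nearly collinear.
n, p = 30, 3
X = rng.standard_normal((n, p))
X[:, 2] = 0.99 * X[:, 0] + 0.01 * rng.standard_normal(n)  # multicollinearity
beta = np.array([1.0, 2.0, 3.0])
y = X @ beta + 0.5 * rng.standard_normal(n)

def olse(X, y):
    """Ordinary least squares estimator: (X'X)^{-1} X'y."""
    return np.linalg.solve(X.T @ X, X.T @ y)

def ridge(X, y, k):
    """Hoerl-Kennard ridge estimator: (X'X + k I)^{-1} X'y."""
    return np.linalg.solve(X.T @ X + k * np.eye(X.shape[1]), X.T @ y)

b_ols = olse(X, y)
# The ridge estimator shrinks toward the zero vector as k grows.
norms = [np.linalg.norm(ridge(X, y, k)) for k in (0.1, 10.0, 1e6)]
```

The sequence `norms` is decreasing, reflecting the shrinkage of $\hat{\beta}(k)$ toward $0$ as $k \to \infty$.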
Crouse et al. proposed the unbiased ridge estimator as a convex combination of prior information with the OLSE, given as follows: $\hat{\beta}(k, J) = (X'X + kI_p)^{-1}(X'y + kJ)$, (4) where $J$ is a random vector with $E(J) = \beta$ and $\operatorname{Cov}(J) = (\sigma^2/k)I_p$. Özkale and Kaçiranlar used two different approaches to derive the unbiased ridge estimator, and they also compared the unbiased ridge estimator with the OLSE, the principal components regression estimator, the ridge estimator, and the r-k class estimator under the mean squared error matrix criterion.
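The unbiasedness of this estimator can be checked directly at the level of expectations: replacing $X'y$ and $J$ by their means $X'X\beta$ and $\beta$ returns $\beta$ exactly, since $(X'X + kI)^{-1}(X'X\beta + k\beta) = \beta$. A minimal sketch (hypothetical data; the helper name `unbiased_ridge` is ours):

```python
import numpy as np

rng = np.random.default_rng(1)
n, p, k = 25, 3, 2.0
X = rng.standard_normal((n, p))
beta = np.array([1.0, -2.0, 0.5])

def unbiased_ridge(X, y, k, J):
    """Unbiased ridge estimator in the style of Crouse et al.:
    (X'X + kI)^{-1}(X'y + kJ), with E(J) = beta, Cov(J) = (sigma^2/k) I."""
    return np.linalg.solve(X.T @ X + k * np.eye(X.shape[1]), X.T @ y + k * J)

# Unbiasedness check: plug in the means E(X'y) = X'X beta and E(J) = beta;
# the estimator then recovers beta exactly.
XtX = X.T @ X
mean_estimate = np.linalg.solve(XtX + k * np.eye(p), XtX @ beta + k * beta)
```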
Another popular way to combat multicollinearity is the principal components regression (PCR) estimator. For this, let us consider the spectral decomposition of the matrix $X'X$ given as $X'X = (T_r, T_{p-r}) \operatorname{diag}(\Lambda_r, \Lambda_{p-r}) (T_r, T_{p-r})'$, (5) where $\Lambda_r$ and $\Lambda_{p-r}$ are diagonal matrices such that the main diagonal elements of $\Lambda_r$ are the $r$ largest eigenvalues of $X'X$, while $\Lambda_{p-r}$ contains the remaining $p-r$ eigenvalues. The matrix $T = (T_r, T_{p-r})$ is orthogonal, with $T_r$ consisting of its first $r$ columns and $T_{p-r}$ consisting of the remaining $p-r$ columns. The PCR estimator of $\beta$ can be written as $\hat{\beta}_r = T_r \Lambda_r^{-1} T_r' X' y$. (6)
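A PCR sketch based on the eigendecomposition of $X'X$ follows (invented data; `pcr` is our own helper). A useful sanity check is that keeping all $p$ components recovers the OLSE:

```python
import numpy as np

rng = np.random.default_rng(2)
n, p = 40, 4
X = rng.standard_normal((n, p))
y = X @ np.array([1.0, 0.5, -1.0, 2.0]) + 0.3 * rng.standard_normal(n)

def pcr(X, y, r):
    """PCR estimator T_r Lambda_r^{-1} T_r' X'y, keeping the r largest
    eigenvalues of X'X and the corresponding eigenvectors."""
    lam, T = np.linalg.eigh(X.T @ X)   # eigenvalues in ascending order
    order = np.argsort(lam)[::-1]      # reorder so largest come first
    Tr = T[:, order[:r]]
    lam_r = lam[order[:r]]
    return Tr @ ((Tr.T @ (X.T @ y)) / lam_r)

b_pcr = pcr(X, y, r=2)
b_full = pcr(X, y, r=p)  # r = p keeps every component: equals the OLSE
```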
Baye and Parker introduced the r-k class estimator, which is given as follows: $\hat{\beta}_r(k) = T_r(\Lambda_r + kI_r)^{-1}T_r'X'y$, $k \ge 0$. (7)
Batah et al. combined the PCR estimator and the unbiased ridge estimator and proposed the modified r-k class estimator: $\hat{\beta}_r(k, J) = T_r(\Lambda_r + kI_r)^{-1}(T_r'X'y + kT_r'J)$, (8) where $J$ is a random vector with $E(J) = \beta$ and $\operatorname{Cov}(J) = (\sigma^2/k)I_p$. The modified r-k class estimator has the following properties: $E[\hat{\beta}_r(k, J)] = T_rT_r'\beta$, (9) $\operatorname{Cov}[\hat{\beta}_r(k, J)] = \sigma^2 T_r(\Lambda_r + kI_r)^{-1}T_r'$. (10)
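A sketch of the modified r-k class estimator, assuming the form $T_r(\Lambda_r + kI_r)^{-1}(T_r'X'y + kT_r'J)$ (data and the helper name `modified_rk` are our own). Two sanity checks fall out of the formula: setting $k = 0$ removes all dependence on $J$ and reduces to PCR, and taking $r = p$ recovers the unbiased ridge estimator:

```python
import numpy as np

rng = np.random.default_rng(3)
n, p, k = 40, 4, 0.8
X = rng.standard_normal((n, p))
beta = np.array([1.0, 0.5, -1.0, 2.0])
y = X @ beta + 0.3 * rng.standard_normal(n)
J = beta + rng.standard_normal(p)  # stand-in for the prior-information vector J

def modified_rk(X, y, r, k, J):
    """Modified r-k class estimator, assuming the form
    T_r (Lambda_r + k I_r)^{-1} (T_r' X'y + k T_r' J)."""
    lam, T = np.linalg.eigh(X.T @ X)
    order = np.argsort(lam)[::-1]
    Tr, lam_r = T[:, order[:r]], lam[order[:r]]
    rhs = Tr.T @ (X.T @ y) + k * (Tr.T @ J)
    return Tr @ (rhs / (lam_r + k))

b_mrk = modified_rk(X, y, r=2, k=k, J=J)
```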
Batah et al. also compared the modified r-k class estimator with the OLSE, the PCR estimator, and the r-k class estimator in the sense of the mean squared error matrix, and obtained necessary and sufficient conditions for the modified r-k class estimator to be superior to the OLSE and the PCR estimator.
Although the mean squared error matrix has been regarded as the primary criterion for comparing different estimators, the Pitman closeness (PC) criterion has received a great deal of attention in recent years. Rao discussed the similarities and differences between mean squared error and PC and aroused great interest in PC. The monograph by Keating et al. provided an illuminating account of PC and a long list of publications on comparisons of estimators of scalar functions of univariate parameters. Since then, many authors have used PC to compare estimators: Wencheko compared some estimators under the PC criterion in the linear regression model, Yang et al. compared two linear estimators under the PC criterion, and Ahmadi and Balakrishnan [12, 13] compared some order statistics under the PC criterion. Jozani studied PC using the balanced loss function.
Since, in most cases, the PC criterion is more suitable for comparing estimators, in this paper we first give a new method to obtain the modified r-k class estimator; we then compare the modified r-k class estimator with the OLSE and the PCR estimator and show that, under certain conditions, the modified r-k class estimator is superior to the OLSE and the PCR estimator under the PC criterion.
The rest of the paper is organized as follows. In Section 2, we give a new method to obtain the modified r-k class estimator, and the comparison results are given in Section 3. In Sections 4 and 5 we give a numerical example and a simulation study, respectively, to illustrate the behaviour of the estimators. Finally, some concluding remarks are given in Section 6.
2. The Modified r-k Class Estimator
The handling of multicollinearity by means of PCR corresponds to the transition from model (1) to the reduced model obtained by omitting $T_{p-r}$.
We suppose that there are stochastic linear restrictions on the parameter $\beta$ of the form $r = R\beta + e$, (11) where $R$ is an $m \times p$ matrix of rank $m$, $r$ is an $m \times 1$ vector, and $e$ is an $m \times 1$ vector of disturbances with mean $E(e) = 0$ and variance-covariance matrix $\operatorname{Cov}(e) = \sigma^2 W$; $W$ is assumed to be known and positive definite. Furthermore, it is also supposed that the random vector $e$ is stochastically independent of $\varepsilon$.
Now, let us consider the restriction (11) in the framework of PCR. Under the idea of PCR, the original restriction (11) becomes $r = R T_r T_r' \beta + e$, (12) where $T_r T_r' \beta$ replaces $\beta$. Then, Wu and Yang introduced the following estimator: $\hat{\beta}_r(R) = T_r\left(\Lambda_r + T_r' R' W^{-1} R T_r\right)^{-1}\left(T_r' X' y + T_r' R' W^{-1} r\right)$. (13) Let $J$ be a random vector; its expectation and covariance are given as $E(J) = \beta$ and $\operatorname{Cov}(J) = (\sigma^2/k) I_p$. Now, we let $R = I_p$, $W = (1/k) I_p$, $r = J$, and $e = J - \beta$; then (13) equals the modified r-k class estimator, that is, $\hat{\beta}_r(R) = T_r(\Lambda_r + k I_r)^{-1}(T_r' X' y + k T_r' J) = \hat{\beta}_r(k, J)$.
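The substitution $R = I_p$, $W = (1/k)I_p$, $r = J$ can be verified numerically. The sketch below is our own illustration, not the paper's computation; the data are invented, and the assumed form of the stochastically restricted PCR estimator (13) is stated in the docstring of the helper `mixed_pcr`:

```python
import numpy as np

rng = np.random.default_rng(7)
n, p, r_dim, k = 40, 4, 2, 0.8
X = rng.standard_normal((n, p))
beta = np.array([1.0, 0.5, -1.0, 2.0])
y = X @ beta + 0.3 * rng.standard_normal(n)
J = beta + rng.standard_normal(p)

lam, T = np.linalg.eigh(X.T @ X)
order = np.argsort(lam)[::-1]
Tr, lam_r = T[:, order[:r_dim]], lam[order[:r_dim]]

def mixed_pcr(R, W, rvec):
    """Stochastically restricted PCR estimator of the assumed form
    T_r (Lambda_r + T_r' R' W^{-1} R T_r)^{-1} (T_r' X'y + T_r' R' W^{-1} rvec)."""
    Winv = np.linalg.inv(W)
    A = np.diag(lam_r) + Tr.T @ R.T @ Winv @ R @ Tr
    b = Tr.T @ (X.T @ y) + Tr.T @ R.T @ Winv @ rvec
    return Tr @ np.linalg.solve(A, b)

# Substituting R = I_p, W = (1/k) I_p, rvec = J ...
b_mixed = mixed_pcr(np.eye(p), (1.0 / k) * np.eye(p), J)
# ... reproduces the modified r-k class estimator directly:
b_mrk = Tr @ np.linalg.solve(np.diag(lam_r) + k * np.eye(r_dim),
                             Tr.T @ (X.T @ y) + k * (Tr.T @ J))
```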
In the next section, we compare the modified r-k class estimator with the OLSE and the PCR estimator under the PC criterion.
3. Superiority of the Modified r-k Class Estimator under the PC Criterion
First, we give the definitions of PC and of the PC criterion.
Definition 1. Let $\hat{\theta}_1$ and $\hat{\theta}_2$ be two estimators of the unknown $p$-dimensional vector $\theta$. The PC of $\hat{\theta}_1$ relative to $\hat{\theta}_2$ in estimating $\theta$ under a loss function $L(\cdot, \cdot)$ is defined as $\mathrm{PC}(\hat{\theta}_1, \hat{\theta}_2 \mid \theta) = P\left[L(\hat{\theta}_1, \theta) < L(\hat{\theta}_2, \theta)\right]$.
In this paper, we consider the quadratic loss function $L(\hat{\theta}, \theta) = (\hat{\theta} - \theta)' G (\hat{\theta} - \theta)$ for a given nonnegative definite matrix $G$.
Definition 2. $\hat{\theta}_1$ is said to dominate $\hat{\theta}_2$ in PC (under the loss function $L$, over the parameter space $\Theta$) if $\mathrm{PC}(\hat{\theta}_1, \hat{\theta}_2 \mid \theta) \ge 1/2$ for all $\theta \in \Theta$.
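Definitions 1 and 2 suggest a direct Monte Carlo approximation of PC: simulate many responses and count how often one estimator incurs the smaller quadratic loss (here with $G = I$). The sketch below is our own illustration with an invented setup; it compares the OLSE against the trivial estimator that always returns zero:

```python
import numpy as np

def pitman_closeness(est1, est2, X, beta, sigma, reps=2000, seed=4):
    """Monte Carlo estimate of PC(est1, est2): the frequency with which
    est1 is strictly closer to beta than est2 under squared-error loss."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    wins = 0
    for _ in range(reps):
        y = X @ beta + sigma * rng.standard_normal(n)
        l1 = np.sum((est1(X, y) - beta) ** 2)
        l2 = np.sum((est2(X, y) - beta) ** 2)
        wins += l1 < l2
    return wins / reps

olse = lambda X, y: np.linalg.solve(X.T @ X, X.T @ y)
zero = lambda X, y: np.zeros(X.shape[1])

rng = np.random.default_rng(5)
X = rng.standard_normal((30, 3))
beta = np.array([1.0, 2.0, 3.0])
pc = pitman_closeness(olse, zero, X, beta, sigma=0.5)
# By Definition 2, olse dominates the zero estimator at this beta if pc >= 1/2.
```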
3.1. Comparison of the Modified r-k Class Estimator and the OLSE under the PC Criterion
Now, we give the comparison of the modified r-k class estimator and the OLSE under the PC criterion.
Theorem 3. Let the modified r-k class estimator be given in (8) and let the OLSE be given in (2). Then, if , where denotes the median of the central distribution with the corresponding degrees of freedom, the modified r-k class estimator is superior to the OLSE under the PC criterion.
Proof. By the definition of the PC criterion,
Define and ; then we obtain
On the other hand, Then, we obtain .
Now, we let and . Thus, and . Thus (20) becomes where .
Since , and, on the other hand, , we have . Since , we have ; then . Thus (24) can be written as follows. By the definition of the unbiased ridge estimator, we have , which is independent of . So we can get . By Chen (1981), letting , the result follows if , where denotes the median of the central distribution with the corresponding degrees of freedom.
3.2. Comparison of the Modified r-k Class Estimator and the PCR Estimator under the PC Criterion
Now, we give the comparison of the modified r-k class estimator and the PCR estimator under the PC criterion.
Theorem 4. For $k > 0$, the Pitman measure of closeness (PMC) of the modified r-k class estimator relative to the PCR estimator is given as follows:
Proof. In this proof, we choose . Then, we have Then, we denote ; since , it is easy to compute that . Thus, using and and , (28) can be written as For the modified r-k class estimator, we have Now, we denote . By , we get . Then, we may rewrite (30) as follows: Then, by the definition of the PC criterion,
Remark 5. It is difficult to compute these values, so, in the next section, we use a numerical example and a simulation study to compare the modified r-k class estimator with the PCR estimator.
4. Numerical Example
To illustrate our theoretical results, we consider in this section the data set on total national research and development expenditure as a percent of gross national product, originally due to Gruber and later considered by Akdeniz and Erol. We use the same data and show that the modified r-k class estimator is superior to the OLSE and the PCR estimator. First, we assemble the data as follows, and then compute the quantities required for the comparison.
From Figure 1, we can see that the values of PC1 are not always bigger than 0.5; that is to say, the modified r-k class estimator is not always superior to the OLSE, which agrees with Theorem 3. From Figure 2, we see that the values of PC2 are always bigger than 0.5; that is to say, the modified r-k class estimator is always superior to the PCR estimator.
5. Simulation Results
In order to further illustrate the behaviour of the modified r-k class estimator, we now consider a Monte Carlo simulation using different levels of multicollinearity. The explanatory variables are generated by the following equation: $x_{ij} = (1 - \gamma^2)^{1/2} z_{ij} + \gamma z_{i,p+1}$, $i = 1, \ldots, n$, $j = 1, \ldots, p$, where the $z_{ij}$ are independent standard normal pseudorandom numbers and $\gamma$ is specified so that the correlation between any two explanatory variables is given by $\gamma^2$. The observations on the dependent variable are then generated by $y_i = \beta_1 x_{i1} + \cdots + \beta_p x_{ip} + \varepsilon_i$, where the $\varepsilon_i$ are independent normal pseudorandom numbers with mean zero and variance $\sigma^2$. In this simulation study, we choose , , and . The simulation results are given in Table 1.
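The generation scheme for the explanatory variables can be sketched as follows; the sample size and the value of the correlation parameter below are our own illustrative choices, not those used in the paper's Table 1:

```python
import numpy as np

def make_collinear(n, p, gamma, seed=6):
    """Generate x_ij = sqrt(1 - gamma^2) z_ij + gamma z_{i,p+1}, so that
    any two distinct explanatory variables have correlation gamma^2."""
    rng = np.random.default_rng(seed)
    Z = rng.standard_normal((n, p + 1))
    return np.sqrt(1.0 - gamma**2) * Z[:, :p] + gamma * Z[:, [p]]

X = make_collinear(n=5000, p=4, gamma=0.99)
corr = np.corrcoef(X, rowvar=False)
# Off-diagonal entries of corr should be close to 0.99^2 = 0.9801.
```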
From the simulation results in Table 1, we see that, in most cases, the modified r-k class estimator performs better than the OLSE, which agrees with our theoretical results, and that it is always better than the PCR estimator. Thus, both the numerical example and the simulation study show that the modified r-k class estimator outperforms the PCR estimator.
6. Concluding Remarks
In this paper, we first gave a new method to obtain the modified r-k class estimator. We then compared the modified r-k class estimator with the OLSE and the PCR estimator under the PC criterion. The comparison results show that, under certain conditions, the modified r-k class estimator is superior to the OLSE. Finally, a numerical example and a simulation study were given to illustrate the theoretical results.
Conflict of Interests
The author declares that there is no conflict of interests regarding the publication of this paper.
Acknowledgments
The author wishes to thank the referee and the Editor for helpful suggestions and comments which helped to improve the quality of this paper. This work was supported by the Scientific Research Foundation of Chongqing University of Arts and Sciences (Grant no. R2013SC12), the National Natural Science Foundation of China (Grant no. 11201505), and the Program for Innovation Team Building at Institutions of Higher Education in Chongqing (Grant no. KJTD201321).
J. P. Keating, R. L. Mason, and P. K. Sen, Pitman's Measure of Closeness: A Comparison of Statistical Estimators, SIAM, Philadelphia, Pa, USA, 1993.
M. H. J. Gruber, Improving Efficiency by Shrinkage: The James-Stein and Ridge Regression Estimators, Marcel Dekker, New York, NY, USA, 1998.