Abstract

This paper is concerned with parameter estimation in the linear regression model. To overcome the multicollinearity problem, two new classes of estimators, the almost unbiased ridge-type principal component estimator (AURPCE) and the almost unbiased Liu-type principal component estimator (AULPCE), are proposed. The mean squared error matrices of the proposed estimators are derived and compared, and some properties of the proposed estimators are also discussed. Finally, a Monte Carlo simulation study is given to illustrate the performance of the proposed estimators.

1. Introduction

Consider the following multiple linear regression model:
$$y = X\beta + \varepsilon, \qquad (1)$$
where $y$ is an $n \times 1$ vector of responses, $X$ is an $n \times p$ known design matrix of rank $p$, $\beta$ is a $p \times 1$ vector of unknown parameters, $\varepsilon$ is an $n \times 1$ vector of disturbances assumed to be distributed with mean vector $0$ and variance-covariance matrix $\sigma^2 I_n$, and $I_n$ is an identity matrix of order $n$.

According to the Gauss-Markov theorem, the ordinary least squares estimator (OLSE) of $\beta$ in (1) is obtained as follows:
$$\hat\beta = (X'X)^{-1}X'y.$$

The OLSE has long been treated as the best estimator of $\beta$. However, many results have shown that the OLSE is no longer a good estimator when multicollinearity is present. To overcome this problem, many biased estimators have been proposed, such as the principal components regression estimator (PCRE) [1], the ridge estimator [2], the Liu estimator [3], the almost unbiased ridge estimator [4], and the almost unbiased Liu estimator [5].

In the hope that a combination of two different estimators might inherit the advantages of both, Kaçıranlar et al. [6] improved Liu's approach and introduced the restricted Liu estimator. Akdeniz and Erol [7] compared some biased estimators in linear regression in the mean squared error matrix (MSEM) sense. By combining the mixed estimator and the Liu estimator, Hubert and Wijekoon [8] obtained the two-parameter estimator, a general estimator that includes the OLSE, the ridge estimator, and the Liu estimator as special cases. Baye and Parker [9] proposed the $r$-$k$ class estimator, which includes as special cases the PCRE, the ridge estimator (RE), and the OLSE. Then, Kaçıranlar and Sakallıoğlu [10] proposed the $r$-$d$ class estimator, which is a generalization of the OLSE, PCRE, and Liu estimator. Based on the $r$-$k$ class estimator and the $r$-$d$ class estimator, Xu and Yang [11] considered the restricted $r$-$k$ class estimator and the restricted $r$-$d$ class estimator, and Wu and Yang [12] introduced the stochastic restricted $r$-$k$ class estimator and the stochastic restricted $r$-$d$ class estimator, respectively.

The primary aim of this paper is to introduce two new classes of estimators, one of which includes the OLSE, PCRE, and AURE as special cases, while the other includes the OLSE, PCRE, and AULE as special cases, thereby providing alternative methods to overcome multicollinearity in linear regression.

The paper is organized as follows. In Section 2, the new estimators are introduced. In Section 3, some properties of the new estimators are discussed. A Monte Carlo simulation is given in Section 4. Finally, some conclusions are given in Section 5.

2. The New Estimators

In the linear model given by (1), the almost unbiased ridge estimator (AURE) proposed by Singh et al. [4] and the almost unbiased Liu estimator (AULE) proposed by Akdeniz and Kaçıranlar [5] are defined as
$$\hat\beta_{\mathrm{AURE}}(k) = \bigl(I_p - k^2 (X'X + kI_p)^{-2}\bigr)\hat\beta,$$
$$\hat\beta_{\mathrm{AULE}}(d) = \bigl(I_p - (1-d)^2 (X'X + I_p)^{-2}\bigr)\hat\beta,$$
respectively, where $k > 0$ and $0 < d < 1$.

Now consider the spectral decomposition of the matrix $X'X$ given as
$$X'X = (T_h, T_{p-h}) \begin{pmatrix} \Lambda_h & 0 \\ 0 & \Lambda_{p-h} \end{pmatrix} (T_h, T_{p-h})',$$
where $\Lambda_h = \operatorname{diag}(\lambda_1, \ldots, \lambda_h)$, $\Lambda_{p-h} = \operatorname{diag}(\lambda_{h+1}, \ldots, \lambda_p)$, and $\lambda_1 \geq \lambda_2 \geq \cdots \geq \lambda_p > 0$ are the ordered eigenvalues of $X'X$. The matrix $T = (T_h, T_{p-h})$ is orthogonal, with $T_h$ consisting of its first $h$ columns and $T_{p-h}$ consisting of the remaining $p-h$ columns of the matrix $T$. Then $T_h'X'XT_h = \Lambda_h$; the PCRE of $\beta$ can be written as
$$\hat\beta_{\mathrm{PCRE}} = T_h (T_h'X'XT_h)^{-1} T_h'X'y = T_h \Lambda_h^{-1} T_h'X'y.$$
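For concreteness, the following is a minimal NumPy sketch of the PCRE built from this spectral decomposition; the function name `pcre` and the use of `numpy.linalg.eigh` are illustrative choices, not part of the paper.

```python
import numpy as np

def pcre(X, y, h):
    """Principal component regression estimator T_h Lambda_h^{-1} T_h' X' y,
    where T_h contains the eigenvectors of X'X belonging to the h largest eigenvalues."""
    lam, T = np.linalg.eigh(X.T @ X)   # eigenvalues in ascending order
    lam, T = lam[::-1], T[:, ::-1]     # reorder so that lambda_1 >= ... >= lambda_p
    T_h, lam_h = T[:, :h], lam[:h]
    return T_h @ ((T_h.T @ X.T @ y) / lam_h)
```

Taking $h = p$ recovers the OLSE, since $T\Lambda^{-1}T' = (X'X)^{-1}$.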

The $r$-$k$ class estimator proposed by Baye and Parker [9] and the $r$-$d$ class estimator proposed by Kaçıranlar and Sakallıoğlu [10] are defined as
$$\hat\beta_{r,k} = T_h (\Lambda_h + kI_h)^{-1} T_h'X'y, \qquad k > 0,$$
$$\hat\beta_{r,d} = T_h (\Lambda_h + I_h)^{-1} (\Lambda_h + dI_h) \Lambda_h^{-1} T_h'X'y, \qquad 0 < d < 1.$$

Following Xu and Yang [11], the $r$-$k$ class estimator and the $r$-$d$ class estimator can be rewritten as follows:
$$\hat\beta_{r,k} = T_hT_h'\hat\beta_R(k), \qquad \hat\beta_{r,d} = T_hT_h'\hat\beta_L(d),$$
where $\hat\beta_R(k) = (X'X + kI_p)^{-1}X'y$ is the ridge estimator of Hoerl and Kennard [2] and $\hat\beta_L(d) = (X'X + I_p)^{-1}(X'X + dI_p)\hat\beta$ is the Liu estimator proposed by Liu [3].

Now, we propose two new classes of estimators by combining the PCRE with the AURE and the AULE, that is, the almost unbiased ridge-type principal component estimator (AURPCE) and the almost unbiased Liu-type principal component estimator (AULPCE), as follows:
$$\hat\beta_{\mathrm{AURPCE}}(k) = T_hT_h'\hat\beta_{\mathrm{AURE}}(k) = T_h\bigl(I_h - k^2(\Lambda_h + kI_h)^{-2}\bigr)\Lambda_h^{-1}T_h'X'y,$$
$$\hat\beta_{\mathrm{AULPCE}}(d) = T_hT_h'\hat\beta_{\mathrm{AULE}}(d) = T_h\bigl(I_h - (1-d)^2(\Lambda_h + I_h)^{-2}\bigr)\Lambda_h^{-1}T_h'X'y,$$
respectively, where $k > 0$ and $0 < d < 1$.

From the definition of the AURPCE, we can easily obtain the following. If $k = 0$, then $\hat\beta_{\mathrm{AURPCE}}(k) = \hat\beta_{\mathrm{PCRE}}$. If $h = p$ and $k = 0$, then $\hat\beta_{\mathrm{AURPCE}}(k) = \hat\beta$. If $h = p$, then $\hat\beta_{\mathrm{AURPCE}}(k) = \hat\beta_{\mathrm{AURE}}(k)$.

From the definition of the AULPCE, we can similarly obtain the following. If $d = 1$, then $\hat\beta_{\mathrm{AULPCE}}(d) = \hat\beta_{\mathrm{PCRE}}$. If $h = p$ and $d = 1$, then $\hat\beta_{\mathrm{AULPCE}}(d) = \hat\beta$. If $h = p$, then $\hat\beta_{\mathrm{AULPCE}}(d) = \hat\beta_{\mathrm{AULE}}(d)$.

So the AURPCE can be regarded as a generalization of the PCRE, OLSE, and AURE, while the AULPCE can be regarded as a generalization of the PCRE, OLSE, and AULE.
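These reductions are easy to verify numerically. The sketch below, under the same assumed NumPy conventions as before (all names and settings are illustrative), implements the AURPCE directly from its definition and checks that $k = 0$ with $h = p$ gives the OLSE and that $h = p$ gives the AURE; the AULPCE cases can be checked analogously.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, h = 50, 4, 2
X = rng.standard_normal((n, p))
y = rng.standard_normal(n)

S = X.T @ X
lam, T = np.linalg.eigh(S)
lam, T = lam[::-1], T[:, ::-1]                    # descending eigenvalues

def aurpce(k, h):
    """AURPCE: T_h (I_h - k^2 (Lambda_h + k I_h)^{-2}) Lambda_h^{-1} T_h' X' y."""
    a = 1.0 - k**2 / (lam[:h] + k)**2
    return T[:, :h] @ (a / lam[:h] * (T[:, :h].T @ X.T @ y))

beta_ols = np.linalg.solve(S, X.T @ y)
beta_pcre = aurpce(k=0.0, h=h)                    # k = 0 gives the PCRE with h components

assert np.allclose(aurpce(k=0.0, h=p), beta_ols)  # k = 0 and h = p give the OLSE

k = 0.7
A_k = np.eye(p) - k**2 * np.linalg.matrix_power(np.linalg.inv(S + k * np.eye(p)), 2)
assert np.allclose(aurpce(k=k, h=p), A_k @ beta_ols)   # h = p gives the AURE
```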

Furthermore, letting $A_k = I_h - k^2(\Lambda_h + kI_h)^{-2}$, we can compute the bias, dispersion matrix, and mean squared error matrix of the AURPCE as
$$\operatorname{Bias}\bigl(\hat\beta_{\mathrm{AURPCE}}(k)\bigr) = (T_hA_kT_h' - I_p)\beta,$$
$$D\bigl(\hat\beta_{\mathrm{AURPCE}}(k)\bigr) = \sigma^2 T_hA_k\Lambda_h^{-1}A_kT_h',$$
$$\operatorname{MSEM}\bigl(\hat\beta_{\mathrm{AURPCE}}(k)\bigr) = \sigma^2 T_hA_k\Lambda_h^{-1}A_kT_h' + (T_hA_kT_h' - I_p)\beta\beta'(T_hA_kT_h' - I_p)',$$
respectively.

In a similar way, letting $B_d = I_h - (1-d)^2(\Lambda_h + I_h)^{-2}$, we can get the MSEM of the AULPCE as follows:
$$\operatorname{MSEM}\bigl(\hat\beta_{\mathrm{AULPCE}}(d)\bigr) = \sigma^2 T_hB_d\Lambda_h^{-1}B_dT_h' + (T_hB_dT_h' - I_p)\beta\beta'(T_hB_dT_h' - I_p)'.$$
In particular, if we let $h = p$ in the two MSEM expressions above, then we get the MSEM of the AURE and AULE as follows:
$$\operatorname{MSEM}\bigl(\hat\beta_{\mathrm{AURE}}(k)\bigr) = \sigma^2 \tilde A_k (X'X)^{-1} \tilde A_k + (\tilde A_k - I_p)\beta\beta'(\tilde A_k - I_p)',$$
$$\operatorname{MSEM}\bigl(\hat\beta_{\mathrm{AULE}}(d)\bigr) = \sigma^2 \tilde B_d (X'X)^{-1} \tilde B_d + (\tilde B_d - I_p)\beta\beta'(\tilde B_d - I_p)',$$
where $\tilde A_k = I_p - k^2(X'X + kI_p)^{-2}$ and $\tilde B_d = I_p - (1-d)^2(X'X + I_p)^{-2}$.
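To show how these MSEM expressions are used in the comparisons of the next section, here is a small numerical sketch, assuming the expressions above and illustrative values of $\beta$, $\sigma^2$, $k$, and $h$ (none of which come from the paper). It evaluates $\operatorname{MSEM}(\hat\beta_{\mathrm{AURE}}(k)) - \operatorname{MSEM}(\hat\beta_{\mathrm{AURPCE}}(k))$ and reports its smallest eigenvalue, which is nonnegative exactly when the AURPCE dominates the AURE in the MSEM sense for that configuration.

```python
import numpy as np

def msem_aurpce(X, beta, sigma2, k, h):
    """MSEM of the AURPCE: sigma^2 T_h A_k Lambda_h^{-1} A_k T_h'
    + (T_h A_k T_h' - I) beta beta' (T_h A_k T_h' - I)'."""
    p = X.shape[1]
    lam, T = np.linalg.eigh(X.T @ X)
    lam, T = lam[::-1], T[:, ::-1]
    T_h, lam_h = T[:, :h], lam[:h]
    a = 1.0 - k**2 / (lam_h + k)**2
    bias_factor = T_h @ np.diag(a) @ T_h.T - np.eye(p)
    return sigma2 * T_h @ np.diag(a**2 / lam_h) @ T_h.T + bias_factor @ np.outer(beta, beta) @ bias_factor.T

def msem_aure(X, beta, sigma2, k):
    """MSEM of the AURE with A_k = I - k^2 (X'X + k I)^{-2}."""
    p = X.shape[1]
    S = X.T @ X
    A_k = np.eye(p) - k**2 * np.linalg.matrix_power(np.linalg.inv(S + k * np.eye(p)), 2)
    bias_factor = A_k - np.eye(p)
    return sigma2 * A_k @ np.linalg.inv(S) @ A_k + bias_factor @ np.outer(beta, beta) @ bias_factor.T

rng = np.random.default_rng(1)
X = rng.standard_normal((60, 4))
beta = np.array([1.0, 2.0, -1.0, 0.5])
diff = msem_aure(X, beta, sigma2=1.0, k=0.5) - msem_aurpce(X, beta, sigma2=1.0, k=0.5, h=3)
print("smallest eigenvalue of MSEM(AURE) - MSEM(AURPCE):", np.linalg.eigvalsh(diff).min())
```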

3. Superiority of the Proposed Estimators

For the sake of convenience, we first list some notations, definitions, and lemmas needed in the following discussion. For a matrix $A$, the symbols $A'$, $A^{+}$, $\operatorname{rank}(A)$, $\mathcal{R}(A)$, and $\mathcal{N}(A)$ stand for the transpose, Moore-Penrose inverse, rank, column space, and null space of $A$, respectively. $A \geq 0$ means that $A$ is nonnegative definite and symmetric.

Lemma 1. Let be the set of complex matrices, let be the subset of consisting of Hermitian matrices, and , and stand for the conjugate transpose, the range, and the set of all generalized inverses, respectively. Let and be linearly independent, , , and if , let . Then if and only if one of the following sets of conditions holds:(a); (b); (c), ,
where is a subunitary matrix ( possibly absent), a positive-definite diagonal matrix (occurring when is present), and a positive scalar. Further, all expressions in (a), (b), and (c) are independent of the choice of .

Proof. Lemma 1 is due to Baksalary and Trenkler [13].
Let us consider the comparison between the AURPCE and the AURE, and between the AULPCE and the AULE, respectively. From the MSEM expressions in Section 2, we have
$$\Delta_1 = \operatorname{MSEM}\bigl(\hat\beta_{\mathrm{AURE}}(k)\bigr) - \operatorname{MSEM}\bigl(\hat\beta_{\mathrm{AURPCE}}(k)\bigr) = \sigma^2 D_1 + a_1a_1' - b_1b_1',$$
$$\Delta_2 = \operatorname{MSEM}\bigl(\hat\beta_{\mathrm{AULE}}(d)\bigr) - \operatorname{MSEM}\bigl(\hat\beta_{\mathrm{AULPCE}}(d)\bigr) = \sigma^2 D_2 + a_2a_2' - b_2b_2',$$
where $D_1 = \tilde A_k(X'X)^{-1}\tilde A_k - T_hA_k\Lambda_h^{-1}A_kT_h'$, $D_2 = \tilde B_d(X'X)^{-1}\tilde B_d - T_hB_d\Lambda_h^{-1}B_dT_h'$, $a_1 = (\tilde A_k - I_p)\beta$, $b_1 = (T_hA_kT_h' - I_p)\beta$, $a_2 = (\tilde B_d - I_p)\beta$, and $b_2 = (T_hB_dT_h' - I_p)\beta$.
Now, we will use Lemma 1 to discuss the differences $\Delta_1$ and $\Delta_2$, following Sarkar [14] and Xu and Yang [11]. It is noted that the assumptions made in Theorems 2 and 3 below are reasonable, since they are equivalent to requiring that the corresponding partitioned matrix be block diagonal with its second main diagonal block invertible.

Theorem 2. Suppose that and is invertible; then the AURPCE is superior to the AURE if and only if , where .

Proof. Since then we have And the Moore-Penrose inverse of is Note that , , is a positive definite matrix since is supposed to be invertible and , so . Moreover, where . This implies that . So the conditions of part (b) in Lemma 1 can be employed. Since and , it is concluded that in our case. Thus, it follows from Lemma 1 that the is superior to in the MSEM sense if and only if . Observing that where , thus the necessary and sufficient condition turns out to be.

Theorem 3. Suppose that and is invertible; then the new estimator AULPCE is superior to the AULE if and only if , where .

Proof. In order to apply Lemma 1, we can similarly compute that Therefore, the Moore-Penrose inverse of is given by Since , then. Moreover, where . This implies that . So the conditions of part (b) in Lemma 1 can be employed. Since and , it is concluded that in our case. Thus, it follows from Lemma 1 that the is superior to in the MSEM sense if and only if . Observing that where , thus the necessary and sufficient condition turns out to be .

4. Monte Carlo Simulation

In order to illustrate the behaviour of the AURPCE and AULPCE, we perform a Monte Carlo simulation study. Following Li and Yang [15], the explanatory variables and the observations on the dependent variable are generated by
$$x_{ij} = (1-\rho^2)^{1/2} z_{ij} + \rho z_{i,p+1}, \qquad i = 1, \ldots, n, \; j = 1, \ldots, p,$$
$$y_i = \beta_1 x_{i1} + \beta_2 x_{i2} + \cdots + \beta_p x_{ip} + \varepsilon_i, \qquad i = 1, \ldots, n,$$
where the $z_{ij}$ are independent standard normal pseudorandom numbers and $\rho$ is specified so that the correlation between any two explanatory variables is given by $\rho^2$. In this experiment, we choose and . Let us consider the AURPCE, AULPCE, AURE, AULE, PCRE, and OLSE and compute their respective estimated MSE values under different levels of multicollinearity, namely, to show weak, strong, and severe collinear relationships between the explanatory variables (see Tables 1 and 2). Furthermore, for the convenience of comparison, we plot the estimated MSE values of the estimators when in Figure 1.
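The following is a minimal sketch of a simulation along these lines. The data-generating scheme follows the description above, while the specific settings (n = 50, p = 4, h = 2, sigma = 1, k = d = 0.5, rho in {0.9, 0.99, 0.999}, 2000 replications, and the true beta) are illustrative assumptions and are not the paper's actual choices.

```python
import numpy as np

rng = np.random.default_rng(2024)
n, p, h, sigma, k, d, reps = 50, 4, 2, 1.0, 0.5, 0.5, 2000   # assumed settings
beta = np.ones(p) / np.sqrt(p)                               # assumed true coefficients

def estimators(X, y):
    """OLSE, PCRE, AURPCE, and AULPCE, following the definitions in Section 2."""
    S = X.T @ X
    lam, T = np.linalg.eigh(S)
    lam, T = lam[::-1], T[:, ::-1]
    T_h, lam_h = T[:, :h], lam[:h]
    a = 1.0 - k**2 / (lam_h + k)**2                # AURPCE weights
    b = 1.0 - (1.0 - d)**2 / (lam_h + 1.0)**2      # AULPCE weights
    proj = T_h.T @ X.T @ y
    return {"OLSE": np.linalg.solve(S, X.T @ y),
            "PCRE": T_h @ (proj / lam_h),
            "AURPCE": T_h @ (a * proj / lam_h),
            "AULPCE": T_h @ (b * proj / lam_h)}

for rho in (0.9, 0.99, 0.999):
    Z = rng.standard_normal((n, p + 1))
    X = np.sqrt(1.0 - rho**2) * Z[:, :p] + rho * Z[:, [p]]   # corr(x_i, x_j) ~ rho^2
    mse = dict.fromkeys(("OLSE", "PCRE", "AURPCE", "AULPCE"), 0.0)
    for _ in range(reps):
        y = X @ beta + sigma * rng.standard_normal(n)
        for name, est in estimators(X, y).items():
            mse[name] += np.sum((est - beta) ** 2) / reps
    print(rho, {name: round(value, 4) for name, value in mse.items()})
```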

From the simulation results shown in Tables 1 and 2, we can see that, in most cases, the AURPCE and AULPCE have smaller estimated MSE values than the AURE, AULE, PCRE, and OLSE, respectively, which agrees with our theoretical findings. From Figure 1, the AURPCE and AULPCE also have more stable and smaller estimated MSE values. We can see that our estimators are meaningful in practice.

5. Conclusion

In this paper, we introduce two new classes of biased estimators to provide alternative methods for dealing with multicollinearity in the linear model. We also show that the new estimators are superior to their competitors in the MSEM criterion under certain conditions. Finally, a Monte Carlo simulation study is given to illustrate the better performance of the proposed estimators.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

This work was supported by the National Natural Science Foundation of China (no. 11201505) and the Fundamental Research Funds for the Central Universities (no. 0208005205012).