International Journal of Mathematics and Mathematical Sciences

Volume 2017, Article ID 2045653, 12 pages

https://doi.org/10.1155/2017/2045653

## A Note on the Performance of Biased Estimators with Autocorrelated Errors

Department of Mathematics & Statistics, Banasthali University, Rajasthan 304022, India

Correspondence should be addressed to Gargi Tyagi; tyagi.gargi@gmail.com

Received 31 July 2016; Revised 20 November 2016; Accepted 7 December 2016; Published 30 January 2017

Academic Editor: Weimin Han

Copyright © 2017 Gargi Tyagi and Shalini Chandra. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

#### Abstract

It is a well-established fact in regression analysis that multicollinearity and autocorrelated errors have adverse effects on the properties of the least squares estimator. Huang and Yang (2015) and Chandra and Tyagi (2016) studied the PCTP estimator and the class estimator, respectively, to deal with both problems simultaneously and compared their performances with the estimators obtained as their special cases. However, to the best of our knowledge, the performance of the two estimators has not been compared so far. Hence, this paper is intended to compare the performance of these two estimators under the mean squared error (MSE) matrix criterion. Further, a simulation study is conducted to evaluate the superiority of the class estimator over the PCTP estimator by means of percentage relative efficiency. Furthermore, two numerical examples are given to illustrate the performance of the estimators.

#### 1. Introduction

Let us consider a linear regression model as
$$y = X\beta + \varepsilon, \tag{1}$$
where $y$ is an $n \times 1$ vector of observations on the dependent variable, $X$ is an $n \times p$ full column rank matrix of observations on explanatory variables, $\beta$ is a $p \times 1$ vector of unknown regression coefficients, and $\varepsilon$ is an $n \times 1$ vector of disturbance terms with mean vector $0$ and covariance matrix $\sigma^2 \Omega$.

The ordinary least squares estimator (OLSE) is one of the most widely used estimators for $\beta$, given as
$$\hat\beta_{\mathrm{OLS}} = (X'X)^{-1}X'y. \tag{2}$$

In the presence of multicollinearity among the explanatory variables, the OLSE becomes unstable and exhibits undesirable properties, such as inflated variances and wide confidence intervals, which lead to wrong inferences; sometimes it even produces wrong signs of the estimates.
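
A small numerical sketch (illustrative settings only, not from the paper) shows this inflation: the total sampling variance $\sigma^2\operatorname{trace}\bigl((X'X)^{-1}\bigr)$ of the OLSE explodes when two columns of $X$ are nearly collinear.

```python
# Illustration: how near-collinearity inflates the OLSE's sampling variance.
# All names and settings here are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
n = 100
z = rng.standard_normal(n)
X_ortho = np.column_stack([z, rng.standard_normal(n)])            # nearly orthogonal columns
X_coll = np.column_stack([z, z + 0.01 * rng.standard_normal(n)])  # nearly collinear columns

def olse_var_trace(X, sigma2=1.0):
    """Total sampling variance of the OLSE: sigma^2 * trace((X'X)^{-1})."""
    return sigma2 * np.trace(np.linalg.inv(X.T @ X))

print(olse_var_trace(X_ortho))  # small
print(olse_var_trace(X_coll))   # orders of magnitude larger
```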

Numerous alternative methods of estimation have been designed in the literature to lessen the effects of multicollinearity. For instance, Stein [1] proposed the Stein estimator, Hoerl and Kennard [2, 3] introduced the technique of the ordinary ridge regression estimator (ORRE), and Massy [4] suggested the principal component regression estimator (PCRE) to deal with the problem. Several authors have combined two techniques of estimation in the hope that the combination will retain the advantages of both. Baye and Parker [5] gave the $r$-$k$ class estimator by combining the PCRE and the ORRE, which includes the OLSE, ORRE, and PCRE as special cases. Nomura and Ohkubo [6] obtained conditions for the dominance of the $r$-$k$ class estimator over its special cases under the mean squared error (MSE) criterion. Liu [7] gave an estimator combining the advantages of the Stein estimator and the ORRE, known as the Liu estimator (LE). Kaçiranlar and Sakallıoğlu [8] proposed the $r$-$d$ class estimator, a combination of the LE and the PCRE, and showed the superiority of this class estimator over the OLSE, LE, and PCRE. Özkale and Kaçıranlar [9] proposed the two-parameter estimator (TPE) by utilizing the advantages of the ORRE and the LE and obtained a necessary and sufficient condition for the dominance of the TPE over the OLSE in the MSE matrix sense. Further, Yang and Chang [10] combined the ORRE and the LE in a different way, introduced an alternative two-parameter estimator (ATPE), and derived necessary and sufficient conditions for the superiority of the ATPE over the OLSE, ORRE, LE, and TPE under the MSE matrix criterion. Özkale [11] put forward a general class of estimators, the $r$-$(k,d)$ class estimator, which combines the TPE [9] and the PCRE, and evaluated its performance under the MSE criterion. Chang and Yang [12] suggested another general class of estimators by merging the PCRE and the ATPE [10], named the principal component two-parameter estimator (PCTPE), and analyzed its performance in the MSE matrix sense.

In applied work, it is quite common to have autocorrelation in the error terms; that is, $E(\varepsilon\varepsilon') = \sigma^2\Omega$, where $\Omega$ is a known symmetric positive definite (p.d.) matrix, and it is well known to statisticians that autocorrelated errors reduce the efficiency of the OLSE. Now, since $\Omega$ is a symmetric positive definite matrix, there exists a nonsingular matrix $P$ such that $P\Omega P' = I$. On premultiplying model (1) by $P$, we have
$$\tilde y = \tilde X\beta + \tilde\varepsilon, \tag{3}$$
where $\tilde y = Py$, $\tilde X = PX$, and $\tilde\varepsilon = P\varepsilon$. Note that $E(\tilde\varepsilon) = 0$ and $E(\tilde\varepsilon\tilde\varepsilon') = \sigma^2 I$.

To overcome the effect of autocorrelated errors, Aitken [13] proposed the generalized least squares estimator (GLSE) for $\beta$ in (1), which can be obtained by applying the least squares technique to model (3) as
$$\hat\beta_{\mathrm{GLS}} = (\tilde X'\tilde X)^{-1}\tilde X'\tilde y = (X'\Omega^{-1}X)^{-1}X'\Omega^{-1}y. \tag{4}$$
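
The whitening step can be sketched as follows, assuming an AR(1) form for $\Omega$ purely for illustration: with $P$ built from the spectral decomposition of $\Omega$, the GLSE computed directly coincides with OLS applied to the transformed data.

```python
# Sketch (assumed AR(1) Omega, illustrative settings): GLSE via the whitening
# transformation P with P Omega P' = I, so GLS on (y, X) equals OLS on (Py, PX).
import numpy as np

rng = np.random.default_rng(1)
n, p, rho = 50, 3, 0.6
X = rng.standard_normal((n, p))
beta = np.ones(p)

# AR(1)-type covariance structure Omega_ij = rho^{|i-j|} (known, positive definite)
Omega = rho ** np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
y = X @ beta + rng.multivariate_normal(np.zeros(n), Omega)

# P = Lambda^{-1/2} Q' from the spectral decomposition Omega = Q Lambda Q'
lam, Q = np.linalg.eigh(Omega)
P = np.diag(lam ** -0.5) @ Q.T

# GLSE two ways: direct formula (4) and OLS on the transformed model (3)
gls_direct = np.linalg.solve(X.T @ np.linalg.inv(Omega) @ X,
                             X.T @ np.linalg.inv(Omega) @ y)
Xt, yt = P @ X, P @ y
gls_whitened = np.linalg.solve(Xt.T @ Xt, Xt.T @ yt)

print(np.allclose(gls_direct, gls_whitened))  # True
```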

It has been observed that the problems of autocorrelation and multicollinearity arise together in several cases. Keeping this in mind, a good amount of literature has been devoted to studying these problems simultaneously, by Trenkler [14], Firinguetti [15], G. M. Bayhan and M. Bayhan [16], Alheety and Kibria [17], Özkale [18], Güler and Kaçiranlar [19], Alkhamisi [20], Yang and Wu [21], Eledum and Alkhalifa [22], Şiray [23], and Chandra and Sarkar [24], to name a few.

Further, to define the estimators, let $T = (t_1, \dots, t_p)$ be a $p \times p$ orthogonal matrix with $T'\tilde X'\tilde X T = \Lambda$, where $\Lambda = \operatorname{diag}(\lambda_1, \dots, \lambda_p)$ is a diagonal matrix of eigenvalues of the $\tilde X'\tilde X$ matrix such that $\lambda_1 \ge \lambda_2 \ge \cdots \ge \lambda_p > 0$. Now, let $T_r$ be the $p \times r$ orthogonal matrix obtained after deleting the last $p - r$ columns from the $T$ matrix, where $r \le p$. Thus, $T_r'\tilde X'\tilde X T_r = \Lambda_r$, where $\Lambda_r = \operatorname{diag}(\lambda_1, \dots, \lambda_r)$ and $T_r'T_r = I_r$. Also, $\alpha = T'\beta$ denotes the coefficient vector in the canonical form.
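
This notation can be sketched numerically (names and settings are illustrative): $T$ diagonalizes the transformed cross-product matrix, and $T_r$ keeps the leading $r$ eigenvectors.

```python
# Sketch of the eigen-notation above: T'(X'X)T = Lambda with descending
# eigenvalues, and T_r retains the first r columns of T.
import numpy as np

rng = np.random.default_rng(2)
n, p, r = 40, 4, 2
Xt = rng.standard_normal((n, p))           # stands in for the whitened X

S = Xt.T @ Xt                              # p x p, symmetric positive definite
lam, T = np.linalg.eigh(S)                 # eigh returns ascending eigenvalues
order = np.argsort(lam)[::-1]              # re-sort: lambda_1 >= ... >= lambda_p
lam, T = lam[order], T[:, order]

T_r = T[:, :r]                             # delete the last p - r columns
Lambda_r = np.diag(lam[:r])

print(np.allclose(T_r.T @ S @ T_r, Lambda_r))  # T_r' S T_r = Lambda_r
```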

Chandra and Tyagi [25] modified the $r$-$(k,d)$ class estimator [11] to address multicollinearity and autocorrelated errors simultaneously; the resulting class estimator is given in (5), where $k > 0$ and $0 < d < 1$.

Huang and Yang [26] proposed the PCTP estimator in the presence of autocorrelated errors, given in (6).

Some other biased estimators proposed in the presence of multicollinearity and autocorrelation can be obtained as special cases: the $r$-$d$ class estimator proposed by Şiray et al. [23], the GLSE of Aitken [13], the ridge regression estimator (RRE) given by Trenkler [14], and so forth. These special cases have been compared with the class estimator and the PCTP estimator by Chandra and Tyagi [25] and Huang and Yang [26], respectively. Hence, this paper focuses on the comparison of the performance of the two general estimators.
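
To make the nesting concrete, the sketch below implements one of these special cases, the $r$-$k$ class estimator in its standard Baye-Parker form applied to whitened data (an assumed standard form from the literature, not the paper's exact expressions), and checks that it collapses to the GLSE when $k = 0$ and $r = p$.

```python
# Sketch: the r-k class estimator T_r (Lambda_r + k I)^{-1} T_r' X'y on the
# whitened model (standard Baye-Parker form, assumed here) nests the GLSE
# (k = 0, r = p) and the ridge estimator (r = p).
import numpy as np

rng = np.random.default_rng(3)
n, p = 60, 4
Xt = rng.standard_normal((n, p))                 # whitened regressors P X
yt = Xt @ np.ones(p) + rng.standard_normal(n)    # whitened response P y

lam, T = np.linalg.eigh(Xt.T @ Xt)
order = np.argsort(lam)[::-1]
lam, T = lam[order], T[:, order]

def rk_class(k, r):
    """r-k class estimator on the whitened model (Baye-Parker form)."""
    T_r = T[:, :r]
    return T_r @ np.diag(1.0 / (lam[:r] + k)) @ T_r.T @ Xt.T @ yt

glse = np.linalg.solve(Xt.T @ Xt, Xt.T @ yt)
print(np.allclose(rk_class(k=0.0, r=p), glse))   # True: k = 0, r = p recovers GLSE
```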

Further, the rest of the paper is organized as follows: the necessary and sufficient condition for the dominance of the PCTP estimator over the class estimator under the MSE matrix criterion is derived in Section 2. Some methods for selecting the unknown biasing parameters are given in Section 3. Section 4 is devoted to a simulation study comparing these estimators under the MSE criterion. Section 5 includes two numerical examples. Finally, the paper is summed up in Section 6 with some concluding remarks.

#### 2. MSE Matrix Comparison of the Class Estimator and the PCTP Estimator

The MSE matrix criterion is a strong and one of the most widely used criteria for the comparison of estimators. Let $\tilde\beta$ be an estimator of $\beta$; then the expression for the MSE matrix is given as
$$M(\tilde\beta) = \operatorname{Cov}(\tilde\beta) + b(\tilde\beta)\,b(\tilde\beta)',$$
where $\operatorname{Cov}(\tilde\beta)$ and $b(\tilde\beta)$ are the covariance matrix and bias vector of $\tilde\beta$, respectively.
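
As a numerical check of this decomposition, the sketch below estimates the MSE matrix of a deliberately biased ridge estimator by simulation and compares it with $\operatorname{Cov} + b\,b'$; all settings are hypothetical and chosen only for illustration.

```python
# Monte Carlo check of M(beta_tilde) = Cov(beta_tilde) + b b' for a ridge
# estimator beta_hat = A y with A = (X'X + kI)^{-1} X'; sigma^2 = 1 assumed.
import numpy as np

rng = np.random.default_rng(4)
n, p, k, reps = 30, 3, 2.0, 20000
X = rng.standard_normal((n, p))
beta = np.array([1.0, -1.0, 0.5])
A = np.linalg.inv(X.T @ X + k * np.eye(p)) @ X.T

# Empirical MSE matrix: average of (beta_hat - beta)(beta_hat - beta)'
est = np.array([A @ (X @ beta + rng.standard_normal(n)) for _ in range(reps)])
dev = est - beta
M_emp = dev.T @ dev / reps

# Theoretical MSE matrix: covariance plus squared-bias term
Cov = A @ A.T                       # Cov(A y) with Cov(y) = I
bias = A @ X @ beta - beta
M_theory = Cov + np.outer(bias, bias)

print(np.max(np.abs(M_emp - M_theory)))  # small Monte Carlo error
```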

From (5) and (6), the covariance matrices and bias vectors of the class estimator and the PCTP estimator can be obtained directly.

Thus, the MSE matrices of the estimators can be given as in (9) and (10). To compare the performance of these estimators, the difference of the MSE matrices can be obtained as in (11).

On further simplification, the difference can be written as in (12). It is easy to note that the covariance difference appearing there is positive definite. For the convenience of the derivation of the dominance conditions, we state the following lemma.

Lemma 1. *Assume that $\hat\beta_1$, $\hat\beta_2$ are two competing linear estimators of $\beta$. Suppose that $D = \operatorname{Cov}(\hat\beta_1) - \operatorname{Cov}(\hat\beta_2) > 0$, where $\operatorname{Cov}(\hat\beta_j)$, $j = 1, 2$, denotes the covariance matrix of $\hat\beta_j$. Then $M(\hat\beta_1) - M(\hat\beta_2) > 0$ if and only if $b_2'(D + b_1 b_1')^{-1} b_2 < 1$, where $b_j$ denotes the bias vector of $\hat\beta_j$.*

From the expressions in (5) and (6), it is easy to verify that both the class estimator and the PCTP estimator can be written as linear estimators of the form required by the lemma. Further, it is evident from (12) that the covariance difference is a positive definite matrix. Thus, from the above lemma, the MSE matrix difference is positive definite if and only if the condition in (13) holds. Hence, the comparison under the MSE matrix criterion can be concluded in the following theorem.

Theorem 2. *The PCTP estimator dominates the class estimator in the MSE matrix sense if and only if the condition in (13) holds.*

#### 3. Selection of $k$ and $d$

It is an important problem to find optimum values of the biasing parameters. A general approach to selecting an optimum value of a biasing parameter is to minimize the scalar MSE of the estimator.

##### 3.1. For the Class Estimator

The scalar MSE of the class estimator can be obtained by taking the trace of the MSE matrix in (9), which is given in (14), where $\alpha_i$ denotes the $i$th component of $\alpha = T'\beta$. The optimum value of $k$ for fixed $d$ and $r$ can be obtained by differentiating the scalar MSE with respect to $k$ and equating it to zero. The first derivative with respect to $k$ for fixed $d$ and $r$ is given in (15); on equating it to zero, we get the values $k_i$ for the class estimator in (16). By taking the harmonic mean of the values in (16), as suggested by Hoerl et al. [27], together with their arithmetic mean and geometric mean [28], we propose the corresponding estimators of $k$ in (17). Further, the positiveness of the $k_i$ can be ensured by restricting $d$ appropriately; it can be noted that at the boundary value of $d$ the harmonic mean is not defined and the arithmetic mean gives zero. In this way we can choose a value of $d$ satisfying the restriction, and the value of $k$ is then obtained by substituting $d$ in (17).
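
The three averaging rules can be sketched as follows. Since (16) is estimator-specific, the classical Hoerl-Kennard candidates $k_i = \hat\sigma^2/\hat\alpha_i^2$ are assumed here purely for illustration.

```python
# Sketch of the harmonic-, arithmetic-, and geometric-mean rules applied to
# candidate ridge values k_i; the Hoerl-Kennard choice k_i = sigma^2/alpha_i^2
# is assumed in place of the paper's (16).
import numpy as np

def k_hm(ks):
    return len(ks) / np.sum(1.0 / ks)     # harmonic mean (Hoerl et al. [27])

def k_am(ks):
    return np.mean(ks)                    # arithmetic mean (Kibria [28])

def k_gm(ks):
    return np.exp(np.mean(np.log(ks)))    # geometric mean (Kibria [28])

sigma2 = 1.0
alpha = np.array([2.0, 1.0, 0.5, 0.25])   # hypothetical canonical coefficients
ks = sigma2 / alpha ** 2                  # candidate values k_i
print(k_hm(ks), k_am(ks), k_gm(ks))
```

By the mean inequality, the harmonic-mean rule always yields the smallest and the arithmetic-mean rule the largest amount of shrinkage parameter among the three.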

Alternatively, for fixed $k$ and $r$, the optimum value of $d$ for the class estimator, obtained by minimizing the scalar MSE with respect to $d$, is given in (18). Clearly, this value is positive when the stated condition on $k$ holds. Hence, we can choose a value of $k$ and, making use of this value, find the optimum value of $d$.

##### 3.2. For the PCTP Estimator

The scalar MSE of the PCTP estimator, obtained by taking the trace of the MSE matrix in (10), is given in (19). The first-order derivative of the scalar MSE with respect to $k$ is given in (20).

The optimum value of $k$ for the PCTP estimator is then obtained in (21).

Since the optimum value in (21) depends on unknown parameters, following Hoerl et al. [27] and Kibria [28], we propose the corresponding estimators in (22). Further, the positiveness of these values can be ensured by restricting $d$ appropriately. In this way we can choose a value of $d$ satisfying the restriction for the PCTP estimator, and the value of $k$ is then obtained by substituting $d$ in (22).

Alternatively, for fixed $k$ and $r$, the optimum values of $d$ for the PCTP estimator, obtained by minimizing the scalar MSE with respect to $d$, are given in (23); two cases arise there, depending on whether the defining condition holds for some indices or for all of them.

Further, the values of $k$ and $d$ can be easily obtained by replacing the unknown parameters $\sigma^2$ and $\alpha$ with their unbiased estimators.

#### 4. Monte Carlo Study

In this section, we evaluate the performance of the estimators through Monte Carlo simulation. Following McDonald and Galarneau [29] and Gibbons [30], the $X$ matrix has been generated as
$$x_{ij} = (1-\gamma^2)^{1/2} z_{ij} + \gamma z_{i,p+1}, \quad i = 1, \dots, n,\; j = 1, \dots, p,$$
where the $z_{ij}$ are standard normal pseudorandom numbers and the $x$-variables are thereby generated such that the correlation between any pair of $x$-variables is $\gamma^2$. In this study, we consider the values of $\gamma^2$ to be 0.90, 0.95, and 0.99. Following McDonald and Galarneau [29], Gibbons [30], Kibria [28], and others, $\beta$ has been chosen as the normalized eigenvector corresponding to the largest eigenvalue of the $X'X$ matrix. The dependent variable is obtained by
$$y_i = \beta_1 x_{i1} + \beta_2 x_{i2} + \cdots + \beta_p x_{ip} + \varepsilon_i.$$

Following Firinguetti [15], Judge et al. [31], and Chandra and Sarkar [24], the errors $\varepsilon_t$ are generated from an AR(1) process as
$$\varepsilon_t = \rho\,\varepsilon_{t-1} + u_t,$$
where the $u_t$ are independent normal pseudorandom numbers with mean $0$ and variance $\sigma_u^2$ and $\rho$ is the autoregressive coefficient such that $|\rho| < 1$. The covariance matrix for AR(1) errors is given by
$$\Omega = \frac{\sigma_u^2}{1-\rho^2}
\begin{pmatrix}
1 & \rho & \rho^2 & \cdots & \rho^{n-1} \\
\rho & 1 & \rho & \cdots & \rho^{n-2} \\
\vdots & \vdots & \vdots & \ddots & \vdots \\
\rho^{n-1} & \rho^{n-2} & \rho^{n-3} & \cdots & 1
\end{pmatrix}.$$
The value of $r$ is decided by a scree plot drawn between the eigenvalues and the components (see Johnson and Wichern [32]). In this simulation, the design parameters were fixed at selected values. Then the experiment is repeated 2000 times by generating new errors in every repetition, and the estimated MSE (EMSE) is calculated by the following formula:
$$\mathrm{EMSE}(\tilde\beta) = \frac{1}{2000}\sum_{i=1}^{2000}\bigl(\tilde\beta_{(i)} - \beta\bigr)'\bigl(\tilde\beta_{(i)} - \beta\bigr),$$
where $\tilde\beta_{(i)}$ is the estimated value of $\beta$ in the $i$th iteration. To compare the performances of the estimators, the percentage relative efficiency of the class estimator over the PCTP estimator has been calculated as the ratio of their EMSE values multiplied by 100.
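
The experimental design above can be sketched as follows. Because the closed forms of the class and PCTP estimators come from [25, 26], the OLSE and GLSE serve here as stand-in estimators; the block therefore illustrates only the data generation, the EMSE, and the percentage-relative-efficiency computation, with all settings hypothetical.

```python
# Sketch of the Monte Carlo design: McDonald-Galarneau regressors, AR(1)
# errors, EMSE over repetitions, and percentage relative efficiency (PRE).
# OLSE and GLSE are illustrative stand-ins for the paper's two estimators.
import numpy as np

rng = np.random.default_rng(5)
n, p, gamma2, rho, sigma, reps = 50, 4, 0.95, 0.6, 1.0, 1000

# McDonald-Galarneau regressors: pairwise correlation gamma^2
z = rng.standard_normal((n, p + 1))
X = np.sqrt(1 - gamma2) * z[:, :p] + np.sqrt(gamma2) * z[:, [p]]

# beta: normalized eigenvector of the largest eigenvalue of X'X
lam, vec = np.linalg.eigh(X.T @ X)
beta = vec[:, -1]

# AR(1) error covariance: Omega = sigma^2/(1 - rho^2) * [rho^{|i-j|}]
Omega = (sigma**2 / (1 - rho**2)) * rho ** np.abs(
    np.subtract.outer(np.arange(n), np.arange(n)))
Omega_inv = np.linalg.inv(Omega)

def emse(estimates):
    """Estimated MSE: average squared distance from the true beta."""
    dev = np.asarray(estimates) - beta
    return np.mean(np.sum(dev * dev, axis=1))

ols, gls = [], []
for _ in range(reps):
    # AR(1) errors: eps_t = rho * eps_{t-1} + u_t, stationary start
    u = sigma * rng.standard_normal(n)
    eps = np.zeros(n)
    eps[0] = u[0] / np.sqrt(1 - rho**2)
    for t in range(1, n):
        eps[t] = rho * eps[t - 1] + u[t]
    y = X @ beta + eps
    ols.append(np.linalg.solve(X.T @ X, X.T @ y))
    gls.append(np.linalg.solve(X.T @ Omega_inv @ X, X.T @ Omega_inv @ y))

pre = 100 * emse(ols) / emse(gls)
print(pre)  # PRE of the GLSE over the OLSE under AR(1) errors
```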

For brevity, we have reported some selected results in Tables 1–3, which give the percentage relative efficiency of the class estimator over the PCTP estimator at the three chosen values of the relevant design parameter, respectively.