Modelling and Simulation in Engineering
Volume 2019, Article ID 6342702, 10 pages
https://doi.org/10.1155/2019/6342702
Research Article

A Modified New Two-Parameter Estimator in a Linear Regression Model

1Department of Physical Sciences, Landmark University, Omu-Aran, Nigeria
2Department of Statistics, Federal University of Technology, Akure, Nigeria
3School of Mathematical Sciences, Universiti Sains Malaysia, Malaysia

Correspondence should be addressed to Adewale F. Lukman; adewale.folaranmi@lmu.edu.ng

Received 31 December 2018; Revised 25 February 2019; Accepted 5 March 2019; Published 26 May 2019

Academic Editor: Zhiping Qiu

Copyright © 2019 Adewale F. Lukman et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

The literature has shown that the ordinary least squares estimator (OLSE) is not best when the explanatory variables are related, that is, when multicollinearity is present; the estimator becomes unstable and can give misleading conclusions. In this study, a modified new two-parameter estimator based on prior information for the vector of parameters is proposed to circumvent the problem of multicollinearity. The new estimator includes the ordinary least squares estimator (OLSE), the ridge regression estimator (RRE), the Liu estimator (LE), the modified ridge regression estimator (MRRE), and the modified Liu estimator (MLE) as special cases. Furthermore, the superiority of the new estimator over the OLSE, RRE, LE, MRRE, MLE, and the two-parameter estimator proposed by Özkale and Kaçıranlar (2007) is established using the mean squared error matrix criterion. Finally, a numerical example and a simulation study are conducted to illustrate the theoretical results.

1. Introduction

The general linear regression model in matrix form is defined as
$y = X\beta + \varepsilon$, (1)
where $y$ is an $n \times 1$ vector of the dependent variable, $X$ is a known $n \times p$ full-rank matrix of explanatory variables, $\beta$ is a $p \times 1$ vector of regression coefficients, and $\varepsilon$ is an $n \times 1$ vector of disturbances such that $E(\varepsilon) = 0$ and $\mathrm{Cov}(\varepsilon) = \sigma^2 I_n$. The ordinary least squares estimator (OLSE) of $\beta$ in model (1) is defined as
$\hat\beta = (X'X)^{-1}X'y$. (2)
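As a concrete reference point, the following is a minimal numpy sketch of model (1) and the OLSE in (2); the design matrix, coefficients, and noise level are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated instance of model (1): y = X beta + eps, with illustrative sizes.
n, p = 50, 3
X = rng.normal(size=(n, p))                   # full-rank design matrix
beta = np.array([0.8, 0.1, 0.6])              # illustrative coefficients
y = X @ beta + rng.normal(scale=0.5, size=n)

# OLSE of equation (2): solve the normal equations (X'X) beta = X'y.
beta_ols = np.linalg.solve(X.T @ X, X.T @ y)
print(beta_ols)
```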

According to the Gauss–Markov theorem, the OLS estimator is the best linear unbiased estimator: it possesses minimum variance in the class of all linear unbiased estimators. However, different studies have shown that the OLS estimator is not best when the explanatory variables are related, that is, when multicollinearity is present [1]; the estimator becomes unstable and can give misleading conclusions. Many biased estimators have been proposed as alternatives to the OLSE to circumvent this problem. These include the Stein estimator [2], the principal components estimator [3], the ridge regression estimator (RRE) [1], the contraction estimator [4], the modified ridge regression estimator (MRRE) [5], and the Liu estimator [6].

Hoerl and Kennard [1] proposed the ridge regression estimator (RRE)
$\hat\beta_{RRE} = (X'X + kI_p)^{-1}X'y$, (3)
where $k > 0$. The RRE is obtained by augmenting the equation $0 = k^{1/2}\beta + \varepsilon'$ to the original equation (1) and then applying the OLS estimator. Mayer and Willke [4] defined the contraction estimator
$\hat\beta_{CE} = d\hat\beta$, $0 < d < 1$. (4)

Liu [6] combined the Stein estimator with the ridge estimator to combat the problem of multicollinearity. The Liu estimator (LE) is obtained by augmenting the equation $d\hat\beta = \beta + \varepsilon'$ to the original equation (1) and then applying OLS. It is defined as follows:
$\hat\beta_{LE} = (X'X + I_p)^{-1}(X'y + d\hat\beta)$, (5)
where $0 < d < 1$.
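A short sketch of the augmentation idea behind (3) and (5), on illustrative data: appending the rows $\sqrt{k}\,I_p$ with zero responses and applying OLS reproduces the ridge estimator exactly, and the Liu estimator is computed directly from (5).

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 50, 3
X = rng.normal(size=(n, p))
y = X @ np.array([0.8, 0.1, 0.6]) + rng.normal(scale=0.5, size=n)
beta_ols = np.linalg.solve(X.T @ X, X.T @ y)

k, d = 0.5, 0.4                               # illustrative biasing parameters

# Ridge directly from equation (3) ...
beta_rre = np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)

# ... and via augmentation: stack sqrt(k)*I rows with zero responses,
# then apply OLS to the augmented system; X_aug'X_aug = X'X + kI.
X_aug = np.vstack([X, np.sqrt(k) * np.eye(p)])
y_aug = np.concatenate([y, np.zeros(p)])
beta_aug = np.linalg.solve(X_aug.T @ X_aug, X_aug.T @ y_aug)
assert np.allclose(beta_rre, beta_aug)

# Liu estimator from equation (5): (X'X + I)^{-1}(X'y + d*beta_ols).
beta_le = np.linalg.solve(X.T @ X + np.eye(p), X.T @ y + d * beta_ols)
```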

Swindel [5] modified the ridge estimator by incorporating prior information. The modified ridge regression estimator (MRRE) is defined as follows:
$\hat\beta_{MRRE}(k, b) = (X'X + kI_p)^{-1}(X'y + kb)$, (6)
where $b$ represents the prior information on $\beta$. The MRRE tends to $b$ as $k$ tends to infinity, and it returns the estimates of the OLS estimator when $k = 0$.

Based on prior information, Li and Yang [7] proposed the modified Liu estimator (MLE):
$\hat\beta_{MLE}(d, b) = (X'X + I_p)^{-1}(X'y + d\hat\beta + (1 - d)b)$. (7)

The MLE includes the OLSE and the LE as special cases. In recent times, different researchers have suggested the use of two-parameter estimators to handle multicollinearity. Özkale and Kaçıranlar [8] proposed the two-parameter estimator (TPE), which is defined as
$\hat\beta_{TPE}(k, d) = (X'X + kI_p)^{-1}(X'y + kd\hat\beta)$, (8)
where $k > 0$ and $0 < d < 1$. The TPE includes the OLSE, RRE, LE, and contraction estimators as special cases.

The primary focus of this study is to provide an alternative method in a linear regression model to circumvent the problem of multicollinearity. A modified two-parameter estimator (MTPE) is proposed based on prior information and is compared with the OLSE, LE, RRE, MRRE, MLE, and TPE using the mean squared error matrix (MSEM) criterion. The article is structured as follows: We introduce the new estimator in Section 2. In Section 3, we establish the superiority of the new estimator. Section 4 addresses the selection of the biasing parameters. Section 5 consists of a numerical example and a simulation study. Concluding remarks are provided in Section 6.

2. Modified Two-Parameter Estimator

Let $B_k = (X'X + kI_p)^{-1}X'X$. The MRRE in equation (6) can be re-expressed as
$\hat\beta_{MRRE} = B_k\hat\beta + (I_p - B_k)b$. (9)

Similarly, let $F_d = (X'X + I_p)^{-1}(X'X + dI_p)$; the modified Liu estimator in equation (7) can then be written as
$\hat\beta_{MLE} = F_d\hat\beta + (I_p - F_d)b$. (10)

The MRRE and the MLE are thus convex combinations of the prior information and the OLS estimator. From equation (8), $\hat\beta_{TPE} = (X'X + kI_p)^{-1}(X'X + kdI_p)\hat\beta$; therefore, the modified two-parameter estimator based on the prior information can be defined as follows:
$\hat\beta_{MTPE}(k, d, b) = (X'X + kI_p)^{-1}\left(X'y + k(d\hat\beta + (1 - d)b)\right)$. (11)

Also, the MTPE is a convex combination of the prior information and the OLSE: with $G_{k,d} = (X'X + kI_p)^{-1}(X'X + kdI_p)$, equation (11) can be written as $\hat\beta_{MTPE} = G_{k,d}\hat\beta + (I_p - G_{k,d})b$. It includes the OLSE, RRE, MRRE, LE, and MLE as special cases. The following cases are possible (checked numerically in the sketch below):
$k = 0$: ordinary least squares estimator;
$k = 1$, $b = 0$: Liu estimator;
$d = 0$: modified ridge regression estimator;
$d = 0$, $b = 0$: ridge regression estimator;
$k = 1$: modified Liu estimator.
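A minimal numpy sketch of the MTPE in (11) together with checks of three of the special cases above; the data and parameter values are illustrative assumptions.

```python
import numpy as np

def mtpe(X, y, k, d, b):
    """Modified two-parameter estimator (11):
    (X'X + kI)^{-1} (X'y + k*(d*beta_ols + (1 - d)*b))."""
    p = X.shape[1]
    beta_ols = np.linalg.solve(X.T @ X, X.T @ y)
    return np.linalg.solve(X.T @ X + k * np.eye(p),
                           X.T @ y + k * (d * beta_ols + (1 - d) * b))

rng = np.random.default_rng(2)
X = rng.normal(size=(50, 3))
y = X @ np.array([0.8, 0.1, 0.6]) + rng.normal(scale=0.5, size=50)
b = np.full(3, 0.95)                                 # illustrative prior vector
beta_ols = np.linalg.solve(X.T @ X, X.T @ y)

# Special cases listed above:
assert np.allclose(mtpe(X, y, 0.0, 0.4, b), beta_ols)          # k = 0: OLSE
ridge = np.linalg.solve(X.T @ X + 2.0 * np.eye(3), X.T @ y)
assert np.allclose(mtpe(X, y, 2.0, 0.0, np.zeros(3)), ridge)   # d = 0, b = 0: RRE
liu = np.linalg.solve(X.T @ X + np.eye(3), X.T @ y + 0.4 * beta_ols)
assert np.allclose(mtpe(X, y, 1.0, 0.4, np.zeros(3)), liu)     # k = 1, b = 0: LE
```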

Suppose there exists an orthogonal matrix $T$ such that $T'X'XT = \Lambda = \mathrm{diag}(\lambda_1, \ldots, \lambda_p)$, where $\lambda_i$ is the $i$th eigenvalue of $X'X$; $\Lambda$ and $T$ are the matrices of eigenvalues and eigenvectors of $X'X$, respectively. Substituting $Z = XT$ and $\alpha = T'\beta$ in model (1), the equivalent model can be rewritten as
$y = Z\alpha + \varepsilon$. (12)

The estimators above then have the following canonical representations, where $b$ now denotes the prior information expressed in the canonical coordinates (a numerical check is given in the sketch below):
$\hat\alpha = \Lambda^{-1}Z'y$; (13)
$\hat\alpha_{RRE} = (\Lambda + kI_p)^{-1}\Lambda\hat\alpha$; (14)
$\hat\alpha_{LE} = (\Lambda + I_p)^{-1}(\Lambda + dI_p)\hat\alpha$; (15)
$\hat\alpha_{MRRE} = (\Lambda + kI_p)^{-1}(\Lambda\hat\alpha + kb)$; (16)
$\hat\alpha_{MLE} = (\Lambda + I_p)^{-1}\left((\Lambda + dI_p)\hat\alpha + (1 - d)b\right)$; (17)
$\hat\alpha_{TPE} = (\Lambda + kI_p)^{-1}(\Lambda + kdI_p)\hat\alpha$; (18)
$\hat\alpha_{MTPE} = (\Lambda + kI_p)^{-1}\left((\Lambda + kdI_p)\hat\alpha + k(1 - d)b\right)$. (19)
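To make the canonical form concrete, here is a short numpy check that the canonical MTPE in (19) agrees with the original-coordinate definition in (11); the data, prior, and parameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.normal(size=(50, 3))
y = X @ np.array([0.8, 0.1, 0.6]) + rng.normal(scale=0.5, size=50)
k, d = 0.5, 0.4
b = np.full(3, 0.95)

# Spectral decomposition X'X = T Lambda T' as in equation (12).
lam, T = np.linalg.eigh(X.T @ X)
Z = X @ T
alpha_ols = (Z.T @ y) / lam                   # Lambda^{-1} Z'y, equation (13)
b_star = T.T @ b                              # prior in canonical coordinates

# Canonical MTPE (19): (Lambda + kI)^{-1}((Lambda + kd I) alpha + k(1-d) b*).
alpha_mtpe = ((lam + k * d) * alpha_ols + k * (1 - d) * b_star) / (lam + k)

# Agrees with the original-coordinate estimator (11) after rotating back.
beta_ols = np.linalg.solve(X.T @ X, X.T @ y)
beta_mtpe = np.linalg.solve(X.T @ X + k * np.eye(3),
                            X.T @ y + k * (d * beta_ols + (1 - d) * b))
assert np.allclose(T @ alpha_mtpe, beta_mtpe)
```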

The following notation and lemmas are needed to prove the statistical properties of $\hat\alpha_{MTPE}$.

Lemma 1. Let $M$ be an $n \times n$ positive definite matrix, that is, $M > 0$, and let $c$ be a nonzero $n \times 1$ vector. Then $M - cc' \ge 0$ if and only if $c'M^{-1}c \le 1$ [9].

Lemma 2. Let $\hat\beta_1$ and $\hat\beta_2$ be two linear estimators of $\beta$ with bias vectors $b_1$ and $b_2$, respectively. Suppose that $D = \mathrm{Cov}(\hat\beta_1) - \mathrm{Cov}(\hat\beta_2) > 0$, where $\mathrm{Cov}(\hat\beta_j)$ denotes the covariance matrix of $\hat\beta_j$, $j = 1, 2$. Consequently,
$\mathrm{MSEM}(\hat\beta_1) - \mathrm{MSEM}(\hat\beta_2) = D + b_1b_1' - b_2b_2' \ge 0$
if and only if $b_2'(D + b_1b_1')^{-1}b_2 \le 1$, where $\mathrm{MSEM}(\hat\beta_j) = \mathrm{Cov}(\hat\beta_j) + b_jb_j'$ [10].

3. Establishing Superiority of Modified Two-Parameter Estimator Using MSEM Criterion

In this section, MTPE is compared with the following estimators: OLS, RRE, LE, MRRE, MLE, and TPE.

3.1. Comparison between the MTPE and OLS Using MSEM Criterion

From representation (19), the bias vector and covariance matrix of the MTPE are obtained as follows:
$\mathrm{Bias}(\hat\alpha_{MTPE}) = k(1 - d)(\Lambda + kI_p)^{-1}(b - \alpha)$, (20)
$\mathrm{Cov}(\hat\alpha_{MTPE}) = \sigma^2(\Lambda + kI_p)^{-1}(\Lambda + kdI_p)\Lambda^{-1}(\Lambda + kdI_p)(\Lambda + kI_p)^{-1}$. (21)

Recall that $\mathrm{MSEM}(\cdot) = \mathrm{Cov}(\cdot) + \mathrm{Bias}(\cdot)\mathrm{Bias}(\cdot)'$, and let $\delta = b - \alpha$. Hence,
$\mathrm{MSEM}(\hat\alpha_{MTPE}) = \sigma^2(\Lambda + kI_p)^{-1}(\Lambda + kdI_p)\Lambda^{-1}(\Lambda + kdI_p)(\Lambda + kI_p)^{-1} + k^2(1 - d)^2(\Lambda + kI_p)^{-1}\delta\delta'(\Lambda + kI_p)^{-1}$. (22)

From representation (13), the MSEM of the OLSE is given as
$\mathrm{MSEM}(\hat\alpha) = \sigma^2\Lambda^{-1}$. (23)

Comparing (22) and (23),
$\mathrm{MSEM}(\hat\alpha) - \mathrm{MSEM}(\hat\alpha_{MTPE}) = V - k^2(1 - d)^2(\Lambda + kI_p)^{-1}\delta\delta'(\Lambda + kI_p)^{-1}$, (24)
where $V = \sigma^2\left[\Lambda^{-1} - (\Lambda + kI_p)^{-1}(\Lambda + kdI_p)\Lambda^{-1}(\Lambda + kdI_p)(\Lambda + kI_p)^{-1}\right]$.

Let k > 0 and 0 < d < 1. Thus, the following theorem holds.

Theorem 3. Consider the two competing homogeneous linear estimators $\hat\alpha$ and $\hat\alpha_{MTPE}$. If $k > 0$ and $0 < d < 1$, the estimator $\hat\alpha_{MTPE}$ is superior to the estimator $\hat\alpha$ using the MSEM criterion, that is, $\mathrm{MSEM}(\hat\alpha) - \mathrm{MSEM}(\hat\alpha_{MTPE}) \ge 0$, if and only if
$b_2'V^{-1}b_2 \le 1$, (25)
where $b_2 = \mathrm{Bias}(\hat\alpha_{MTPE}) = k(1 - d)(\Lambda + kI_p)^{-1}\delta$ is given in (20).

Proof. Using (21) and (23), the covariance difference $V$ in (24) is diagonal with $i$th entry
$\sigma^2\left[(\lambda_i + k)^2 - (\lambda_i + kd)^2\right]/\left[\lambda_i(\lambda_i + k)^2\right]$.
Since $(\lambda_i + k)^2 - (\lambda_i + kd)^2 = k(1 - d)\left(2\lambda_i + k(1 + d)\right) > 0$ for $0 < d < 1$ and $k > 0$, $V$ is positive definite. The OLSE is unbiased, so applying Lemma 2 with $D = V$, $b_1 = 0$, and $b_2 = \mathrm{Bias}(\hat\alpha_{MTPE})$ completes the proof.
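A numerical illustration of Theorem 3 may help: the sketch below evaluates the Lemma 2 condition (25) on an assumed spectrum, prior, and error variance (all values illustrative, not from the paper).

```python
import numpy as np

# Illustrative (assumed) problem: eigenvalues of X'X, true canonical
# coefficients, prior, error variance, and biasing parameters.
lam = np.array([10.0, 1.0, 0.01])
alpha = np.array([0.8, 0.1, 0.6])
b = np.full(3, 0.95)
sigma2, k, d = 0.25, 0.5, 0.4
delta = b - alpha

# Covariance difference V of (24); diagonal, positive for k > 0, 0 < d < 1.
V = np.diag(sigma2 * (1 / lam - (lam + k * d) ** 2 / (lam * (lam + k) ** 2)))

# Bias of the MTPE from (20); the OLSE is unbiased.
bias = k * (1 - d) * delta / (lam + k)

# Theorem 3 / Lemma 2: MTPE beats OLSE in MSEM iff bias' V^{-1} bias <= 1.
condition = bias @ np.linalg.solve(V, bias)
print("MTPE superior to OLSE:", condition <= 1)
```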

3.2. Comparison between the MTPE and RRE Using MSEM Criterion

From representation (14), the bias vector and covariance matrix of the RRE are given as follows:
$\mathrm{Bias}(\hat\alpha_{RRE}) = -k(\Lambda + kI_p)^{-1}\alpha$,
$\mathrm{Cov}(\hat\alpha_{RRE}) = \sigma^2(\Lambda + kI_p)^{-1}\Lambda(\Lambda + kI_p)^{-1}$. (26)

Hence,
$\mathrm{MSEM}(\hat\alpha_{RRE}) = \sigma^2(\Lambda + kI_p)^{-1}\Lambda(\Lambda + kI_p)^{-1} + k^2(\Lambda + kI_p)^{-1}\alpha\alpha'(\Lambda + kI_p)^{-1}$. (27)
The difference between $\mathrm{MSEM}(\hat\alpha_{RRE})$ and $\mathrm{MSEM}(\hat\alpha_{MTPE})$ is as follows:
$\mathrm{MSEM}(\hat\alpha_{RRE}) - \mathrm{MSEM}(\hat\alpha_{MTPE}) = D_1 + b_1b_1' - b_2b_2'$, (28)
where $D_1 = \sigma^2(\Lambda + kI_p)^{-1}\left[\Lambda - (\Lambda + kdI_p)\Lambda^{-1}(\Lambda + kdI_p)\right](\Lambda + kI_p)^{-1}$, $b_1 = -k(\Lambda + kI_p)^{-1}\alpha$, and $b_2 = k(1 - d)(\Lambda + kI_p)^{-1}\delta$.

Let k > 0 and 0 < d < 1. Thus, the following theorem holds.

Theorem 4. Consider the two biased competing homogeneous linear estimators $\hat\alpha_{RRE}$ and $\hat\alpha_{MTPE}$. If $k > 0$, $0 < d < 1$, and $D_1 + b_1b_1' > 0$, the estimator $\hat\alpha_{MTPE}$ is superior to the estimator $\hat\alpha_{RRE}$ using the MSEM criterion, that is, $\mathrm{MSEM}(\hat\alpha_{RRE}) - \mathrm{MSEM}(\hat\alpha_{MTPE}) \ge 0$, if and only if
$b_2'(D_1 + b_1b_1')^{-1}b_2 \le 1$. (29)
The proof follows from Lemma 1 applied to $M = D_1 + b_1b_1'$ and $c = b_2$.

3.3. Comparison between the MTPE and LE Using MSEM Criterion

From representation (15), the bias vector and covariance matrix of the LE are provided as follows:
$\mathrm{Bias}(\hat\alpha_{LE}) = -(1 - d)(\Lambda + I_p)^{-1}\alpha$,
$\mathrm{Cov}(\hat\alpha_{LE}) = \sigma^2(\Lambda + I_p)^{-1}(\Lambda + dI_p)\Lambda^{-1}(\Lambda + dI_p)(\Lambda + I_p)^{-1}$. (30)

Hence,
$\mathrm{MSEM}(\hat\alpha_{LE}) = \sigma^2(\Lambda + I_p)^{-1}(\Lambda + dI_p)\Lambda^{-1}(\Lambda + dI_p)(\Lambda + I_p)^{-1} + (1 - d)^2(\Lambda + I_p)^{-1}\alpha\alpha'(\Lambda + I_p)^{-1}$. (31)
Considering the difference between (22) and (31),
$\mathrm{MSEM}(\hat\alpha_{LE}) - \mathrm{MSEM}(\hat\alpha_{MTPE}) = D_2 + b_3b_3' - b_2b_2'$, (32)
where $D_2 = \sigma^2\left[(\Lambda + I_p)^{-1}(\Lambda + dI_p)\Lambda^{-1}(\Lambda + dI_p)(\Lambda + I_p)^{-1} - (\Lambda + kI_p)^{-1}(\Lambda + kdI_p)\Lambda^{-1}(\Lambda + kdI_p)(\Lambda + kI_p)^{-1}\right]$, $b_3 = -(1 - d)(\Lambda + I_p)^{-1}\alpha$, and $b_2 = k(1 - d)(\Lambda + kI_p)^{-1}\delta$.

Theorem 5. Consider the two biased competing homogeneous linear estimators $\hat\alpha_{LE}$ and $\hat\alpha_{MTPE}$. If $k > 1$ and $0 < d < 1$, the estimator $\hat\alpha_{MTPE}$ is superior to the estimator $\hat\alpha_{LE}$ using the MSEM criterion, that is, $\mathrm{MSEM}(\hat\alpha_{LE}) - \mathrm{MSEM}(\hat\alpha_{MTPE}) \ge 0$, if and only if
$b_2'(D_2 + b_3b_3')^{-1}b_2 \le 1$. (33)

Proof. Using (21) and (30), $D_2$ is diagonal with $i$th entry
$\sigma^2\left[\dfrac{(\lambda_i + d)^2}{\lambda_i(\lambda_i + 1)^2} - \dfrac{(\lambda_i + kd)^2}{\lambda_i(\lambda_i + k)^2}\right]$.
By computation, since $(\lambda_i + kd)/(\lambda_i + k)$ is decreasing in $k$ for $0 < d < 1$, each entry is positive for $k > 1$. Therefore, $D_2$ is positive definite. By Lemma 2, the proof is completed.

3.4. Comparison between the MTPE and MRRE Using MSEM Criterion

From representation (16), the bias vector and covariance matrix of the MRRE are provided as follows:
$\mathrm{Bias}(\hat\alpha_{MRRE}) = k(\Lambda + kI_p)^{-1}(b - \alpha) = k(\Lambda + kI_p)^{-1}\delta$,
$\mathrm{Cov}(\hat\alpha_{MRRE}) = \sigma^2(\Lambda + kI_p)^{-1}\Lambda(\Lambda + kI_p)^{-1}$. (34)

Hence,
$\mathrm{MSEM}(\hat\alpha_{MRRE}) = \sigma^2(\Lambda + kI_p)^{-1}\Lambda(\Lambda + kI_p)^{-1} + k^2(\Lambda + kI_p)^{-1}\delta\delta'(\Lambda + kI_p)^{-1}$. (35)
Considering the difference between (22) and (35),
$\mathrm{MSEM}(\hat\alpha_{MRRE}) - \mathrm{MSEM}(\hat\alpha_{MTPE}) = D_1 + b_4b_4' - b_2b_2'$, (36)
where $D_1$ is as defined in (28) and $b_4 = k(\Lambda + kI_p)^{-1}\delta$.

Theorem 6. Consider the two biased competing homogeneous linear estimators $\hat\alpha_{MRRE}$ and $\hat\alpha_{MTPE}$. If $k > 0$, $0 < d < 1$, and $D_1 + b_4b_4' > 0$, the estimator $\hat\alpha_{MTPE}$ is superior to the estimator $\hat\alpha_{MRRE}$ using the MSEM criterion, that is, $\mathrm{MSEM}(\hat\alpha_{MRRE}) - \mathrm{MSEM}(\hat\alpha_{MTPE}) \ge 0$, if and only if
$b_2'(D_1 + b_4b_4')^{-1}b_2 \le 1$. (37)

Proof. Note that $b_2 = (1 - d)b_4$, so for $0 < d < 1$ the bias of the MTPE is a contraction of the bias of the MRRE. The result follows from Lemma 1 applied to $M = D_1 + b_4b_4'$ and $c = b_2$, which completes the proof.

3.5. Comparison between the MTPE and MLE Using MSEM Criterion

From representation (17), the bias vector and covariance matrix of the MLE are provided as follows:
$\mathrm{Bias}(\hat\alpha_{MLE}) = (1 - d)(\Lambda + I_p)^{-1}\delta$,
$\mathrm{Cov}(\hat\alpha_{MLE}) = \sigma^2(\Lambda + I_p)^{-1}(\Lambda + dI_p)\Lambda^{-1}(\Lambda + dI_p)(\Lambda + I_p)^{-1}$. (38)

Hence,
$\mathrm{MSEM}(\hat\alpha_{MLE}) = \sigma^2(\Lambda + I_p)^{-1}(\Lambda + dI_p)\Lambda^{-1}(\Lambda + dI_p)(\Lambda + I_p)^{-1} + (1 - d)^2(\Lambda + I_p)^{-1}\delta\delta'(\Lambda + I_p)^{-1}$. (39)
The mean squared error matrix difference between (22) and (39) is given as
$\mathrm{MSEM}(\hat\alpha_{MLE}) - \mathrm{MSEM}(\hat\alpha_{MTPE}) = D_2 + b_5b_5' - b_2b_2'$, (40)
where $D_2$ is as defined in (32) and $b_5 = (1 - d)(\Lambda + I_p)^{-1}\delta$.

Theorem 7. Consider the two biased competing homogeneous linear estimators $\hat\alpha_{MLE}$ and $\hat\alpha_{MTPE}$. If $k > 1$ and $0 < d < 1$, the estimator $\hat\alpha_{MTPE}$ is superior to the estimator $\hat\alpha_{MLE}$ using the MSEM criterion, that is, $\mathrm{MSEM}(\hat\alpha_{MLE}) - \mathrm{MSEM}(\hat\alpha_{MTPE}) \ge 0$, if and only if
$b_2'(D_2 + b_5b_5')^{-1}b_2 \le 1$. (41)

Proof. Using (21) and (38), the covariance difference is again $D_2$, which by the computation in the proof of Theorem 5 is positive definite for $k > 1$ and $0 < d < 1$. By Lemma 2, the proof is completed.

3.6. Comparison between the MTPE and TPE Using MSEM Criterion

From representation (18), the bias vector and covariance matrix of the TPE are provided as follows:
$\mathrm{Bias}(\hat\alpha_{TPE}) = -k(1 - d)(\Lambda + kI_p)^{-1}\alpha$,
$\mathrm{Cov}(\hat\alpha_{TPE}) = \sigma^2(\Lambda + kI_p)^{-1}(\Lambda + kdI_p)\Lambda^{-1}(\Lambda + kdI_p)(\Lambda + kI_p)^{-1}$. (42)

Hence,
$\mathrm{MSEM}(\hat\alpha_{TPE}) = \sigma^2(\Lambda + kI_p)^{-1}(\Lambda + kdI_p)\Lambda^{-1}(\Lambda + kdI_p)(\Lambda + kI_p)^{-1} + k^2(1 - d)^2(\Lambda + kI_p)^{-1}\alpha\alpha'(\Lambda + kI_p)^{-1}$. (43)

Considering the matrix difference between (22) and (43), the covariance matrices of the TPE and the MTPE coincide, so
$\mathrm{MSEM}(\hat\alpha_{TPE}) - \mathrm{MSEM}(\hat\alpha_{MTPE}) = k^2(1 - d)^2(\Lambda + kI_p)^{-1}(\alpha\alpha' - \delta\delta')(\Lambda + kI_p)^{-1}$. (44)

Obviously, $\mathrm{MSEM}(\hat\alpha_{TPE}) - \mathrm{MSEM}(\hat\alpha_{MTPE}) \ge 0$ if and only if $\alpha\alpha' - \delta\delta' \ge 0$; thus, the following result holds.

Theorem 8. The modified two-parameter estimator $\hat\alpha_{MTPE}$ is superior to the two-parameter estimator $\hat\alpha_{TPE}$ in the MSEM sense if and only if $\delta\delta' \le \alpha\alpha'$, that is, if and only if $\delta = b - \alpha = c\alpha$ for some scalar $c$ with $|c| \le 1$; equivalently, the prior information $b = (1 + c)\alpha$ lies on the segment between $0$ and $2\alpha$.

4. Selection of Biasing Parameters

Selecting appropriate biasing parameters is crucial in this study. The performance of the ridge estimator largely depends on the ridge parameter $k$, and several methods for estimating it have been proposed; these include Hoerl and Kennard [1], Kibria [11], Muniz and Kibria [12], Aslam [13], Dorugade [14], Kibria and Banik [15], Lukman and Ayinde [16], Lukman et al. [17], and others. For the practical application of the new estimator, the optimum values of $k$ and $d$ are obtained below. To obtain an optimum value of $k$, we first assume the value of $d$ is fixed.

Recall the MSEM of the MTPE from equation (22). In scalar form,
$\mathrm{MSE}(\hat\alpha_{MTPE}) = \sigma^2\sum_{i=1}^{p}\dfrac{(\lambda_i + kd)^2}{\lambda_i(\lambda_i + k)^2} + k^2(1 - d)^2\sum_{i=1}^{p}\dfrac{(b_i - \alpha_i)^2}{(\lambda_i + k)^2}$. (45)

Differentiating equation (45) with respect to $k$ gives the following result:
$\dfrac{\partial\,\mathrm{MSE}}{\partial k} = 2(1 - d)\sum_{i=1}^{p}\dfrac{k\lambda_i(1 - d)(b_i - \alpha_i)^2 - \sigma^2(\lambda_i + kd)}{(\lambda_i + k)^3}$. (46)

Setting each summand of (46) to zero, the value of $k$ for the $i$th coordinate is as follows:
$k_i = \dfrac{\sigma^2\lambda_i}{\lambda_i(1 - d)(b_i - \alpha_i)^2 - \sigma^2 d}$, (47)
where $\sigma^2$ and $\alpha_i$ are replaced by their unbiased estimators $\hat\sigma^2 = (y - X\hat\beta)'(y - X\hat\beta)/(n - p)$ and $\hat\alpha_i$. The harmonic-mean version is defined as
$\hat k_{HM} = \dfrac{p}{\sum_{i=1}^{p}(1/\hat k_i)} = \dfrac{p\hat\sigma^2}{\sum_{i=1}^{p}\left[(1 - d)(b_i - \hat\alpha_i)^2 - \hat\sigma^2 d/\lambda_i\right]}$. (48)

Recall that considering the special case $d = 0$, $b = 0$ (the ridge estimator) implies that $k_i$ in equation (47) becomes
$\hat k_i = \dfrac{\hat\sigma^2}{\hat\alpha_i^2}$, (49)
which is the estimated value of $k$ introduced by Hoerl and Kennard [1]. Hoerl et al. [18] defined the harmonic-mean version of the ridge parameter $k$ as follows:
$\hat k_{HKB} = \dfrac{p\hat\sigma^2}{\hat\alpha'\hat\alpha}$. (50)

The optimum value of $d$ is obtained by differentiating equation (45) with respect to $d$ with $k$ fixed. The result is as follows:
$\dfrac{\partial\,\mathrm{MSE}}{\partial d} = 2k\sum_{i=1}^{p}\dfrac{1}{(\lambda_i + k)^2}\left[\dfrac{\sigma^2(\lambda_i + kd)}{\lambda_i} - k(1 - d)(b_i - \alpha_i)^2\right]$. (51)

Setting each summand of (51) to zero, the value of $d$ for the $i$th coordinate is as follows:
$d_i = \dfrac{\lambda_i\left[k(b_i - \alpha_i)^2 - \sigma^2\right]}{k\left[\sigma^2 + \lambda_i(b_i - \alpha_i)^2\right]}$, (52)
where $\sigma^2$ and $\alpha_i$ are again replaced by their unbiased estimators $\hat\sigma^2$ and $\hat\alpha_i$. Recall that considering the special case $k = 1$, $b = 0$ (the Liu estimator) implies that $d_i$ in equation (52) becomes
$\hat d_i = \dfrac{\lambda_i(\hat\alpha_i^2 - \hat\sigma^2)}{\hat\sigma^2 + \lambda_i\hat\alpha_i^2}$. (53)

Equation (53) is the same as the optimum value of $d$ proposed by Liu [6].

Theorem 9. If
$(b_i - \hat\alpha_i)^2 > \dfrac{\hat\sigma^2 d}{\lambda_i(1 - d)}$ (54)
for all $i$, then the $\hat k_i$ in (47) are always positive.

Proof. The values of $k_i$ in (47) are always positive if $\lambda_i(1 - d)(b_i - \alpha_i)^2 - \sigma^2 d > 0$. Since $k_i$ must be positive for all $i$, it is required that $(b_i - \alpha_i)^2 > \sigma^2 d/\left[\lambda_i(1 - d)\right]$ for all $i$. This inequality depends on the unknown parameters $\sigma^2$ and $\alpha_i$, which are replaced by their unbiased estimators $\hat\sigma^2$ and $\hat\alpha_i$.
The estimators of the parameters $d$ and $k$ in $\hat\alpha_{MTPE}$ can be obtained iteratively as follows (a sketch of this procedure is given after the list):
Step 1: calculate an initial $\hat d$ from (53).
Step 2: estimate $\hat k_{HM}$ from (48) by using the $\hat d$ in Step 1.
Step 3: estimate $\hat d_i$ from (52) by using the estimator in Step 2.
Step 4: if $\hat d$ in Step 3 is negative, use the $\hat d$ from Step 1, since $\hat d_i$ in (52) can take negative values whereas $d$ takes values between 0 and 1.
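A minimal numpy sketch of this procedure, assuming the harmonic-mean $\hat k$ of (48) and the coordinate-wise $\hat d_i$ of (52) and (53); averaging the $d_i$ into a single $d$ is an implementation assumption of the sketch, not a prescription from the paper.

```python
import numpy as np

def select_k_d(X, y, b):
    """Sketch of Steps 1-4 above for the MTPE biasing parameters."""
    n, p = X.shape
    lam, T = np.linalg.eigh(X.T @ X)
    alpha = (T.T @ (X.T @ y)) / lam               # canonical OLS estimates
    resid = y - X @ (T @ alpha)
    sigma2 = resid @ resid / (n - p)              # unbiased sigma^2 estimate
    delta2 = (T.T @ b - alpha) ** 2               # squared (b_i - alpha_i)

    # Step 1: initial d from Liu's estimator (53), clipped into (0, 1).
    d0 = np.clip(np.mean(lam * (alpha**2 - sigma2) / (sigma2 + lam * alpha**2)),
                 1e-6, 1 - 1e-6)
    # Step 2: harmonic-mean k from (48) using the initial d.
    k = p * sigma2 / np.sum((1 - d0) * delta2 - sigma2 * d0 / lam)
    # Step 3: d from (52) using k; average the coordinate-wise values.
    d = np.mean(lam * (k * delta2 - sigma2) / (k * (sigma2 + lam * delta2)))
    # Step 4: fall back to the initial d if the update leaves (0, 1).
    if not 0 < d < 1:
        d = d0
    return k, d
```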

5. Numerical Example and Monte-Carlo Simulation

The Hussain dataset, originally adopted by Eledum and Zahri [19], is used in this study to illustrate the performance of the new estimator. The dataset was also adopted in the study of Lukman et al. [20] and is provided in Table 1. The regression model is defined as follows:
$y = \beta_0 + \beta_1 x_1 + \beta_2 x_2 + \beta_3 x_3 + \varepsilon$, (55)
where $y$ represents the product value in the manufacturing sector, $x_1$ the value of imported intermediate commodities, $x_2$ the value of imported capital commodities, and $x_3$ the value of imported raw materials. The variance inflation factors are large, and the condition number of $X'X$ is approximately 5,660,049; both diagnostics indicate the presence of severe multicollinearity. (A sketch of how these diagnostics are computed follows below.)
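For reference, a minimal numpy sketch of the two diagnostics quoted above; whether the paper scaled the columns before computing the condition number is not stated, so the raw $X'X$ used here is an assumption.

```python
import numpy as np

def collinearity_diagnostics(X):
    """VIFs as the diagonal of the inverse correlation matrix, and the
    condition number of X'X as the ratio of its extreme eigenvalues."""
    R = np.corrcoef(X, rowvar=False)     # correlation matrix of the regressors
    vif = np.diag(np.linalg.inv(R))      # VIF_j = j-th diagonal of R^{-1}
    eig = np.linalg.eigvalsh(X.T @ X)
    return vif, eig.max() / eig.min()    # large values flag multicollinearity
```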

Table 1

The prior information $b = 0.95$, as used in the study of Li and Yang [7], is adopted. The estimated regression coefficients and mean squared error values of the OLSE, RRE, LE, MRRE, MLE, TPE, and MTPE are provided in Table 2. The values of $k$ and $d$ were computed using the estimators proposed in this study: $\hat k$ from equation (48) and $\hat d$ from equation (52) are obtained to be 1036.427 and 0.0043, respectively. Table 2 shows that the OLSE has the worst performance among all the estimators and that the modified estimators (MLE, MRRE, and MTPE) outperform their unmodified counterparts. Moreover, the proposed MTPE outperforms all the other estimators.

Table 2: Estimated regression coefficients and mean square error of estimators.

Also, we conducted a Monte-Carlo simulation study to examine the performance of the estimators further. The simulation procedure used by Lukman and Ayinde [16] was adopted to generate the explanatory variables:
$x_{ij} = (1 - \rho^2)^{1/2}z_{ij} + \rho z_{i,p+1}, \quad i = 1, 2, \ldots, n, \quad j = 1, 2, \ldots, p$, (56)
where the $z_{ij}$ are independent standard normal pseudo-random numbers, $\rho$ is the correlation between any two explanatory variables, and $p$ is the number of explanatory variables. The values of $\rho$ were taken as 0.85, 0.9, and 0.99, respectively. In this study, the number of explanatory variables ($p$) was taken to be four.

The dependent variable is generated as follows:
$y_i = \beta_1 x_{i1} + \beta_2 x_{i2} + \cdots + \beta_p x_{ip} + e_i, \quad i = 1, 2, \ldots, n$, (57)
where $e_i \sim N(0, \sigma^2)$. The parameter values were chosen such that $\beta'\beta = 1$, which is a common restriction in simulation studies of this type [16]. The values of $\beta$ are taken to be $\beta_1 = 0.8$, $\beta_2 = 0.1$, and $\beta_3 = 0.6$. Sample sizes of 50 and 100 were used, together with three different values of $\sigma$ (0.01, 0.1, and 1). The experiment is replicated 5,000 times. The estimated MSE is calculated as
$\mathrm{MSE}(\hat\beta) = \dfrac{1}{5000}\sum_{j=1}^{5000}\sum_{i=1}^{p}(\hat\beta_{ij} - \beta_i)^2$, (58)
where $\hat\beta_{ij}$ denotes the estimate of the $i$th parameter in the $j$th replication and $\beta_i$ is the true parameter value. The estimated MSEs of the estimators for different values of $n$, $p$, $\sigma$, and $\rho$ are shown in Tables 3–6. The results from the simulation study show that the estimated MSE increases as the error variance increases and as the degree of multicollinearity ($\rho$) increases. Also, the RRE, MRRE, LE, MLE, TPE, and MTPE all have smaller estimated MSE than the OLS estimator. The proposed MTPE outperforms the other estimators, depending on the choice of prior information. The results of the simulation study support the real-life analysis in this paper. (A condensed sketch of this design follows below.)
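A condensed numpy sketch of one cell of this design, under stated assumptions: only the three reported coefficients are used (so $p = 3$ here, whereas the study uses $p = 4$ with a fourth coefficient that is not reported), the prior is $b = 0.95$ from the numerical example, the biasing parameters are illustrative, and the replication count is reduced.

```python
import numpy as np

# One cell of the design: regressors from (56), responses from (57),
# estimated MSE from (58), comparing OLSE with the MTPE of (11).
rng = np.random.default_rng(2019)
n, rho, sigma, reps = 50, 0.99, 0.1, 1000
beta = np.array([0.8, 0.1, 0.6])            # reported values; beta'beta ~ 1
p = beta.size
b = np.full(p, 0.95)                        # prior information (assumed)
k, d = 1.5, 0.5                             # illustrative biasing parameters

z = rng.normal(size=(n, p + 1))
X = np.sqrt(1 - rho**2) * z[:, :p] + rho * z[:, [p]]   # equation (56)

sse_ols = sse_mtpe = 0.0
for _ in range(reps):
    y = X @ beta + rng.normal(scale=sigma, size=n)     # equation (57)
    bh = np.linalg.solve(X.T @ X, X.T @ y)
    mt = np.linalg.solve(X.T @ X + k * np.eye(p),
                         X.T @ y + k * (d * bh + (1 - d) * b))
    sse_ols += np.sum((bh - beta) ** 2)
    sse_mtpe += np.sum((mt - beta) ** 2)

print("MSE(OLSE):", sse_ols / reps, " MSE(MTPE):", sse_mtpe / reps)
```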

Table 3: Estimated MSE values of the OLSE, RRE, MRRE, LE, and MLE when n = 50.
Table 4: Estimated MSE values of the OLSE, TPE, and MTPE when n = 50.
Table 5: Estimated MSE values of the OLSE, RRE, MRRE, LE, and MLE when n = 100.
Table 6: Estimated MSE values of the OLSE, TPE, and MTPE when n = 100.

6. Conclusions

In this article, we proposed a modified two-parameter estimator (MTPE) to overcome the multicollinearity problem in a linear regression model. We established the superiority of the new estimator over existing estimators in terms of the mean squared error matrix criterion. The new estimator includes the ordinary least squares estimator (OLSE), the ridge regression estimator (RRE), the Liu estimator (LE), the modified ridge regression estimator (MRRE), and the modified Liu estimator (MLE) as special cases. Finally, a numerical example and a simulation study were conducted to illustrate the theoretical results. The results show that the performance of the proposed MTPE is superior to that of the other estimators considered.

Data Availability

The data used to support the findings of this study are included in Table 1.

Disclosure

This manuscript was accepted for a poster session, as listed at the following link: http://www.isi2019.org/wp-content/uploads/2019/03/CPS-list-by-CPS-POSTER-by-CPS_Title-no-1-March-2019.pdf.

Conflicts of Interest

There are no conflicts of interest regarding the publication of this paper.

References

1. A. E. Hoerl and R. W. Kennard, “Ridge regression: biased estimation for nonorthogonal problems,” Technometrics, vol. 12, no. 1, pp. 55–67, 1970.
2. C. Stein, “Inadmissibility of the usual estimator for the mean of a multivariate normal distribution,” in Proceedings of the Third Berkeley Symposium on Mathematical Statistics and Probability, J. Neyman, Ed., vol. 1, pp. 197–206, Berkeley, CA, USA, 1956.
3. W. F. Massy, “Principal components regression in exploratory statistical research,” Journal of the American Statistical Association, vol. 60, no. 309, pp. 234–266, 1965.
4. L. S. Mayer and T. A. Willke, “On biased estimation in linear models,” Technometrics, vol. 15, no. 3, pp. 497–508, 1973.
5. B. F. Swindel, “Good ridge estimators based on prior information,” Communications in Statistics—Theory and Methods, vol. 5, no. 11, pp. 1065–1075, 1976.
6. K. Liu, “A new class of biased estimate in linear regression,” Communications in Statistics—Theory and Methods, vol. 22, pp. 393–402, 1993.
7. Y. Li and H. Yang, “A new Liu-type estimator in linear regression model,” Statistical Papers, vol. 53, no. 2, pp. 427–437, 2012.
8. M. R. Özkale and S. Kaçıranlar, “The restricted and unrestricted two-parameter estimators,” Communications in Statistics—Theory and Methods, vol. 36, no. 15, pp. 2707–2725, 2007.
9. R. W. Farebrother, “Further results on the mean square error of ridge regression,” Journal of the Royal Statistical Society: Series B (Methodological), vol. 38, no. 3, pp. 248–250, 1976.
10. G. Trenkler and H. Toutenburg, “Mean squared error matrix comparisons between biased estimators—an overview of recent results,” Statistical Papers, vol. 31, no. 1, pp. 165–179, 1990.
11. B. M. G. Kibria, “Performance of some new ridge regression estimators,” Communications in Statistics—Simulation and Computation, vol. 32, no. 2, pp. 419–435, 2003.
12. G. Muniz and B. M. G. Kibria, “On some ridge regression estimators: an empirical comparison,” Communications in Statistics—Simulation and Computation, vol. 38, no. 3, pp. 621–630, 2009.
13. M. Aslam, “Performance of Kibria’s method for the heteroscedastic ridge regression model: some Monte Carlo evidence,” Communications in Statistics—Simulation and Computation, vol. 43, no. 4, pp. 673–686, 2014.
14. A. V. Dorugade, “New ridge parameters for ridge regression,” Journal of the Association of Arab Universities for Basic and Applied Sciences, vol. 15, no. 1, pp. 94–99, 2014.
15. B. M. G. Kibria and S. Banik, “Some ridge regression estimators and their performances,” Journal of Modern Applied Statistical Methods, vol. 15, no. 1, pp. 206–238, 2016.
16. A. F. Lukman and K. Ayinde, “Review and classifications of the ridge parameter estimation techniques,” Hacettepe Journal of Mathematics and Statistics, vol. 46, no. 113, p. 1, 2016.
17. A. F. Lukman, K. Ayinde, and A. S. Ajiboye, “Monte Carlo study of some classification-based ridge parameter estimators,” Journal of Modern Applied Statistical Methods, vol. 16, no. 1, pp. 428–451, 2017.
18. A. E. Hoerl, R. W. Kannard, and K. F. Baldwin, “Ridge regression: some simulations,” Communications in Statistics, vol. 4, no. 2, pp. 105–123, 1975.
19. H. Eledum and M. Zahri, “Relaxation method for two stages ridge regression estimator,” International Journal of Pure and Applied Mathematics, vol. 85, no. 4, pp. 653–667, 2013.
20. A. F. Lukman, O. I. Osowole, and K. Ayinde, “Two stage robust ridge method in a linear regression model,” Journal of Modern Applied Statistical Methods, vol. 14, no. 2, pp. 53–67, 2015.