Journal of Probability and Statistics
Volume 2018, Article ID 1452181, 8 pages
https://doi.org/10.1155/2018/1452181
Research Article

Stochastic Restricted Biased Estimators in Misspecified Regression Model with Incomplete Prior Information

Manickavasagar Kayanan and Pushpakanthie Wijekoon

1Department of Physical Science, Vavuniya Campus, University of Jaffna, Vavuniya, Sri Lanka
2Postgraduate Institute of Science, University of Peradeniya, Peradeniya, Sri Lanka
3Department of Statistics and Computer Science, University of Peradeniya, Peradeniya, Sri Lanka

Correspondence should be addressed to Manickavasagar Kayanan; mgayanan@vau.jfn.ac.lk

Received 17 December 2017; Accepted 1 March 2018; Published 10 April 2018

Academic Editor: Aera Thavaneswaran

Copyright © 2018 Manickavasagar Kayanan and Pushpakanthie Wijekoon. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

The analysis of misspecification was extended to the recently introduced stochastic restricted biased estimators when multicollinearity exists among the explanatory variables. The Stochastic Restricted Ridge Estimator (SRRE), Stochastic Restricted Almost Unbiased Ridge Estimator (SRAURE), Stochastic Restricted Liu Estimator (SRLE), Stochastic Restricted Almost Unbiased Liu Estimator (SRAULE), Stochastic Restricted Principal Component Regression Estimator (SRPCRE), Stochastic Restricted r-k (SRrk) class estimator, and Stochastic Restricted r-d (SRrd) class estimator were examined in the misspecified regression model due to missing relevant explanatory variables when incomplete prior information on the regression coefficients is available. Further, the superiority conditions between estimators and their respective predictors were obtained in the mean square error matrix (MSEM) sense. Finally, a numerical example and a Monte Carlo simulation study were used to illustrate the theoretical findings.

1. Introduction

Misspecification due to leaving out relevant explanatory variables occurs very often in linear regression modeling and causes these variables to become part of the error term. Consequently, the expected value of the error term of the model will not be zero. Also, the omitted variables may be correlated with the variables in the model. Therefore, one or more assumptions of the linear regression model will be violated when the model is misspecified, and hence the estimators become biased and inconsistent. Further, it is well known that the ordinary least squares estimator (OLSE) may not be very reliable if multicollinearity exists in the linear regression model. As a remedial measure for the multicollinearity problem, biased estimators based on the sample model combined with prior information, which can take the form of exact or stochastic restrictions, have received much attention in the statistical literature. The intention of this work is to examine the performance of the recently introduced stochastic restricted biased estimators in the misspecified regression model with incomplete prior knowledge about the regression coefficients when multicollinearity exists among the explanatory variables.

Considering biased estimation in the misspecified regression model without any restrictions on the regression parameters, Sarkar [1] discussed the consequences of excluding some important explanatory variables from a linear regression model when multicollinearity exists. Şiray [2] and Wu [3] examined the efficiency of the r-d class estimator and the r-k class estimator over some existing estimators, respectively, in the misspecified regression model. Chandra and Tyagi [4] studied the effect of misspecification due to the omission of relevant variables on the dominance of the class estimator. Recently, Kayanan and Wijekoon [5] examined the performance of existing biased estimators and the respective predictors based on the sample information in a misspecified linear regression model without considering any prior information about the regression coefficients.

It is recognized that the mixed regression estimator (MRE) introduced by Theil and Goldberger [6] outperforms ordinary least squares estimator (OLSE) when the regression model is correctly specified. The biased estimation with stochastic linear restrictions in the misspecified regression model due to inclusion of an irrelevant variable with the incorrectly specified prior information was discussed by Teräsvirta [7]. Later Mittelhammer [8], Ohtani and Honda [9], Kadiyala [10], and Trenkler and Wijekoon [11] discussed the efficiency of MRE under misspecified regression model due to exclusion of a relevant variable with correctly specified prior information. Further, the superiority of MRE over the OLSE under the misspecified regression model with incorrectly specified sample and prior information was discussed by Wijekoon and Trenkler [12]. Hubert and Wijekoon [13] have considered the improvement of Liu estimator (LE) under a misspecified regression model with stochastic restrictions and introduced the Stochastic Restricted Liu Estimator (SRLE).

In this paper, the performance of the recently introduced stochastic restricted estimators, namely, the Stochastic Restricted Ridge Estimator (SRRE) proposed by Li and Yang [14], the Stochastic Restricted Almost Unbiased Ridge Estimator (SRAURE) and Stochastic Restricted Almost Unbiased Liu Estimator (SRAULE) proposed by Wu and Yang [15], the Stochastic Restricted Principal Component Regression Estimator (SRPCRE) proposed by He and Wu [16], and the Stochastic Restricted r-k (SRrk) class estimator and Stochastic Restricted r-d (SRrd) class estimator proposed by Wu [17], was examined in the misspecified regression model when multicollinearity exists among the explanatory variables. Further, a generalized form to represent these estimators is also proposed.

The rest of this article is organized as follows. The model specification and the estimators are presented in Section 2. In Section 3, the mean square error matrix (MSEM) comparison between two estimators and their respective predictors is considered. In Section 4, a numerical example and a Monte Carlo simulation study are given to illustrate the theoretical results in the Scalar Mean Square Error (SMSE) criterion. Finally, some concluding remarks are given in Section 5. The references and appendices are given at the end of the paper.

2. Model Specification and the Estimators

Assume that the true regression model is given by
$$y = X_1\beta_1 + X_2\beta_2 + \varepsilon, \quad (1)$$
where $y$ is the $n \times 1$ vector of observations on the dependent variable, $X_1$ and $X_2$ are the $n \times p$ and $n \times q$ matrices of observations on the regressors, $\beta_1$ and $\beta_2$ are the $p \times 1$ and $q \times 1$ vectors of unknown coefficients, and $\varepsilon$ is the $n \times 1$ vector of disturbances such that $E(\varepsilon) = 0$ and $D(\varepsilon) = \sigma^2 I$.

Let us say that the researcher misspecifies the regression model by excluding the $q$ regressors in $X_2$ as
$$y = X_1\beta_1 + u. \quad (2)$$
Let us also assume that there exists prior information on $\beta_1$ in the form of
$$r = R\beta_1 + \varphi + v, \quad (3)$$
where $r$ is a $j \times 1$ vector, $R$ is the given $j \times p$ matrix with rank $j$, $\varphi$ is the unknown fixed vector, and $v$ is the $j \times 1$ vector of disturbances such that $E(v) = 0$ and $D(v) = \Omega$, where $\Omega$ is positive definite, and $E(v\varepsilon') = 0$.

By combining sample model (2) and prior information (3), Theil and Goldberger [6] proposed the mixed regression estimator (MRE) as
$$\hat{\beta}_{\mathrm{MRE}} = \left(X_1'X_1 + \sigma^2 R'\Omega^{-1}R\right)^{-1}\left(X_1'y + \sigma^2 R'\Omega^{-1}r\right). \quad (4)$$
To combat multicollinearity, several researchers have introduced different types of stochastic restricted estimators in place of the MRE. Seven such estimators are the SRRE, SRAURE, SRLE, SRAULE, SRPCRE, SRrk class estimator, and SRrd class estimator, defined below, respectively:
$$\hat{\beta}_{\mathrm{SRRE}} = W_k\hat{\beta}_{\mathrm{MRE}}, \quad \hat{\beta}_{\mathrm{SRAURE}} = F_k\hat{\beta}_{\mathrm{MRE}}, \quad \hat{\beta}_{\mathrm{SRLE}} = W_d\hat{\beta}_{\mathrm{MRE}}, \quad \hat{\beta}_{\mathrm{SRAULE}} = F_d\hat{\beta}_{\mathrm{MRE}},$$
$$\hat{\beta}_{\mathrm{SRPCRE}} = T_hT_h'\hat{\beta}_{\mathrm{MRE}}, \quad \hat{\beta}_{\mathrm{SRrk}} = T_hT_h'W_k\hat{\beta}_{\mathrm{MRE}}, \quad \hat{\beta}_{\mathrm{SRrd}} = T_hT_h'W_d\hat{\beta}_{\mathrm{MRE}}, \quad (5)$$
where $W_k = (X_1'X_1 + kI)^{-1}X_1'X_1$ with $k > 0$, $F_k = I - k^2(X_1'X_1 + kI)^{-2}$, $W_d = (X_1'X_1 + I)^{-1}(X_1'X_1 + dI)$ with $0 < d < 1$, $F_d = I - (1-d)^2(X_1'X_1 + I)^{-2}$, and $T_h = (t_1, t_2, \ldots, t_h)$ denotes the first $h$ columns of $T$, which is an orthogonal matrix of the standardized eigenvectors of $X_1'X_1$.
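As a concrete illustration of the mixed estimator in (4), the sketch below computes the OLS and MRE coefficient vectors on simulated data. The data, the restriction matrix $R$, and the variances are hypothetical, and the formula used is the standard Theil–Goldberger form with $\sigma^2$ and $\Omega$ assumed known:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sample model y = X1 b1 + e with a stochastic restriction r = R b1 + v
n, p = 50, 3
X1 = rng.standard_normal((n, p))
beta1 = np.array([1.0, 2.0, -1.0])
y = X1 @ beta1 + rng.normal(scale=0.5, size=n)

R = np.array([[1.0, 0.0, 0.0]])     # prior information on the first coefficient only
Omega = np.array([[0.1]])           # D(v), positive definite
r = R @ beta1 + rng.normal(scale=np.sqrt(0.1), size=1)
sigma2 = 0.25                       # assumed-known error variance

# Mixed regression estimator (Theil & Goldberger):
# beta_MRE = (X1'X1 + s^2 R' Omega^-1 R)^-1 (X1'y + s^2 R' Omega^-1 r)
RtOinv = R.T @ np.linalg.inv(Omega)
beta_mre = np.linalg.solve(X1.T @ X1 + sigma2 * RtOinv @ R,
                           X1.T @ y + sigma2 * (RtOinv @ r))
beta_ols = np.linalg.solve(X1.T @ X1, X1.T @ y)
print(beta_ols, beta_mre)
```

When the restriction is informative (small $\Omega$), the MRE pulls the restricted coefficient toward the prior value $r$, which is where its efficiency gain over OLSE comes from.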

According to Kadiyala [10], we now apply the simultaneous decomposition to the two symmetric matrices $X_1'X_1$ and $\sigma^2 R'\Omega^{-1}R$ as
$$X_1'X_1 = G'G, \qquad \sigma^2 R'\Omega^{-1}R = G'\Lambda G, \quad (6)$$
where $X_1'X_1$ is a positive definite matrix and $\sigma^2 R'\Omega^{-1}R$ is a positive semidefinite matrix, $G$ is a nonsingular matrix, and $\Lambda$ is a diagonal matrix with eigenvalues $\lambda_i > 0$ for $i = 1, 2, \ldots, j$ and $\lambda_i = 0$ for $i = j + 1, \ldots, p$.

Let $\alpha = G\beta_1$, $Z_1 = X_1G^{-1}$, $Z_2 = X_2$, $H = RG^{-1}$, and $\gamma = \beta_2$; then the models (1), (2), and (3) can be written as
$$y = Z_1\alpha + Z_2\gamma + \varepsilon, \qquad y = Z_1\alpha + u, \qquad r = H\alpha + \varphi + v. \quad (7)$$
According to Wijekoon and Trenkler [12], the corresponding MRE is given by
$$\hat{\alpha}_{\mathrm{MRE}} = (I + \Lambda)^{-1}\left(Z_1'y + \sigma^2 H'\Omega^{-1}r\right). \quad (8)$$
Hence, the respective expectation vector, bias vector, and dispersion matrix are given by
$$E(\hat{\alpha}_{\mathrm{MRE}}) = \alpha + (I + \Lambda)^{-1}\delta, \qquad \mathrm{Bias}(\hat{\alpha}_{\mathrm{MRE}}) = (I + \Lambda)^{-1}\delta, \qquad D(\hat{\alpha}_{\mathrm{MRE}}) = \sigma^2(I + \Lambda)^{-1}, \quad (9)$$
where $\delta = Z_1'Z_2\gamma + \sigma^2 H'\Omega^{-1}\varphi$. In the case of misspecification, the SRRE, SRAURE, SRLE, SRAULE, SRPCRE, SRrk, and SRrd for model (7) can now be written as
$$\hat{\alpha}_{\mathrm{SRRE}} = W_k\hat{\alpha}_{\mathrm{MRE}}, \quad \hat{\alpha}_{\mathrm{SRAURE}} = F_k\hat{\alpha}_{\mathrm{MRE}}, \quad \hat{\alpha}_{\mathrm{SRLE}} = W_d\hat{\alpha}_{\mathrm{MRE}}, \quad \hat{\alpha}_{\mathrm{SRAULE}} = F_d\hat{\alpha}_{\mathrm{MRE}},$$
$$\hat{\alpha}_{\mathrm{SRPCRE}} = T_hT_h'\hat{\alpha}_{\mathrm{MRE}}, \quad \hat{\alpha}_{\mathrm{SRrk}} = T_hT_h'W_k\hat{\alpha}_{\mathrm{MRE}}, \quad \hat{\alpha}_{\mathrm{SRrd}} = T_hT_h'W_d\hat{\alpha}_{\mathrm{MRE}}, \quad (10)$$
respectively, where $W_k$, $F_k$, $W_d$, $F_d$, and $T_h$ are now formed as in (5) with $Z_1'Z_1$ in place of $X_1'X_1$.

It is clear that $W_k$, $F_k$, $W_d$, and $F_d$ are positive definite and $T_hT_h'$, $T_hT_h'W_k$, and $T_hT_h'W_d$ are nonnegative definite.
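These definiteness claims can be verified numerically. The sketch below uses the standard ridge and Liu shrinkage forms from the cited literature ($W_k = (S + kI)^{-1}S$, $F_k = I - k^2(S + kI)^{-2}$, $W_d = (S + I)^{-1}(S + dI)$, $F_d = I - (1-d)^2(S + I)^{-2}$) with a hypothetical ill-conditioned $S$; since $T_hT_h'$ is a rank-$h$ projection, it and its products are only nonnegative definite:

```python
import numpy as np

def shrinkage_operators(S, k, d, h):
    """Ridge/Liu shrinkage matrices and the first h standardized eigenvectors of S."""
    p = S.shape[0]
    I = np.eye(p)
    Sk_inv = np.linalg.inv(S + k * I)
    S1_inv = np.linalg.inv(S + I)
    Wk = Sk_inv @ S                          # ridge (SRRE)
    Fk = I - k**2 * Sk_inv @ Sk_inv          # almost unbiased ridge (SRAURE)
    Wd = S1_inv @ (S + d * I)                # Liu (SRLE)
    Fd = I - (1 - d)**2 * S1_inv @ S1_inv    # almost unbiased Liu (SRAULE)
    lam, T = np.linalg.eigh(S)
    Th = T[:, np.argsort(lam)[::-1][:h]]     # eigenvectors of the h largest eigenvalues
    return Wk, Fk, Wd, Fd, Th

def min_eig(M):
    """Smallest eigenvalue of the symmetrized matrix."""
    return np.linalg.eigvalsh((M + M.T) / 2).min()

S = np.diag([302.96, 0.728, 0.044])          # ill-conditioned, like the example data
Wk, Fk, Wd, Fd, Th = shrinkage_operators(S, k=0.5, d=0.5, h=2)

for M in (Wk, Fk, Wd, Fd):
    assert min_eig(M) > 0                    # positive definite
assert abs(min_eig(Th @ Th.T)) < 1e-10       # only nonnegative definite (rank h)
```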

Since all these estimators can be written by incorporating $\hat{\alpha}_{\mathrm{MRE}}$, we now write a generalized form to represent the SRRE, SRAURE, SRLE, SRAULE, SRPCRE, SRrk, and SRrd as given below:
$$\hat{\alpha}_A = A\hat{\alpha}_{\mathrm{MRE}}, \quad (11)$$
where $A$ is a positive definite matrix if it stands for $W_k$, $F_k$, $W_d$, or $F_d$, and it is a nonnegative definite matrix if it stands for $T_hT_h'$, $T_hT_h'W_k$, or $T_hT_h'W_d$.

Now the expectation vector, bias vector, dispersion matrix, and mean square error matrix can be written as
$$E(\hat{\alpha}_A) = A\left(\alpha + (I + \Lambda)^{-1}\delta\right), \qquad \mathrm{Bias}(\hat{\alpha}_A) = (A - I)\alpha + A(I + \Lambda)^{-1}\delta, \quad (12)$$
$$D(\hat{\alpha}_A) = \sigma^2 A(I + \Lambda)^{-1}A', \quad (13)$$
$$\mathrm{MSEM}(\hat{\alpha}_A) = D(\hat{\alpha}_A) + \mathrm{Bias}(\hat{\alpha}_A)\,\mathrm{Bias}(\hat{\alpha}_A)', \quad (14)$$
where $\delta = Z_1'Z_2\gamma + \sigma^2 H'\Omega^{-1}\varphi$ and $\mathrm{Bias}(\hat{\alpha}_A) = E(\hat{\alpha}_A) - \alpha$.

Based on (14), the respective bias vector, dispersion matrix, and MSEM of the MRE, SRRE, SRAURE, SRLE, SRAULE, SRPCRE, SRrk, and SRrd can easily be obtained and are given in Table B1 in Appendix B.

By using the approach of Kadiyala [10] and (3) and (4), the generalized prediction function can be defined as follows:
$$\hat{y}_A = Z_1\hat{\alpha}_A, \quad (15)$$
where $y$ is the actual value and $\hat{y}_A$ is the corresponding predictor.

The MSEM of the generalized predictor is given by
$$\mathrm{MSEM}(\hat{y}_A) = E\left[(\hat{y}_A - y)(\hat{y}_A - y)'\right]. \quad (16)$$
Note that the predictors based on the MRE, SRRE, SRAURE, SRLE, SRAULE, SRPCRE, SRrk, and SRrd are denoted by $\hat{y}_{\mathrm{MRE}}$, $\hat{y}_{\mathrm{SRRE}}$, $\hat{y}_{\mathrm{SRAURE}}$, $\hat{y}_{\mathrm{SRLE}}$, $\hat{y}_{\mathrm{SRAULE}}$, $\hat{y}_{\mathrm{SRPCRE}}$, $\hat{y}_{\mathrm{SRrk}}$, and $\hat{y}_{\mathrm{SRrd}}$, respectively.

3. Mean Square Error Matrix (MSEM) Comparisons

If two generalized biased estimators $\hat{\alpha}_{A_1}$ and $\hat{\alpha}_{A_2}$ are given, the estimator $\hat{\alpha}_{A_2}$ is said to be superior to $\hat{\alpha}_{A_1}$ with respect to the MSEM sense if and only if $\mathrm{MSEM}(\hat{\alpha}_{A_1}) - \mathrm{MSEM}(\hat{\alpha}_{A_2}) \geq 0$. Also, if two generalized predictors $\hat{y}_{A_1}$ and $\hat{y}_{A_2}$ are given, the predictor $\hat{y}_{A_2}$ is said to be superior to $\hat{y}_{A_1}$ with respect to the MSEM sense if and only if $\mathrm{MSEM}(\hat{y}_{A_1}) - \mathrm{MSEM}(\hat{y}_{A_2}) \geq 0$.
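In computations, this MSEM ordering can be checked directly: build each MSEM as the dispersion matrix plus the outer product of the bias vector, then test the difference for nonnegative definiteness via its eigenvalues. A minimal sketch with hypothetical bias vectors and dispersion matrices:

```python
import numpy as np

def msem(D, bias):
    """Mean square error matrix: dispersion plus bias outer product."""
    b = np.asarray(bias).reshape(-1, 1)
    return D + b @ b.T

def is_nnd(M, tol=1e-10):
    """Nonnegative definiteness via eigenvalues of the symmetrized matrix."""
    return bool(np.linalg.eigvalsh((M + M.T) / 2).min() >= -tol)

# Estimator 2 is MSEM-superior to estimator 1 iff MSEM1 - MSEM2 is nonnegative definite
D1, b1 = np.eye(2), np.array([0.5, 0.0])
D2, b2 = 0.5 * np.eye(2), np.array([0.1, 0.1])
print(is_nnd(msem(D1, b1) - msem(D2, b2)))   # True for these matrices
```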

Now let $b_1 = \mathrm{Bias}(\hat{\alpha}_{A_1})$, $b_2 = \mathrm{Bias}(\hat{\alpha}_{A_2})$, $\Delta = D(\hat{\alpha}_{A_1}) - D(\hat{\alpha}_{A_2})$, and $D = \Delta + b_1b_1'$.

By applying Lemma A1 (see Appendix A), the following theorem can be stated for the superiority of $\hat{\alpha}_{A_2}$ over $\hat{\alpha}_{A_1}$ with respect to the MSEM criterion.

Theorem 1. If $\Delta$ is positive definite, then $\hat{\alpha}_{A_2}$ is superior to $\hat{\alpha}_{A_1}$ in the MSEM sense when the regression model is misspecified due to excluding relevant variables if and only if $b_2'\left(\Delta + b_1b_1'\right)^{-1}b_2 \leq 1$.

Proof. Let $\Delta$ be a positive definite matrix; then $\Delta + b_1b_1'$ is also positive definite. According to Lemma A1 (see Appendix A), $\mathrm{MSEM}(\hat{\alpha}_{A_1}) - \mathrm{MSEM}(\hat{\alpha}_{A_2}) = \Delta + b_1b_1' - b_2b_2'$ is a nonnegative definite matrix if and only if $b_2'(\Delta + b_1b_1')^{-1}b_2 \leq 1$. This completes the proof.

The following theorem can be stated for the superiority of $\hat{y}_{A_2}$ over $\hat{y}_{A_1}$ with respect to the MSEM criterion.

Theorem 2. If $\Delta^* \geq 0$, $\hat{y}_{A_2}$ is superior to $\hat{y}_{A_1}$ in the MSEM sense when the regression model is misspecified due to excluding relevant variables if and only if $d_2 \in \mathcal{C}(D^*)$ and $d_2'(D^*)^-d_2 \leq 1$, where $d_1$ and $d_2$ are the bias vectors of $\hat{y}_{A_1}$ and $\hat{y}_{A_2}$, $\Delta^* = D(\hat{y}_{A_1}) - D(\hat{y}_{A_2})$, $D^* = \Delta^* + d_1d_1'$, $\mathcal{C}(D^*)$ stands for the column space of $D^*$, and $(D^*)^-$ is an independent choice of $g$-inverse of $D^*$.

Proof. According to (16), we can write $\mathrm{MSEM}(\hat{y}_{A_1}) - \mathrm{MSEM}(\hat{y}_{A_2})$ as
$$\mathrm{MSEM}(\hat{y}_{A_1}) - \mathrm{MSEM}(\hat{y}_{A_2}) = D(\hat{y}_{A_1}) - D(\hat{y}_{A_2}) + d_1d_1' - d_2d_2'.$$
After some straightforward calculation, it can be written as
$$\mathrm{MSEM}(\hat{y}_{A_1}) - \mathrm{MSEM}(\hat{y}_{A_2}) = \Delta^* + d_1d_1' - d_2d_2',$$
where $d_1 = \mathrm{Bias}(\hat{y}_{A_1})$ and $d_2 = \mathrm{Bias}(\hat{y}_{A_2})$.
Due to Lemma A3 (see Appendix A), $\Delta^* + d_1d_1' - d_2d_2'$ is a nonnegative definite matrix if and only if $d_2 \in \mathcal{C}(D^*)$ and $d_2'(D^*)^-d_2 \leq 1$, where $\mathcal{C}(D^*)$ stands for the column space of $D^* = \Delta^* + d_1d_1'$ and $(D^*)^-$ is an independent choice of $g$-inverse of $D^*$. This completes the proof.

Based on Theorems 1 and 2, we can define Corollaries C1–C28, written in Appendix C, for the superiority conditions between two selected estimators and for the respective predictors by substituting the relevant expressions for the bias vectors and dispersion matrices given in Table B1 in Appendix B.

4. Illustration of Theoretical Results

4.1. Numerical Example

To illustrate the theoretical results, we considered the dataset which gives the total National Research and Development Expenditures as a Percent of Gross National Product by Country from 1972 to 1986. The dependent variable of this dataset is the percentage spent by the United States, and the four other independent variables are $x_1$, $x_2$, $x_3$, and $x_4$. The variable $x_1$ represents the percent spent by the former Soviet Union, $x_2$ that spent by France, $x_3$ that spent by West Germany, and $x_4$ that spent by Japan. The data has been analysed by Gruber [18], Akdeniz and Erol [19], and Li and Yang [14], among others. Now we assemble the data as follows: Note that the eigenvalues of $X'X$ are 302.96, 0.728, 0.044, and 0.035, the condition number is 93, and the Variance Inflation Factor (VIF) values are 6.91, 21.58, 29.75, and 1.79. This implies the existence of serious multicollinearity in the dataset.
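These diagnostics are easy to reproduce for any design matrix. The helpers below (a sketch, not the authors' code) compute the condition number as $\sqrt{\lambda_{\max}/\lambda_{\min}}$ of $X'X$ (the convention consistent with the reported value, since $\sqrt{302.96/0.035} \approx 93$) and the VIF of each regressor:

```python
import numpy as np

def condition_number(X):
    """sqrt(lambda_max / lambda_min) of X'X."""
    lam = np.linalg.eigvalsh(X.T @ X)
    return float(np.sqrt(lam.max() / lam.min()))

def vif(X):
    """VIF_j = 1 / (1 - R_j^2), regressing column j on the others (with intercept)."""
    n, p = X.shape
    out = []
    for j in range(p):
        Z = np.column_stack([np.ones(n), np.delete(X, j, axis=1)])
        coef, *_ = np.linalg.lstsq(Z, X[:, j], rcond=None)
        rss = np.sum((X[:, j] - Z @ coef) ** 2)
        tss = np.sum((X[:, j] - X[:, j].mean()) ** 2)
        out.append(tss / rss)    # 1 / (1 - R^2) = TSS / RSS
    return np.array(out)

# Hypothetical nearly collinear design: column 2 is column 1 plus small noise
rng = np.random.default_rng(1)
z = rng.standard_normal(200)
X = np.column_stack([z, z + 0.05 * rng.standard_normal(200), rng.standard_normal(200)])
print(condition_number(X), vif(X))
```

Large VIF values for the first two columns and a large condition number flag the same multicollinearity that the eigenvalue spread shows.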

The corresponding OLS estimator of is and the estimate of is . In this example, we consider and . The SMSE values of the estimators are summarized in Tables B2-B3 in Appendix B.

Table B2 shows the estimated SMSE values of MRE, SRRE, SRAURE, SRLE, SRAULE, SRPCRE, SRrk, and SRrd for the regression model when , , and with respect to shrinkage parameters , where denotes the number of variables in the model and denotes the number of misspecified variables. Table B3 shows the estimated SMSE values of the predictor of MRE, SRRE, SRAURE, SRLE, SRAULE, SRPCRE, SRrk, and SRrd for the regression model when , , and for some selected shrinkage parameters .

Note that when the model is correctly specified, when one variable is omitted from the model, and when two variables are omitted from the model. For simplicity, we choose shrinkage parameter values and in the range .

From Table B2, we can observe that the MRE is superior to the other estimators when and SRAULE, SRRE, SRLE, and SRAURE outperform the other estimators for , , , and , respectively, when . Similarly, SRLE and SRRE are superior to the other estimators for and , respectively, when .

From Table B3, we further observe that predictors based on SRLE and SRRE outperform the other predictors for and , respectively, when and , and predictors based on SRrd and SRrk are superior to the other predictors for and , respectively, when .

4.2. Simulation

For further clarification, a Monte Carlo simulation study is done at different levels of misspecification using R 3.2.5. Following McDonald and Galarneau [20], we can generate the explanatory variables as follows:
$$x_{ij} = \left(1 - \rho^2\right)^{1/2} z_{ij} + \rho z_{i,p+1}, \quad i = 1, 2, \ldots, n, \; j = 1, 2, \ldots, p,$$
where $z_{ij}$ is an independent standard normal pseudorandom number and $\rho$ is specified so that the theoretical correlation between any two explanatory variables is given by $\rho^2$. A dependent variable is generated by using the following equation:
$$y_i = \beta_1 x_{i1} + \beta_2 x_{i2} + \cdots + \beta_p x_{ip} + \varepsilon_i, \quad i = 1, 2, \ldots, n,$$
where $\varepsilon_i$ is a normal pseudorandom number with mean zero and variance one. Also, we select $\beta$ as the normalized eigenvector corresponding to the largest eigenvalue of $X'X$ for which $\beta'\beta = 1$. Further we choose and .

Then the following setup is considered to investigate the effects of different degrees of multicollinearity on the estimators:
(i) , condition number = 9.49, and VIF = .
(ii) , condition number = 34.77, and VIF = .
(iii) , condition number = 115.66, and VIF = .

Three different sets of observations are considered by selecting , , and when , where denotes the number of variables in the model and denotes the number of misspecified variables. Note that when the model is correctly specified, when one variable is omitted from the model, and when two variables are omitted from the model. For simplicity, we select values and in the range .

The simulation is repeated 2000 times by generating new pseudorandom numbers, and the simulated SMSE values of the estimators and predictors are obtained using the following equations:
$$\widehat{\mathrm{SMSE}}(\hat{\alpha}) = \frac{1}{2000}\sum_{t=1}^{2000}\left(\hat{\alpha}_t - \alpha\right)'\left(\hat{\alpha}_t - \alpha\right), \qquad \widehat{\mathrm{SMSE}}(\hat{y}) = \frac{1}{2000}\sum_{t=1}^{2000}\left(\hat{y}_t - y_t\right)'\left(\hat{y}_t - y_t\right).$$
The simulation results are summarized in Tables B4–B9 in Appendix B.
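The replicate average described above is the standard Monte Carlo SMSE estimate, the mean of $(\hat{\beta}_t - \beta)'(\hat{\beta}_t - \beta)$ over replicates. A minimal sketch, using a hypothetical OLS example rather than the paper's full estimator set:

```python
import numpy as np

def simulated_smse(estimator, beta, make_data, n_rep=2000, seed=0):
    """Monte Carlo SMSE: average squared estimation error over replicates."""
    rng = np.random.default_rng(seed)
    total = 0.0
    for _ in range(n_rep):
        X, y = make_data(rng)
        b_hat = estimator(X, y)
        total += float((b_hat - beta) @ (b_hat - beta))
    return total / n_rep

# Hypothetical check: OLS on a well-conditioned design,
# where the true SMSE is about sigma^2 * tr((X'X)^-1)
beta = np.array([1.0, -1.0])

def make_data(rng, n=100):
    X = rng.standard_normal((n, 2))
    return X, X @ beta + rng.standard_normal(n)

def ols(X, y):
    return np.linalg.solve(X.T @ X, X.T @ y)

smse = simulated_smse(ols, beta, make_data, n_rep=200)
print(smse)
```

Each competing estimator is plugged in as `estimator`, and the one with the smallest replicate average is the SMSE winner for that parameter setting.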

Tables B4, B5, and B6 show the estimated SMSE values of the estimators for the regression model when , , and and , and for the selected values of shrinkage parameters , respectively. Tables B7, B8, and B9 show the corresponding estimated SMSE values of the predictors for the above regression models, respectively.

From Table B4, we can observe that MRE and SRAULE outperform the other estimators for and , respectively, when and . Further, SRLE and SRRE are superior to the other estimators for and , respectively, when under .

From Table B5, we can observe that SRAULE, MRE, and SRAURE outperform the other estimators for , , and , respectively, when . Similarly, SRAULE, SRRE, SRLE, and SRAURE are superior to the other estimators when , , , and , respectively, when , and both SRLE and SRRE outperform the other estimators for and , respectively, when and .

The results in Table B6 indicate that MRE is superior to the other estimators when , and SRAULE, SRRE, SRLE, and SRAURE outperform the other estimators for , , , and , respectively, when . Further, SRLE and SRRE outperform the other estimators for and , respectively, when and .

From Tables B7–B9, we further observe that the predictors based on SRrd and SRrk always outperform the other predictors for and , respectively, when , , and .

The SMSE values of the selected estimators are plotted with different ρ values to demonstrate the results graphically when . Figures 1–3 show the graphical illustration of the performance of the estimators in the misspecified regression model when , , and , respectively. Similarly, Figures 4–6 present the graphical illustration of the performance of the predictors in the misspecified regression model when , , and , respectively.

Figure 1: SMSE values of the estimators in the misspecified regression model when and .
Figure 2: SMSE values of the estimators in the misspecified regression model when and .
Figure 3: SMSE values of the estimators in the misspecified regression model when and .
Figure 4: SMSE values of the predictors in the misspecified regression model when and .
Figure 5: SMSE values of the predictors in the misspecified regression model when and .
Figure 6: SMSE values of the predictors in the misspecified regression model when and .

5. Conclusion

Theorems 1 and 2 give the common form of the superiority conditions to compare the estimators (MRE, SRRE, SRAURE, SRLE, SRAULE, SRPCRE, SRrk, and SRrd) and their respective predictors in the MSEM criterion in the misspecified linear regression model when the prior information on the regression coefficients is incomplete and multicollinearity exists among the explanatory variables.

From the simulation study, the estimators and predictors that are superior to the others under different conditions can be identified. The results obtained in this research will produce significant improvements in parameter estimation in misspecified regression models with incomplete prior information, and the results are applicable to real-world applications.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Supplementary Materials

Supplementary material contains three appendix sections, Appendices A, B, and C. Appendix A: Lemmas A1–A3, used to prove the theorems. Appendix B: Tables B1–B9, which show the stochastic properties of the estimators and the results of the numerical example and simulation study. Appendix C: Corollaries C1–C28. (Supplementary Materials)

References

  1. N. Sarkar, “Comparisons among some estimators in misspecified linear models with multicollinearity,” Annals of the Institute of Statistical Mathematics, vol. 41, no. 4, pp. 717–724, 1989.
  2. G. Şiray, “r-d class estimator under misspecification,” Communications in Statistics—Theory and Methods, vol. 44, no. 22, pp. 4742–4756, 2015.
  3. J. Wu, “Superiority of the r-k class estimator over some estimators in a misspecified linear model,” Communications in Statistics—Theory and Methods, vol. 45, no. 5, pp. 1453–1458, 2016.
  4. S. Chandra and G. Tyagi, “On the performance of some biased estimators in a misspecified model with correlated regressors,” Statistics in Transition New Series, pp. 27–52, 2017.
  5. M. Kayanan and P. Wijekoon, “Performance of existing biased estimators and the respective predictors in a misspecified linear regression model,” Open Journal of Statistics, pp. 876–900, 2017.
  6. H. Theil and A. S. Goldberger, “On pure and mixed statistical estimation in economics,” International Economic Review, vol. 2, no. 1, pp. 65–78, 1961.
  7. T. Teräsvirta, Linear Restrictions in Misspecified Linear Models and Polynomial Distributed Lag Estimation, vol. 5, Department of Statistics, University of Helsinki, Helsinki, Finland, 1980.
  8. R. C. Mittelhammer, “On specification error in the general linear model and weak mean square error superiority of the mixed estimator,” Communications in Statistics—Theory and Methods, vol. 10, no. 2, pp. 167–176, 1981.
  9. K. Ohtani and Y. Honda, “On small sample properties of the mixed regression predictor under misspecification,” Communications in Statistics—Theory and Methods, pp. 2817–2825, 1984.
  10. K. Kadiyala, “Mixed regression estimator under misspecification,” Economics Letters, vol. 21, no. 1, pp. 27–30, 1986.
  11. G. Trenkler and P. Wijekoon, “Mean square error matrix superiority of the mixed regression estimator under misspecification,” Statistica, vol. 49, no. 1, pp. 65–71, 1989.
  12. P. Wijekoon and G. Trenkler, “Mean square error matrix superiority of estimators under linear restrictions and misspecification,” Economics Letters, vol. 30, pp. 141–149, 1989.
  13. M. Hubert and P. Wijekoon, “Superiority of the stochastic restricted Liu estimator under misspecification,” Statistica, vol. 64, no. 1, pp. 153–162, 2004.
  14. Y. Li and H. Yang, “A new stochastic mixed ridge estimator in linear regression model,” Statistical Papers, pp. 315–323, 2010.
  15. J. Wu and H. Yang, “On the stochastic restricted almost unbiased estimators in linear regression model,” Communications in Statistics—Simulation and Computation, vol. 43, no. 2, pp. 428–440, 2014.
  16. D. He and Y. Wu, “A stochastic restricted principal components regression estimator in the linear model,” The Scientific World Journal, vol. 2014, Article ID 231506, 6 pages, 2014.
  17. J. Wu, “On the stochastic restricted r-k class estimator and stochastic restricted r-d class estimator in linear regression model,” Journal of Applied Mathematics, vol. 2014, Article ID 173836, 6 pages, 2014.
  18. M. H. Gruber, Improving Efficiency by Shrinkage, vol. 156 of Statistics: Textbooks and Monographs, Marcel Dekker, New York, NY, USA, 1998.
  19. F. Akdeniz and H. Erol, “Mean squared error matrix comparisons of some biased estimators in linear regression,” Communications in Statistics—Theory and Methods, pp. 2389–2413, 2003.
  20. G. C. McDonald and D. I. Galarneau, “A Monte Carlo evaluation of some ridge-type estimators,” Journal of the American Statistical Association, vol. 70, no. 350, pp. 407–416, 1975.