Estimators of Linear Regression Model and Prediction under Some Assumptions Violation

Affiliation(s)

Department of Statistics, Ladoke Akintola University of Technology, Ogbomoso, Nigeria.

Department of Statistics, University of Ibadan, Ibadan, Nigeria.


ABSTRACT

The development of many estimators of the parameters of the linear regression model is traceable to the non-validity of the assumptions under which the model is formulated, especially when the model is applied to real-life situations. Notwithstanding, regression analysis may also aim at prediction. Consequently, this paper examines the performances of the Ordinary Least Squares (OLS) estimator, the Cochrane-Orcutt (COR) estimator, the Maximum Likelihood (ML) estimator and estimators based on Principal Component (PC) analysis in the prediction of a linear regression model under the joint violation of the assumptions of non-stochastic regressors, uncorrelated regressors and uncorrelated error terms. With correlated stochastic normal variables as regressors and autocorrelated error terms, Monte Carlo experiments were conducted, and the study further identifies the best estimator for prediction purposes by means of the goodness-of-fit statistics of the estimators. From the results, it is observed that the performance of COR at each level of correlation (multicollinearity), and that of ML especially when the sample size is large, exhibits a convex-like pattern over the levels of autocorrelation, while those of OLS and PC are concave-like. Also, as the level of multicollinearity increases, the estimators, except the PC estimators when multicollinearity is negative, rapidly perform better over the levels of autocorrelation. The COR and ML estimators are generally best for prediction in the presence of multicollinearity and autocorrelated error terms. However, at low levels of autocorrelation, the OLS estimator is either best or competes consistently with the best estimator, while the PC estimator is either best or competes with the best when the multicollinearity level is high (λ > 0.8 or λ < -0.49).
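The simulation design described in the abstract (correlated stochastic normal regressors with AR(1)-autocorrelated errors, fitted by OLS and by the iterated Cochrane-Orcutt procedure) can be sketched as follows. This is a minimal illustration, not the paper's actual experiment: the sample size, correlation level `lam`, autocorrelation `rho`, true coefficients and convergence tolerance are all assumed values chosen for the example.

```python
import numpy as np

def simulate(n=100, lam=0.8, rho=0.6, beta=(1.0, 2.0, 3.0), seed=0):
    """Simulate y = b0 + b1*x1 + b2*x2 + u with stochastic normal
    regressors of correlation lam and AR(1) errors u_t = rho*u_{t-1} + e_t."""
    rng = np.random.default_rng(seed)
    # correlated regressors via the Cholesky factor of the 2x2 correlation matrix
    L = np.linalg.cholesky(np.array([[1.0, lam], [lam, 1.0]]))
    X = rng.standard_normal((n, 2)) @ L.T
    # AR(1) disturbances with a stationary starting value
    e = rng.standard_normal(n)
    u = np.empty(n)
    u[0] = e[0] / np.sqrt(1.0 - rho**2)
    for t in range(1, n):
        u[t] = rho * u[t - 1] + e[t]
    y = beta[0] + X @ np.array(beta[1:]) + u
    return X, y

def ols(X, y):
    """OLS with intercept; returns the coefficient vector (b0, b1, b2)."""
    Z = np.column_stack([np.ones(len(y)), X])
    return np.linalg.lstsq(Z, y, rcond=None)[0]

def cochrane_orcutt(X, y, iters=20, tol=1e-8):
    """Iterated Cochrane-Orcutt: estimate rho from the OLS residuals,
    quasi-difference the data, refit, and repeat until rho stabilises."""
    b, rho = ols(X, y), 0.0
    for _ in range(iters):
        resid = y - np.column_stack([np.ones(len(y)), X]) @ b
        rho_new = (resid[:-1] @ resid[1:]) / (resid[:-1] @ resid[:-1])
        # the quasi-difference transform drops the first observation;
        # the transformed intercept column is (1 - rho), so b0 is recovered directly
        ys = y[1:] - rho_new * y[:-1]
        Xs = X[1:] - rho_new * X[:-1]
        Zs = np.column_stack([np.full(len(ys), 1.0 - rho_new), Xs])
        b = np.linalg.lstsq(Zs, ys, rcond=None)[0]
        if abs(rho_new - rho) < tol:
            break
        rho = rho_new
    return b, rho

X, y = simulate()
b_ols = ols(X, y)
b_co, rho_hat = cochrane_orcutt(X, y)
print("OLS:", b_ols, "COR:", b_co, "estimated rho:", rho_hat)
```

A full Monte Carlo study in the spirit of the paper would repeat this over a grid of `lam` and `rho` values and many replications, and compare the estimators' goodness-of-fit statistics on the fitted values.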


Cite this paper

K. Ayinde, E. Apata and O. Alaba, "Estimators of Linear Regression Model and Prediction under Some Assumptions Violation," *Open Journal of Statistics*, Vol. 2, No. 5, 2012, pp. 534-546. doi: 10.4236/ojs.2012.25069.


References

[1] D. N. Gujarati, “Basic Econometrics,” 4th Edition, Tata McGraw-Hill Publishing Company Limited, New Delhi and New York, 2005.

[2] J. Neter and W. Wasserman, “Applied Linear Statistical Models,” Richard D. Irwin, Inc., Homewood, 1974.

[3] T. B. Fomby, R. C. Hill and S. R. Johnson, “Advanced Econometric Methods,” Springer-Verlag, New York, Berlin, Heidelberg, London, Paris and Tokyo, 1984.

[4] G. S. Maddala, “Introduction to Econometrics,” 3rd Edition, John Wiley and Sons Limited, Hoboken, 2002.

[5] S. Chatterjee, A. S. Hadi and B. Price, “Regression Analysis by Example,” 3rd Edition, John Wiley and Sons, Hoboken, 2000.

[6] A. E. Hoerl, “Application of Ridge Analysis to Regression Problems,” Chemical Engineering Progress, Vol. 58, No. 3, 1962, pp. 54-59.

[7] A. E. Hoerl and R. W. Kennard, “Ridge Regression: Biased Estimation for Nonorthogonal Problems,” Technometrics, Vol. 12, No. 1, 1970, pp. 55-67.

[8] W. F. Massy, “Principal Component Regression in Exploratory Statistical Research,” Journal of the American Statistical Association, Vol. 60, No. 309, 1965, pp. 234-246.

[9] D. W. Marquardt, “Generalized Inverses, Ridge Regression, Biased Linear Estimation and Non-Linear Estimation,” Technometrics, Vol. 12, No. 3, 1970, pp. 591-612.

[10] M. E. Bock, T. A. Yancey and G. G. Judge, “The Statistical Consequences of Preliminary Test Estimators in Regression”, Journal of the American Statistical Association, Vol. 68, No. 341, 1973, pp. 109-116.

[11] T. Naes and H. Martens, “Principal Component Regression in NIR Analysis: Viewpoints, Background Details and Selection of Components,” Journal of Chemometrics, Vol. 2, No. 2, 1988, pp. 155-167.

[12] I. S. Helland, “On the Structure of Partial Least Squares Regression,” Communications in Statistics - Simulation and Computation, Vol. 17, No. 2, 1988, pp. 581-607.

[13] I. S. Helland, “Partial Least Squares Regression and Statistical Methods,” Scandinavian Journal of Statistics, Vol. 17, No. 2, 1990, pp. 97-114.

[14] A. Phatak and S. De Jong, “The Geometry of Partial Least Squares,” Journal of Chemometrics, Vol. 11, No. 4, 1997, pp. 311-338.

[15] A. C. Aitken, “On Least Squares and Linear Combinations of Observations,” Proceedings of the Royal Society of Edinburgh, Vol. 55, 1935, pp. 42-48.

[16] J. Johnston, “Econometric Methods,” 3rd Edition, McGraw Hill, New York, 1984.

[17] D. Cochrane and G. H. Orcutt, “Application of Least Squares Regression to Relationships Containing Autocorrelated Error Terms,” Journal of the American Statistical Association, Vol. 44, No. 245, 1949, pp. 32-61.

[18] S. J. Prais and C. B. Winsten, “Trend Estimators and Serial Correlation,” Unpublished Cowles Commission Discussion Paper, Chicago, 1954.

[19] C. Hildreth and J. Y. Lu, “Demand Relationships with Autocorrelated Disturbances,” Michigan State University, East Lansing, 1960.

[20] J. Durbin, “Estimation of Parameters in Time Series Regression Models,” Journal of Royal Statistical Society B, Vol. 22, No. 1, 1960, pp. 139-153.

[21] H. Theil, “Principles of Econometrics,” John Wiley and Sons, New York, 1971.

[22] C. M. Beach and J. G. MacKinnon, “A Maximum Likelihood Procedure for Regression with Autocorrelated Errors,” Econometrica, Vol. 46, No. 1, 1978, pp. 51-57.

[23] D. L. Thornton, “The Appropriate Autocorrelation Transformation When the Autocorrelation Process Has a Finite Past,” Federal Reserve Bank of St. Louis, 1982, pp. 82-102.

[24] J. S. Chipman, “Efficiency of Least Squares Estimation of Linear Trend When Residuals Are Autocorrelated,” Econometrica, Vol. 47, No. 1, 1979, pp. 115-127.

[25] W. Kramer, “Finite Sample Efficiency of OLS in Linear Regression Model with Autocorrelated Errors,” Journal of the American Statistical Association, Vol. 75, No. 372, 1980, pp. 1005-1054.

[26] C. Kleiber, “Finite Sample Efficiency of OLS in Linear Regression Model with Long Memory Disturbances,” Economics Letters, Vol. 72, No. 2, 2001, pp. 131-136.

[27] J. O. Iyaniwura and J. C. Nwabueze, “Estimators of Linear Model with Autocorrelated Error Terms and Trended Independent Variable,” Journal of Nigeria Statistical Association, Vol. 17, 2004, pp. 20-28.

[28] J. C. Nwabueze, “Performances of Estimators of Linear Model with Auto-Correlated Error Terms When Independent Variable Is Normal,” Journal of Nigerian Association of Mathematical Physics, 2005a, Vol. 9, pp. 379-384.

[29] J. C. Nwabueze, “Performances of Estimators of Linear Model with Auto-Correlated Error Terms with Exponential Independent Variable,” Journal of Nigerian Association of Mathematical Physics, Vol. 9, 2005b, pp. 385-388.

[30] J. C. Nwabueze, “Performances of Estimators of Linear Model with Auto-Correlated Error Terms When the Independent Variable Is Autoregressive,” Global Journal of Pure and Applied Sciences, Vol. 11, 2005c, pp. 131-135.

[31] K. Ayinde and R. A. Ipinyomi, “A Comparative Study of the OLS and Some GLS Estimators When Normally Distributed Regressors Are Stochastic,” Trend in Applied Sciences Research, Vol. 2, No. 4, 2007, pp. 354-359. doi:10.3923/tasr.2007.354.359

[32] P. Rao and Z. Griliches, “Small Sample Properties of Several Two-Stage Regression Methods in the Context of Autocorrelation Error,” Journal of the American Statistical Association, Vol. 64, 1969, pp. 251-272.

[33] K. Ayinde and J. O. Iyaniwura, “A Comparative Study of the Performances of Some Estimators of Linear Model with Fixed and Stochastic Regressors,” Global Journal of Pure and Applied Sciences, Vol. 14, No. 3, 2008, pp. 363-369. doi:10.4314/gjpas.v14i3.16821

[34] K. Ayinde and B. A. Oyejola, “A Comparative Study of Performances of OLS and Some GLS Estimators When Stochastic Regressors Are Correlated with Error Terms,” Research Journal of Applied Sciences, Vol. 2, No. 3, 2007, pp. 215-220.

[35] K. Ayinde, “A Comparative Study of the Performances of the OLS and Some GLS Estimators When Stochastic Regressors Are both Collinear and Correlated with Error Terms,” Journal of Mathematics and Statistics, Vol. 3, No. 4, 2007a, pp. 196-200.

[36] K. Ayinde and J. O. Olaomi, “Performances of Some Estimators of Linear Model with Autocorrelated Error Terms when Regressors are Normally Distributed,” International Journal of Natural and Applied Sciences, Vol. 3, No. 1, 2007, pp. 22-28.

[37] K. Ayinde and J. O. Olaomi, “A Study of Robustness of Some Estimators of Linear Model with Autocorrelated Error Terms When Stochastic Regressors Are Normally Distributed,” Journal of Modern Applied Statistical Methods, Vol. 7, No. 1, 2008, pp. 246-252.

[38] K. Ayinde, “Performances of Some Estimators of Linear Model When Stochastic Regressors are Correlated with Autocorrelated Error Terms,” European Journal of Scientific Research, Vol. 20, No. 3, 2008, pp. 558-571.

[39] K. Ayinde, “Equations to Generate Normal Variates with Desired Intercorrelation Matrix,” International Journal of Statistics and System, Vol. 2, No. 2, 2007b, pp. 99-111.

[40] K. Ayinde and O. S. Adegboye, “Equations for Generating Normally Distributed Random Variables with Specified Intercorrelation,” Journal of Mathematical Sciences, Vol. 21, No. 2, 2010, pp. 83-203.

[41] TSP, “Users Guide and Reference Manual,” Time Series Processor, New York, 2005.

[42] E. O. Apata, “Estimators of Linear Regression Model with Autocorrelated Error Terms and Correlated Stochastic Normal Regressors,” Unpublished Master of Science Thesis, University of Ibadan, Ibadan, 2011.
