- Assistant Professor, Department of Statistics, Yuvaraja’s College, University of Mysore, Karnataka, India
In multiple linear regression, the ordinary least squares (OLS) method becomes highly sensitive to severe multicollinearity in the data, and in such cases OLS can yield the wrong sign for some of the regression coefficients. When this situation arises, one of the biased regression methods, viz. ridge regression, principal component regression, and so on, can be used as an alternative to OLS. This article pertains to ridge regression only. To overcome the problem of multicollinearity, we propose some modified ordinary ridge estimators, defined by taking convex combinations of some of the existing estimators. The performance of the suggested estimators is compared empirically with that of some existing estimators considered in this study, and the results indicate that the suggested estimators perform better in terms of mean square error (MSE). Moreover, the suggested estimators are more robust to the problem of linear dependency between the predictors.
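The article's specific modified estimators are not reproduced on this page, but the underlying idea can be sketched. Assuming, purely for illustration, that a convex combination is taken over two ordinary ridge estimates with ridge parameters `k1` and `k2` and weight `w` (all names hypothetical, not the paper's notation), a minimal NumPy sketch is:

```python
import numpy as np

def ridge_estimator(X, y, k):
    """Ordinary ridge estimator: beta_hat(k) = (X'X + k I)^{-1} X'y.

    For k = 0 this reduces to the OLS estimator; k > 0 shrinks the
    coefficients and stabilises them under multicollinearity.
    """
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)

def convex_combination_ridge(X, y, k1, k2, w):
    """Hypothetical modified estimator: a convex combination
    w * beta_hat(k1) + (1 - w) * beta_hat(k2), with 0 <= w <= 1."""
    return w * ridge_estimator(X, y, k1) + (1 - w) * ridge_estimator(X, y, k2)

# Example with two nearly collinear predictors (second column is
# roughly twice the first), the setting where ridge helps.
X = np.array([[1.0, 2.0], [2.0, 3.9], [3.0, 6.1], [4.0, 8.2]])
y = np.array([1.0, 2.0, 3.0, 4.0])
beta_combined = convex_combination_ridge(X, y, k1=0.1, k2=1.0, w=0.5)
```

In a simulation study such as the one described in the abstract, estimators of this form would be compared by their empirical MSE across replications; the choice of `k1`, `k2`, and `w` here is illustrative only.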
Keywords: Multiple linear regression, multicollinearity, VIF, ridge parameter, MSE
[This article belongs to Research & Reviews: Journal of Statistics (rrjs)]
Received: August 22, 2022
Accepted: August 29, 2022
Published: September 12, 2022