OVERCOMING MULTICOLLINEARITY
THROUGH THE RIDGE REGRESSION METHOD
Introduction
Regression analysis is a statistical method often used to determine the extent to which a dependent variable depends on, or is related to, one or more independent variables. When the analysis involves only one independent variable, simple linear regression analysis is used; when it involves two or more independent variables, multiple linear regression analysis is used.
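As a minimal sketch of the multiple-regression case (the data below are invented for illustration and are not from the article), the coefficients of a model with two independent variables can be obtained by least squares with NumPy:

```python
import numpy as np

# Hypothetical data: y depends on two independent variables x1 and x2.
X = np.array([[1.0, 2.0],
              [2.0, 1.0],
              [3.0, 4.0],
              [4.0, 3.0],
              [5.0, 5.0]])
y = np.array([3.1, 2.9, 7.2, 6.8, 10.1])

# Add an intercept column, then solve the least-squares problem.
X1 = np.column_stack([np.ones(len(X)), X])
beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
print(beta)  # [intercept, coefficient of x1, coefficient of x2]
```

Here `np.linalg.lstsq` minimizes the sum of squared residuals, which is exactly the least squares criterion discussed below.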
One way to obtain the regression coefficients of a multiple linear regression equation is the least squares method. Under the classical assumptions, this method produces the best estimator (unbiased, with minimum variance). However, when multicollinearity is present, the least squares estimators remain unbiased but become inefficient: the variances of the regression coefficients become very large, so the estimates are unstable. One way to overcome this problem is the ridge regression method. Essentially, this method is still least squares; the difference is that in ridge regression the independent variables are first transformed by a centering and rescaling procedure, and then a ridge parameter θ (theta), whose value lies between 0 and 1, is added to the main diagonal of the correlation matrix of the independent variables.
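The centering-and-rescaling step and the addition of θ to the diagonal can be sketched as follows. This is a minimal illustration, not the article's own code; the data and the value θ = 0.1 are invented, and the predictors are deliberately made nearly collinear so the shrinkage effect is visible:

```python
import numpy as np

def ridge_standardized(X, y, theta):
    """Ridge regression in the centered-and-rescaled (correlation) form.

    The columns of X are centered and rescaled to unit length so that
    Z'Z is the correlation matrix R of the independent variables; theta
    (between 0 and 1) is added to the main diagonal of R before solving
    for the standardized coefficients.
    """
    Zc = X - X.mean(axis=0)
    Z = Zc / np.sqrt((Zc ** 2).sum(axis=0))          # unit-length scaling
    yc = y - y.mean()
    yn = yc / np.sqrt((yc ** 2).sum())
    R = Z.T @ Z                                      # correlation matrix
    return np.linalg.solve(R + theta * np.eye(X.shape[1]), Z.T @ yn)

# Hypothetical data with two nearly collinear predictors.
rng = np.random.default_rng(0)
x1 = rng.normal(size=30)
x2 = x1 + rng.normal(scale=0.01, size=30)            # almost a copy of x1
X = np.column_stack([x1, x2])
y = x1 + x2 + rng.normal(scale=0.1, size=30)

b_ols = ridge_standardized(X, y, 0.0)    # theta = 0 gives ordinary least squares
b_ridge = ridge_standardized(X, y, 0.1)  # theta = 0.1 shrinks the coefficients
print(b_ols, b_ridge)
```

With θ = 0 the near-singular correlation matrix inflates the coefficients; with θ > 0 the solution is damped toward more stable values, which is the behavior described above.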
The ridge regression method can be used under the assumption that the correlation matrix of the independent variables, once the ridge parameter has been added to its main diagonal, has an inverse. As a result, the regression coefficients and the estimated values of the dependent variable are easy to obtain. The estimated values of the dependent variable depend on the size of the ridge parameter θ.
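The invertibility point can be seen directly in a small (hypothetical) example: if two predictors are perfectly collinear, the correlation matrix R is singular and least squares fails, but R + θI is invertible for any θ > 0:

```python
import numpy as np

# Correlation matrix when x2 is an exact copy of x1: perfectly collinear.
R = np.array([[1.0, 1.0],
              [1.0, 1.0]])
print(np.linalg.det(R))                       # 0.0, so R has no inverse

theta = 0.1
print(np.linalg.det(R + theta * np.eye(2)))   # ~0.21, so R + theta*I is invertible
```

This is why adding θ to the main diagonal makes the coefficients, and hence the fitted values, obtainable even under severe multicollinearity.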