Abstract
Coefficients of ordinary least squares (OLS) multiple regression represent the best linear combination of the independent variables for fitting a dependent variable. However, the OLS model was never designed to produce meaningful coefficients for individual predictors; it yields only their best linear aggregate. In applied regression analysis it is therefore often impossible to judge an individual predictor's role in an OLS model from the regression coefficients. This stems from multicollinearity, which inflates the coefficients in multiple regression and can give them the opposite sign to the predictors' pairwise relations with the dependent variable. This paper considers properties of the so-called Shapley value (SV) regression, developed specifically for adjusting regression coefficients under multicollinearity among the predictors and estimating their importance. Adjusted SV regression provides a meaningful, interpretable coefficient for each individual predictor. This technique has been applied for a dozen years in hundreds of research projects, and we have found that the adjusted SV model supports more adequate managerial decisions based on meaningful regression coefficients.
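The core idea behind SV regression is to share the full model's explanatory power (R²) among the predictors as Shapley values: each predictor's importance is its R² contribution averaged over all orders in which predictors could enter the model. The sketch below is ours, not code from the paper; the function names and the synthetic collinear data are illustrative assumptions, and it shows only the R² decomposition step that the adjusted coefficients build on.

```python
import itertools
import math

import numpy as np


def r_squared(X, y, cols):
    """R^2 of an OLS fit of y on the predictor subset `cols` (intercept included)."""
    n = len(y)
    Z = np.column_stack([np.ones(n)] + [X[:, j] for j in cols])
    beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
    resid = y - Z @ beta
    return 1.0 - (resid @ resid) / (((y - y.mean()) ** 2).sum())


def shapley_r2(X, y):
    """Shapley decomposition of the full-model R^2 across the p predictors."""
    p = X.shape[1]
    phi = np.zeros(p)
    for j in range(p):
        others = [k for k in range(p) if k != j]
        # Average predictor j's marginal R^2 gain over all subsets of the others,
        # with the standard Shapley weight |S|! (p - |S| - 1)! / p!
        for r in range(len(others) + 1):
            for S in itertools.combinations(others, r):
                w = math.factorial(r) * math.factorial(p - r - 1) / math.factorial(p)
                phi[j] += w * (r_squared(X, y, list(S) + [j]) - r_squared(X, y, list(S)))
    return phi


# Illustrative data: x1 and x2 are strongly collinear, x3 is independent.
rng = np.random.default_rng(0)
x1 = rng.normal(size=200)
x2 = x1 + 0.3 * rng.normal(size=200)
x3 = rng.normal(size=200)
X = np.column_stack([x1, x2, x3])
y = x1 + x2 + x3 + rng.normal(size=200)

phi = shapley_r2(X, y)
```

By the efficiency property of the Shapley value, the shares in `phi` sum exactly to the full-model R², and each share is non-negative because adding a regressor never decreases R²; collinear predictors such as `x1` and `x2` end up with comparable shares instead of the inflated, opposite-signed OLS coefficients.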
