Abstract
Ridge regression has been suggested as a technique that can circumvent difficulties associated with ordinary least squares (OLS) regression. In particular, ridge regression was developed to reduce the mean square error of the parameter estimates. It is shown that ridge regression also offers an advantage over OLS estimation when a validity shrinkage criterion is considered. Systematic comparisons of cross-validated multiple correlations indicate that ridge estimation is superior to OLS estimation when (1) the predictors are multicollinear, (2) the number of predictors is large relative to the sample size, and (3) the population multiple correlation is relatively small in magnitude.
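The comparison described above can be sketched numerically. The following is a minimal illustration, not the article's actual study design: the sample sizes, the ridge constant k = 1.0, and the data-generating setup (correlated predictors, sizeable noise to keep the population multiple correlation modest) are all assumptions chosen to mimic the three conditions listed in the abstract. The ridge estimator used is the standard closed form (X'X + kI)⁻¹X'y, which reduces to OLS at k = 0, and "cross-validity" is taken as the correlation between predicted and observed criterion scores in a holdout sample.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup: small calibration sample, many correlated
# predictors, modest population multiple correlation -- the conditions
# under which the abstract reports ridge outperforming OLS.
n_train, n_test, p = 30, 500, 8
base = rng.normal(size=(n_train + n_test, 1))
# Columns share a common factor, so the predictors are multicollinear.
X = base + 0.3 * rng.normal(size=(n_train + n_test, p))
beta = rng.normal(size=p)
# Large error variance keeps the population multiple correlation small.
y = X @ beta + 3.0 * rng.normal(size=n_train + n_test)

X_tr, X_te = X[:n_train], X[n_train:]
y_tr, y_te = y[:n_train], y[n_train:]

def fit(X, y, k=0.0):
    """Ridge estimate (X'X + kI)^{-1} X'y; k = 0 gives OLS."""
    return np.linalg.solve(X.T @ X + k * np.eye(X.shape[1]), X.T @ y)

def cross_validity(b):
    """Correlation of predicted with observed y in the holdout sample."""
    return np.corrcoef(X_te @ b, y_te)[0, 1]

r_ols = cross_validity(fit(X_tr, y_tr, k=0.0))
r_ridge = cross_validity(fit(X_tr, y_tr, k=1.0))
print(f"holdout r, OLS:   {r_ols:.3f}")
print(f"holdout r, ridge: {r_ridge:.3f}")
```

In repeated draws under setups like this, the ridge solution's shrunken coefficients tend to transfer to the holdout sample better than the unstable OLS coefficients, which is the validity-shrinkage advantage the abstract describes; any single draw may go either way.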
