Abstract
A mean squared error criterion is used to compare five estimators of the coefficients in a linear regression model: least squares, principal components, ridge regression, latent root, and a shrunken estimator. Each of the biased estimators is shown to offer an improvement in mean squared error over least squares for a wide range of choices of the parameters of the model. The results of a simulation involving all five estimators indicate that the principal components and latent root estimators perform best overall, but the ridge regression estimator has the potential to achieve a smaller mean squared error than either of these.
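The kind of comparison described above can be illustrated with a small Monte Carlo sketch. The code below is not the paper's simulation; the design matrix, true coefficients, ridge constant k, and number of retained components r are all hypothetical choices made only to exhibit the effect of collinearity on the coefficient mean squared error of least squares versus two of the biased estimators (ridge regression and principal components).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup (not from the paper): a small, deliberately
# collinear design with known true coefficients.
n, p = 50, 4
beta = np.array([1.0, 2.0, -1.0, 0.5])
sigma = 2.0

# Make the last column nearly a copy of the first to induce collinearity.
X = rng.normal(size=(n, p))
X[:, 3] = X[:, 0] + 0.05 * rng.normal(size=n)

def ols(X, y):
    # Least squares estimator.
    return np.linalg.lstsq(X, y, rcond=None)[0]

def ridge(X, y, k):
    # Ridge regression: (X'X + kI)^{-1} X'y for a fixed k.
    return np.linalg.solve(X.T @ X + k * np.eye(X.shape[1]), X.T @ y)

def principal_components(X, y, r):
    # Regress y on the r leading principal components of X, then map
    # the component coefficients back to the original variables.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    gamma = (U[:, :r].T @ y) / s[:r]
    return Vt[:r].T @ gamma

# Estimate MSE of each coefficient estimator over repeated samples,
# measured as E ||b_hat - beta||^2.
reps = 500
mse = {"ols": 0.0, "ridge": 0.0, "pc": 0.0}
for _ in range(reps):
    y = X @ beta + sigma * rng.normal(size=n)
    for name, b in [("ols", ols(X, y)),
                    ("ridge", ridge(X, y, k=1.0)),
                    ("pc", principal_components(X, y, r=3))]:
        mse[name] += np.sum((b - beta) ** 2) / reps

for name, value in mse.items():
    print(f"{name}: {value:.3f}")
```

Under a design this collinear, the least squares variance along the near-degenerate direction dominates, so both biased estimators typically show a much smaller total mean squared error despite their bias, which is the trade-off the abstract summarizes.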