Abstract
We consider a two-stage estimation method for linear regression, which we call lassoed boosting. In the first stage, the method screens variables using the lasso in Tibshirani; in the second, it re-estimates the coefficients by applying the least-squares boosting method in Friedman to each set of selected variables. In the large-scale simulation experiment of Hastie, Tibshirani, and Tibshirani, lassoed boosting performs as well as the relaxed lasso in Meinshausen and, in certain scenarios, yields a sparser model. Applied to predicting equity returns, lassoed boosting achieves the smallest mean-squared prediction error among several competing methods.
Acknowledgments
I am grateful to two referees for their insightful comments and suggestions. Computation and research support from the Office of Research and the Coles College of Business at Kennesaw State University is gratefully acknowledged.