Original Articles

General Sparse Boosting: Improving Feature Selection of L2 Boosting by Correlation-Based Penalty Family

Pages 1612–1640 | Received 09 Oct 2012, Accepted 09 Jul 2013, Published online: 10 Dec 2014

References

  • Breiman, L. (1998). Arcing classifiers (with discussion). The Annals of Statistics 26:801–849.
  • Breiman, L. (1999). Prediction games and arcing algorithms. Neural Computation 11:1493–1517.
  • Breiman, L. (1995). Better subset regression using the nonnegative garrote. Technometrics 37:373–384.
  • Breiman, L., Friedman, J. (1985). Estimating optimal transformations for multiple regression and correlation. Journal of the American Statistical Association 80:580–598.
  • Bühlmann, P. (2006). Boosting for high-dimensional linear models. The Annals of Statistics 34:559–583.
  • Bühlmann, P., Hothorn, T. (2010). Twin Boosting: Improved feature selection and prediction. Statistics and Computing 20(2):119–138.
  • Bühlmann, P., Hothorn, T. (2007). Boosting algorithms: Regularization, prediction and model fitting. Statistical Science 22(4):477–505.
  • Bühlmann, P., Yu, B. (2003). Boosting with the L2 loss: Regression and classification. Journal of the American Statistical Association 98:324–339.
  • Bühlmann, P., Yu, B. (2006). Sparse boosting. Journal of Machine Learning Research 7:1001–1024.
  • Bühlmann, P., Meier, L. (2008). Discussion of “One-step sparse estimates in nonconcave penalized likelihood models”. The Annals of Statistics 36:1534–1541.
  • Culp, M., Michailidis, G., Johnson, K. (2011). On adaptive regularization methods in boosting. Journal of Computational and Graphical Statistics 20(4):937–955.
  • Donoho, D.L., Johnstone, I.M. (1994). Ideal spatial adaptation by wavelet shrinkage. Biometrika 81:425–455.
  • Efron, B., Hastie, T., Johnstone, I., Tibshirani, R. (2004). Least angle regression. The Annals of Statistics 32(2):407–499.
  • Fan, J., Li, R. (2001). Variable selection via nonconcave penalized likelihood and its oracle properties. Journal of the American Statistical Association 96:1348–1360.
  • Freund, Y., Schapire, R.E. (1996). Experiments with a new boosting algorithm. In: Proceedings of the Thirteenth International Conference: Machine Learning. San Francisco: Morgan Kaufmann, pp. 148–156.
  • Friedman, J. (2001). Greedy function approximation: A gradient boosting machine. The Annals of Statistics 29:1189–1232.
  • Friedman, J., Hastie, T., Tibshirani, R. (2000). Additive logistic regression: A statistical view of boosting (with discussion). The Annals of Statistics 28:337–407.
  • Friedman, J., Hastie, T., Höfling, H., Tibshirani, R. (2007). Pathwise coordinate optimization. The Annals of Applied Statistics 1(2):302–332.
  • Friedman, J., Stuetzle, W. (1981). Projection pursuit regression. Journal of the American Statistical Association 76:817–823.
  • Frank, I.E., Friedman, J.H. (1993). A statistical view of some chemometrics regression tools. Technometrics 35:109–148.
  • Green, P., Silverman, B. (1994). Nonparametric Regression and Generalized Linear Models: A Roughness Penalty Approach. New York: Chapman & Hall.
  • Hansen, M., Yu, B. (2001). Model selection and the principle of minimum description length. Journal of the American Statistical Association 96:746–774.
  • Harrison, D., Rubinfeld, D.L. (1978). Hedonic housing prices and the demand for clean air. Journal of Environmental Economics and Management 5:81–102.
  • Hurvich, C.M., Tsai, C.L. (1989). Regression and time series model selection in small samples. Biometrika 76:297–307.
  • Schmid, M., Hothorn, T. (2008). Boosting additive models using component-wise P-splines. Computational Statistics and Data Analysis 53:298–311.
  • Tibshirani, R. (1996). Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society, Series B 58:267–288.
  • Weisberg, S. (2005). Applied Linear Regression, 3rd ed. New York: J. Wiley & Sons.
  • Zou, H. (2006). The adaptive lasso and its oracle properties. Journal of the American Statistical Association 101:1418–1429.
  • Zou, H., Hastie, T. (2005). Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society, Series B 67:301–320.
