References
- Breiman, L. 1996. Bagging predictors. Machine Learning 24 (2):123–40. doi:https://doi.org/10.1007/BF00058655.
- Breiman, L. 1999. Prediction games and arcing algorithms. Neural Computation 11 (7):1493–517.
- Breiman, L. 2001. Random forests. Machine Learning 45 (1):5–32. doi:https://doi.org/10.1023/A:1010933404324.
- Bühlmann, P. 2002. Consistency for L2 boosting and matching pursuit with trees and tree-type basis functions. Technical report, ETH Zürich.
- Bühlmann, P., and B. Yu. 2003. Boosting with the L2 loss: Regression and classification. Journal of the American Statistical Association 98 (462):324–39. doi:https://doi.org/10.1198/016214503000125.
- Chen, T., and C. Guestrin. 2016. XGBoost: A scalable tree boosting system. Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (KDD 2016), 785–94.
- Freund, Y., and R. Schapire. 1996. Experiments with a new boosting algorithm. Machine Learning: Proceedings of the Thirteenth International Conference, Morgan Kaufmann, San Francisco, 148–56.
- Friedman, J. 1991. Multivariate adaptive regression splines. The Annals of Statistics 19 (1):1–67. doi:https://doi.org/10.1214/aos/1176347963.
- Friedman, J. 2001. Greedy function approximation: A gradient boosting machine. The Annals of Statistics 29 (5):1189–232. doi:https://doi.org/10.1214/aos/1013203451.
- Friedman, J. 2002. Stochastic gradient boosting. Computational Statistics & Data Analysis 38:367–78. doi:https://doi.org/10.1016/S0167-9473(01)00065-2.
- Friedman, J., T. Hastie, and R. Tibshirani. 2000. Additive logistic regression: A statistical view of boosting. The Annals of Statistics 28 (2):337–407. doi:https://doi.org/10.1214/aos/1016218223.
- Hastie, T., R. Tibshirani, and J. Friedman. 2009. The elements of statistical learning: Data mining, inference, and prediction. 2nd ed. New York, NY: Springer.
- Lazarevic, A., and Z. Obradovic. 2002. Boosting algorithms for parallel and distributed learning. Distributed and Parallel Databases 11 (2):203–29. doi:https://doi.org/10.1023/A:1013992203485.
- Mallat, S. G., and Z. Zhang. 1993. Matching pursuits with time-frequency dictionaries. IEEE Transactions on Signal Processing 41 (12):3397–415.
- Mason, L., J. Baxter, P. Bartlett, and M. Frean. 1999. Boosting algorithms as gradient descent. In Advances in neural information processing systems. Vol. 12, 512–18. Cambridge, MA: MIT Press.
- Schapire, R. 1990. The strength of weak learnability. Machine Learning 5 (2):197–227. doi:https://doi.org/10.1007/BF00116037.
- Tyree, S., K. Weinberger, K. Agrawal, and J. Paykin. 2011. Parallel boosted regression trees for web search ranking. Proceedings of the 20th International Conference on World Wide Web (WWW 2011), 387–96.
- Ye, J., J. H. Chow, J. Chen, and Z. Zheng. 2009. Stochastic gradient boosted distributed decision trees. Proceedings of the 18th ACM Conference on Information and Knowledge Management (CIKM 09), 2061–4.