Original Articles

An enhanced random forest with canonical partial least squares for classification

Pages 4324-4334 | Received 02 Oct 2018, Accepted 09 Jan 2020, Published online: 27 Jan 2020

References

  • Benavoli, A., G. Corani, and F. Mangili. 2016. Should we really use post-hoc tests based on mean-ranks? Journal of Machine Learning Research 17:1–10.
  • Blaser, R., and P. Fryzlewicz. 2016. Random rotation ensembles. Journal of Machine Learning Research 17:1–26.
  • Bock, K. W. D., and D. V. D. Poel. 2011. An empirical evaluation of rotation-based ensemble classifiers for customer churn prediction. Expert Systems with Applications 38:12293–301. doi:10.1016/j.eswa.2011.04.007.
  • Breiman, L. 2001. Random forests. Machine Learning 45 (1):5–32. doi:10.1023/A:1010933404324.
  • Breiman, L., J. Friedman, R. A. Olshen, and C. J. Stone. 1984. Classification and regression trees. Belmont, CA: Wadsworth.
  • Chang, C. C., and C. J. Lin. 2011. LIBSVM: A library for support vector machines. ACM Transactions on Intelligent Systems and Technology 2 (3):1–27. doi:10.1145/1961189.1961199.
  • Chun, H., and S. Keleş. 2010. Sparse partial least squares regression for simultaneous dimension reduction and variable selection. Journal of the Royal Statistical Society: Series B (Statistical Methodology) 72 (1):3–25. doi:10.1111/j.1467-9868.2009.00723.x.
  • Delgado, M. F., E. Cernadas, S. Barro, and D. Amorim. 2014. Do we need hundreds of classifiers to solve real world classification problems? Journal of Machine Learning Research 15:3133–81.
  • Demšar, J. 2006. Statistical comparisons of classifiers over multiple data sets. Journal of Machine Learning Research 7:1–30.
  • Dietterich, T. 1998. Approximate statistical tests for comparing supervised classification learning algorithms. Neural Computation 10 (7):1895–924. doi:10.1162/089976698300017197.
  • Ding, B. Y., and R. Gentleman. 2005. Classification using penalized partial least squares. Journal of Computational and Graphical Statistics 14 (2):280–98. doi:10.1198/106186005X47697.
  • Fan, J., F. Han, and L. Han. 2014. Challenges of big data analysis. National Science Review 1 (2):293–314. doi:10.1093/nsr/nwt032.
  • Friedman, J. 1989. Regularized discriminant analysis. Journal of the American Statistical Association 84 (405):165–75. doi:10.1080/01621459.1989.10478752.
  • Hall, P., and K.-C. Li. 1993. On almost linearity of low dimensional projections from high dimensional data. The Annals of Statistics 21 (2):867–89. doi:10.1214/aos/1176349155.
  • Hotelling, H. 1933. Analysis of a complex of statistical variables into principal components. Journal of Educational Psychology 24 (6):417–41. doi:10.1037/h0071325.
  • Indahl, U. G., H. Martens, and T. Næs. 2007. From dummy regression to prior probabilities in PLS-DA. Journal of Chemometrics 21 (12):529–36. doi:10.1002/cem.1061.
  • Indahl, U. G., K. H. Liland, and T. Næs. 2009. Canonical partial least squares – a unified PLS approach to classification and regression problems. Journal of Chemometrics 23 (9):495–504. doi:10.1002/cem.1243.
  • de Jong, S. 1993. SIMPLS: An alternative approach to partial least squares regression. Chemometrics and Intelligent Laboratory Systems 18 (3):251–63.
  • Li, K. C. 1991. Sliced inverse regression for dimension reduction. Journal of the American Statistical Association 86 (414):316–27. doi:10.1080/01621459.1991.10475035.
  • Liaw, A., and M. Wiener. 2002. Classification and regression by randomForest. R News 2:18–22.
  • Margineantu, D. D., and T. G. Dietterich. 1997. Pruning adaptive boosting. In Proceedings of the Fourteenth International Conference on Machine Learning, 211–8. San Francisco, CA: Morgan Kaufmann.
  • McCullagh, P. 1984. Generalized linear models. European Journal of Operational Research 16 (3):285–92. doi:10.1016/0377-2217(84)90282-0.
  • Nguyen, D. V., and D. M. Rocke. 2002a. Tumor classification by partial least squares using microarray gene expression data. Bioinformatics 18 (1):39–50. doi:10.1093/bioinformatics/18.1.39.
  • Nguyen, D. V., and D. M. Rocke. 2002b. Multi-class cancer classification via partial least squares with gene expression profiles. Bioinformatics 18 (9):1216–26. doi:10.1093/bioinformatics/18.9.1216.
  • Quinlan, J. R. 1986. Induction of decision trees. Machine Learning 1 (1):81–106. doi:10.1007/BF00116251.
  • Rodríguez, J. J., and L. I. Kuncheva. 2006. Rotation forest: A new classifier ensemble method. IEEE Transactions on Pattern Analysis and Machine Intelligence 28:1619–30. doi:10.1109/TPAMI.2006.211.
  • Vapnik, V., and A. Chervonenkis. 1964. A note on one class of perceptrons. Automation and Remote Control 24:774–80.
  • Vong, R., P. Geladi, S. Wold, and K. Esbensen. 1988. Source contributions to ambient aerosol calculated by discriminant partial least squares regression. Journal of Chemometrics 2 (4):281–96. doi:10.1002/cem.1180020406.
  • Witten, D. M., and R. Tibshirani. 2011. Penalized classification using Fisher's linear discriminant. Journal of the Royal Statistical Society: Series B (Statistical Methodology) 73 (5):753–72. doi:10.1111/j.1467-9868.2011.00783.x.
  • Wold, H. 1966. Estimation of principal components and related models by iterative least squares. In Multivariate analysis, ed. P. R. Krishnaiah, 391–420. New York: Academic Press.
  • Xu, Q. S., and Y. Z. Liang. 2001. Monte Carlo cross validation. Chemometrics and Intelligent Laboratory Systems 56 (1):1–11. doi:10.1016/S0169-7439(00)00122-2.
  • Zhang, L., and P. N. Suganthan. 2014. Random forests with ensemble of feature spaces. Pattern Recognition 47 (10):3429–37. doi:10.1016/j.patcog.2014.04.001.
