Original Articles

Very Fast C4.5 Decision Tree Algorithm


References

  • Agrawal, G. L., and H. Gupta. 2013. Optimization of C4.5 decision tree algorithm for data mining application. International Journal of Emerging Technology and Advanced Engineering 3 (3):341–45.
  • Behera, H. S., and D. P. Mohapatra, eds. 2015. Computational intelligence in data mining, volume 1: Proceedings of the International Conference on CIDM, 5–6 December 2015. Vol. 410. Springer.
  • Breiman, L., J. Friedman, C. J. Stone, and R. A. Olshen. 1984. Classification and regression trees. CRC Press.
  • Chong, W. M., C. L. Goh, Y. T. Bau, and K. C. Lee. 2014. In 2014 IIAI 3rd International Conference on Advanced Applied Informatics (IIAI-AAI), 930–35. IEEE.
  • Dai, W., and W. Ji. 2014. A MapReduce implementation of C4.5 decision tree algorithm. International Journal of Database Theory and Application 7 (1):49–60.
  • García Laencina, P. J., P. H. Abreu, M. H. Abreu, and N. Afonso. 2015. Missing data imputation on the 5-year survival prediction of breast cancer patients with unknown discrete values. Computers in Biology and Medicine 59:125–33.
  • García, S., J. Luengo, J. A. Sáez, V. López, and F. Herrera. 2013. A survey of discretization techniques: Taxonomy and empirical analysis in supervised learning. IEEE Transactions on Knowledge and Data Engineering 25 (4):734–50.
  • Grzymala-Busse, J. W., and T. Mroczek. 2015. A comparison of two approaches to discretization: Multiple scanning and C4.5. In International Conference on Pattern Recognition and Machine Intelligence, 44–53. Springer International Publishing.
  • Hothorn, T., K. Hornik, and A. Zeileis. 2006. Unbiased recursive partitioning: A conditional inference framework. Journal of Computational and Graphical Statistics 15 (3):651–74.
  • Hssina, B., A. Merbouha, H. Ezzikouri, and M. Erritali. 2014. A comparative study of decision tree ID3 and C4.5. International Journal of Advanced Computer Science and Applications 4 (2).
  • Ibarguren, I., J. M. Pérez, and J. Muguerza. 2015. CTCHAID: Extending the application of the consolidation methodology. In Portuguese Conference on Artificial Intelligence, 572–77. Springer International Publishing.
  • Kass, G. V. 1980. An exploratory technique for investigating large quantities of categorical data. Applied Statistics 29 (2):119–27.
  • Kotsiantis, S. B. 2013. Decision trees: A recent overview. Artificial Intelligence Review 39 (4):261–83.
  • Lewis, M. 2012. Applied statistics for economists. Routledge.
  • Lim, T. S., W. Y. Loh, and Y. S. Shih. 2000. A comparison of prediction accuracy, complexity, and training time of thirty-three old and new classification algorithms. Machine Learning 40 (3):203–28.
  • Liu, H., and A. Gegov. 2016. Induction of modular classification rules by information entropy based rule generation. In Innovative Issues in Intelligent Systems, 217–30. Springer International Publishing.
  • Liu, H., and R. Setiono. 1995. Chi2: Feature selection and discretization of numeric attributes. In Proceedings of the Seventh International Conference on Tools with Artificial Intelligence, 388–91. IEEE.
  • Lu, Z., X. Wu, and J. C. Bongard. 2015. Active learning through adaptive heterogeneous ensembling. IEEE Transactions on Knowledge and Data Engineering 27 (2):368–81.
  • Mu, Y., X. Liu, Z. Yang, and X. Liu. 2017. A parallel C4.5 decision tree algorithm based on MapReduce. Concurrency and Computation: Practice and Experience 29 (8).
  • Ooi, S. Y., S. C. Tan, and W. P. Cheah. 2016. Temporal sampling forest: An ensemble temporal learner. Soft Computing, 1–14.
  • Pandya, R., and J. Pandya. 2015. C5.0 algorithm to improved decision tree with feature selection and reduced error pruning. International Journal of Computer Applications 117 (16).
  • Patel, N., and D. Singh. 2015. An algorithm to construct decision tree for machine learning based on similarity factor. International Journal of Computer Applications 111 (10).
  • Perner, P. 2015. Decision tree induction methods and their application to big data. In Modeling and Processing for Next-Generation Big-Data Technologies, 57–88. Springer International Publishing.
  • Quinlan, J. R. 1986. Induction of decision trees. Machine Learning 1:81–106.
  • Quinlan, J. R. 1996. Improved use of continuous attributes in C4.5. Journal of Artificial Intelligence Research 4:77–90.
  • Quinlan, J. R. 2014. C4.5: Programs for machine learning. Elsevier.
  • Reimann, C., P. Filzmoser, and R. G. Garrett. 2005. Background and threshold: Critical comparison of methods of determination. Science of the Total Environment 346 (1):1–16.
  • Rubin, A. 2012. Statistics for evidence-based practice and evaluation. Cengage Learning.
  • Saqib, F., A. Dutta, J. Plusquellic, P. Ortiz, and M. S. Pattichis. 2015. Pipelined decision tree classification accelerator implementation in FPGA (DT-CAIF). IEEE Transactions on Computers 64 (1):280–85.
  • Sharma, J. K. 2012. Business statistics. Pearson Education India.
  • Sugiyama, M., and K. M. Borgwardt. 2017. Significant Pattern Mining on Continuous Variables. arXiv preprint arXiv:1702.08694.
  • Sumam, M. I., E. M. Sudheep, and A. Joseph. 2013. A novel decision tree algorithm for numeric datasets: C4.5*Stat.
  • UCI Machine Learning Repository. 2015. https://archive.ics.uci.edu/ml/datasets.html.
  • Wang, R., S. Kwong, X. Z. Wang, and Q. Jiang. 2015. Segment based decision tree induction with continuous valued attributes. IEEE Transactions on Cybernetics 45 (7):1262–75.
  • Wilcoxon, F. 1992. Individual comparisons by ranking methods. In Breakthroughs in statistics, 196–202. Springer New York.
  • Witten, I. H., E. Frank, M. A. Hall, and C. J. Pal. 2016. Data Mining: Practical machine learning tools and techniques. Morgan Kaufmann.
  • Wu, X., V. Kumar, J. R. Quinlan, J. Ghosh, Q. Yang, H. Motoda, … Z. H. Zhou. 2008. Top 10 algorithms in data mining. Knowledge and Information Systems 14 (1):1–37.
  • Yang, Y., and W. Chen. 2016. Taiga: Performance optimization of the C4.5 decision tree construction algorithm. Tsinghua Science and Technology 21 (4):415–25.
  • Zhu, H., J. Zhai, S. Wang, and X. Wang. 2014. Monotonic decision tree for interval valued data. In International Conference on Machine Learning and Cybernetics, 231–40. Springer Berlin Heidelberg.
