Machine learning methods based on probabilistic decision tree under the multi-valued preference environment

Pages 38-59 | Received 30 Oct 2020, Accepted 10 Jan 2021, Published online: 01 Feb 2021
