REFERENCES
- M. Fernandez-Delgado, E. Cernadas, S. Barro, and D. Amorim, “Do we need hundreds of classifiers to solve real world classification problems?,” J. Mach. Learn. Res., Vol. 15, pp. 3133–81, Jan. 2014.
- O. Sagi, and L. Rokach, “Explainable decision forest: transforming a decision forest into an interpretable tree,” Inf. Fusion, Vol. 61, pp. 124–38, Sept. 2020.
- L. I. Kuncheva. Combining pattern classifiers: methods and algorithms. John Wiley & Sons, 2014.
- M. Wozniak, M. Grana, and E. Corchado, “A survey of multiple classifier systems as hybrid systems,” Inf. Fusion, Vol. 16, pp. 3–17, Mar. 2014.
- L. Rokach, “Decision forest: twenty years of research,” Inf. Fusion, Vol. 27, pp. 111–25, Jan. 2016.
- A. B. Shaikh, and S. Srinivasan, “A brief survey on random forest ensembles in classification model,” in International Conference on Innovative Computing and Communications, Springer, Singapore, 2019, pp. 253–60.
- A. Criminisi, J. Shotton, and E. Konukoglu, “Decision forests for classification, regression, density estimation, manifold learning and semi-supervised learning,” Microsoft Research Cambridge, Tech. Rep. MSR-TR-2011-114, Vol. 5, no. 6, pp. 12, Oct. 2011.
- T. Lan, H. Hu, C. Jiang, G. Yang, and Z. Zhao, “A comparative study of decision tree, random forest, and convolutional neural network for spread-F identification,” Adv. Space Res., Vol. 65, no. 8, pp. 2052–61, Apr. 2020.
- X. Xia, T. Lin, and Z. Chen, “Maximum relevancy maximum complementary based ordered aggregation for ensemble pruning,” Appl. Intell., Vol. 48, no. 9, pp. 2568–79, Sept. 2018.
- L. Breiman, “Bagging predictors,” Mach. Learn., Vol. 24, no. 2, pp. 123–40, Aug. 1996.
- T. H. Lee, A. Ullah, and R. Wang, “Bootstrap aggregating and random forest,” in Macroeconomic Forecasting in the Era of Big Data, Cham: Springer, 2020, pp. 389–429.
- A. J. Sage, U. Genschel, and D. Nettleton, “Tree aggregation for random forest class probability estimation,” Statistical Analysis and Data Mining: The ASA Data Science Journal, Vol. 13, no. 2, pp. 134–50, Apr. 2020.
- Y. Tian, and Y. Feng, “RaSE: random subspace ensemble classification,” J. Mach. Learn. Res., Vol. 22, no. 45, pp. 1–93, Jan. 2021.
- S. Talukdar, K. U. Eibek, S. Akhter, S. Ziaul, A. R. M. T. Islam, and J. Mallick, “Modeling fragmentation probability of land-use and land-cover using the bagging, random forest and random subspace in the Teesta River Basin, Bangladesh,” Ecol. Indic., Vol. 126, pp. 107612, Jul. 2021.
- J. K. Jaiswal, and R. Samikannu, “Application of random forest algorithm on feature subset selection and classification and regression,” in World Congress on Computing and Communication Technologies (WCCCT), IEEE, Feb. 2017, pp. 65–8.
- Y. Liu, and H. Wu, “Prediction of road traffic congestion based on random forest,” in 10th International Symposium on Computational Intelligence and Design (ISCID), Vol. 2, IEEE, Dec. 2017, pp. 361–4.
- A. Gonzalez-Vidal, F. Jimenez, and A. F. Gomez-Skarmeta, “A methodology for energy multivariate time series forecasting in smart buildings based on feature selection,” Energy Build., Vol. 196, pp. 71–82, Aug. 2019.
- R. Katuwal, P. N. Suganthan, and L. Zhang, “Heterogeneous oblique random forest,” Pattern Recognit., Vol. 99, pp. 107078, Mar. 2020.
- M. Kumar, M. K. Jindal, R. K. Sharma, and S. R. Jindal, “Performance evaluation of classifiers for the recognition of offline handwritten Gurmukhi characters and numerals: a study,” Artif. Intell. Rev., Vol. 53, no. 3, pp. 2075–97, Mar. 2020.
- S. Dargan, M. Kumar, M. Ayyagari, and G. Kumar, “A survey of deep learning and its applications: a new paradigm to machine learning,” Archives of Computational Methods in Engineering, Vol. 27, pp. 1071–92, Sept. 2020.
- I. R. Parray, S. S. Khurana, M. Kumar, and A. A. Altalbe, “Time series data analysis of stock price movement using machine learning techniques,” Soft Comput., Vol. 24, no. 21, pp. 16509–17, Apr. 2020.
- M. Waqar, H. Dawood, P. Guo, M. B. Shahnawaz, and M. A. Ghazanfar, “Prediction of stock market by principal component analysis,” in 13th International Conference on Computational Intelligence and Security (CIS), IEEE, Dec. 2017, pp. 599–602.
- R. M. Nabi, S. A. M. Saeed, and H. Harron, “A novel approach for stock price prediction using gradient boosting machine with feature engineering (GBM-wFE),” Kurdistan Journal of Applied Research, Vol. 5, no. 1, pp. 28–48, Apr. 2020.
- A. Nahil, and A. Lyhyaoui, “Short-term stock price forecasting using kernel principal component analysis and support vector machines: the case of Casablanca stock exchange,” Procedia Computer Science, Vol. 127, pp. 161–9, Jan. 2018.
- B. Weng, M. A. Ahmed, and F. M. Megahed, “Stock market one-day ahead movement prediction using disparate data sources,” Expert. Syst. Appl., Vol. 79, pp. 153–63, Aug. 2017.
- S. Choudhary, and S. Singhal, “International linkages of Indian equity market: evidence from panel co-integration approach,” Journal of Asset Management, Vol. 21, pp. 333–41, Jul. 2020.
- L. Khaidem, S. Saha, and S. R. Dey, “Predicting the direction of stock market prices using random forest,” arXiv preprint arXiv:1605.00003, pp. 1–20, 2016.