References
- V. Aksakalli and M. Malekipirbazari, Feature selection via binary simultaneous perturbation stochastic approximation, Pattern Recognit. Lett. 75 (2016), pp. 41–47. doi: 10.1016/j.patrec.2016.03.002
- J. Cai, J. Luo, S. Wang, and S. Yang, Feature selection in machine learning: A new perspective, Neurocomputing 300 (2018), pp. 70–79. doi: 10.1016/j.neucom.2017.11.077
- H. Cui, R. Li, and W. Zhong, Model-free feature screening for ultrahigh dimensional discriminant analysis, J. Am. Stat. Assoc. 110 (2015), pp. 630–641. doi: 10.1080/01621459.2014.920256
- P.A. Estévez, M. Tesmer, C.A. Perez, and J.M. Zurada, Normalized mutual information feature selection, IEEE Trans. Neural Networks 20 (2009), pp. 189–201. doi: 10.1109/TNN.2008.2005601
- W. Gao, L. Hu, and P. Zhang, Class-specific mutual information variation for feature selection, Pattern Recognit. 79 (2018), pp. 328–339. doi: 10.1016/j.patcog.2018.02.020
- S. García, J. Luengo, and F. Herrera, Data Preprocessing in Data Mining, Springer, Cham, 2015.
- S. Happy, R. Mohanty, and A. Routray, An effective feature selection method based on pair-wise feature proximity for high dimensional low sample size data, 2017 25th European Signal Processing Conference (EUSIPCO), IEEE, 2017, pp. 1574–1578.
- M.M. Kabir, M.M. Islam, and K. Murase, A new wrapper feature selection approach using neural network, Neurocomputing 73 (2010), pp. 3273–3283. doi: 10.1016/j.neucom.2010.04.003
- M.R. Kosorok, Correction: Discussion of Brownian distance covariance, Ann. Appl. Stat. 7 (2013), p. 1247. doi: 10.1214/13-AOAS636
- Y. Li, T. Li, and H. Liu, Recent advances in feature selection and its applications, Knowl. Inf. Syst. 53 (2017), pp. 551–577. doi: 10.1007/s10115-017-1059-8
- R. Li, W. Zhong, and L. Zhu, Feature screening via distance correlation learning, J. Am. Stat. Assoc. 107 (2012), pp. 1129–1139. doi: 10.1080/01621459.2012.695654
- H. Peng, F. Long, and C. Ding, Feature selection based on mutual information criteria of max-dependency, max-relevance, and min-redundancy, IEEE Trans. Pattern Anal. Mach. Intell. 27 (2005), pp. 1226–1238. doi: 10.1109/TPAMI.2005.159
- M.L. Rizzo and G.J. Székely, Energy distance, Wiley Interdisc. Rev. Comput. Stat. 8 (2016), pp. 27–38. doi: 10.1002/wics.1375
- W. Sheng and X. Yin, Sufficient dimension reduction via distance covariance, J. Comput. Graph. Stat. 25 (2016), pp. 91–104. doi: 10.1080/10618600.2015.1026601
- G.J. Székely, M.L. Rizzo, and N.K. Bakirov, Measuring and testing dependence by correlation of distances, Ann. Stat. 35 (2007), pp. 2769–2794. doi: 10.1214/009053607000000505
- R. Tibshirani, Regression shrinkage and selection via the lasso: a retrospective, J. Royal Stat. Soc. B Stat. Methodol. 73 (2011), pp. 273–282. doi: 10.1111/j.1467-9868.2011.00771.x
- P. Vepakomma, C. Tonde, and A. Elgammal, Supervised dimensionality reduction via distance correlation maximization, Electron. J. Stat. 12 (2018), pp. 960–984. doi: 10.1214/18-EJS1403
- J.R. Vergara and P.A. Estévez, A review of feature selection methods based on mutual information, Neural Comput. Appl. 24 (2014), pp. 175–186. doi: 10.1007/s00521-013-1368-0
- D. Wang, Z. Zhang, R. Bai, and Y. Mao, A hybrid system with filter approach and multiple population genetic algorithm for feature selection in credit scoring, J. Comput. Appl. Math. 329 (2018), pp. 307–321. doi: 10.1016/j.cam.2017.04.036
- C.D. Yenigün and M.L. Rizzo, Variable selection in regression using maximal correlation and distance correlation, J. Stat. Comput. Simul. 85 (2015), pp. 1692–1705. doi: 10.1080/00949655.2014.895354
- L. Yu and H. Liu, Efficient feature selection via analysis of relevance and redundancy, J. Mach. Learn. Res. 5 (2004), pp. 1205–1224.
- J. Zhao, L. Chen, W. Pedrycz, and W. Wang, Variational inference-based automatic relevance determination kernel for embedded feature selection of noisy industrial data, IEEE Trans. Ind. Electron. 66 (2018), pp. 416–428.