Feature Selection for Supervised Learning and Compression

Article: 2034293 | Received 27 May 2021, Accepted 18 Jan 2022, Published online: 06 Mar 2022
