Feature Selection via Pareto Multi-objective Genetic Algorithms

References

  • Aha, D., and D. Kibler. 1991. Instance-based learning algorithms. Machine Learning 6:37–66. doi:10.1007/BF00153759.
  • Arauzo-Azofra, A., J. M. Benitez, and J. L. Castro. 2008. Consistency measures for feature selection. Journal of Intelligent Information Systems 30 (3):273–92. doi:10.1007/s10844-007-0037-0.
  • Asuncion, A., and D. J. Newman. 2007. UCI machine learning repository. University of California, Irvine, School of Information and Computer Sciences. Accessed June 1, 2010. http://www.ics.uci.edu/~mlearn/MLRepository.html.
  • Bache, K., and M. Lichman. 2013. UCI machine learning repository. University of California, Irvine, School of Information and Computer Sciences. Accessed June 1, 2014. http://archive.ics.uci.edu/ml.
  • Batista, G. E. A. P. A., X. Wang, and E. J. Keogh. 2011. A complexity-invariant distance measure for time series. In SIAM International Conference on Data Mining, 699–710. Mesa, United States: SIAM.
  • Bleuler, S., M. Laumanns, L. Thiele, and E. Zitzler. 2003. PISA — A platform and programming language independent interface for search algorithms. In Evolutionary multi-criterion optimization, ed. C. M. Fonseca, P. J. Fleming, E. Zitzler, L. Thiele, and K. Deb, 494–508. Berlin: Springer Berlin Heidelberg.
  • Deb, K., S. Agrawal, A. Pratap, and T. Meyarivan. 2000. A fast elitist nondominated sorting genetic algorithm for multi-objective optimization: NSGA-II. In Parallel problem solving from nature, ed. M. Schoenauer, K. Deb, G. Rudolph, X. Yao, E. Lutton, J. Merelo, and H. Schwefel, 849–58. Berlin: Springer Berlin Heidelberg.
  • Dwork, C., R. Kumar, M. Naor, and D. Sivakumar. 2001. Rank aggregation methods for the Web. In International Conference on World Wide Web, 613–22. Hong Kong, China: ACM.
  • Freitas, A. A. 2004. A critical review of multi-objective optimization in data mining: A position paper. SIGKDD Explorations Newsletter 6 (2):77–86. doi:10.1145/1046456.1046467.
  • Guyon, I., and A. Elisseeff. 2003. An introduction to variable and feature selection. Journal of Machine Learning Research 3:1157–82.
  • Hall, M. A. 1999. Correlation-based feature selection for machine learning. PhD thesis, University of Waikato, New Zealand.
  • Hall, M. A. 2000. Correlation-based feature selection for discrete and numeric class machine learning. In International Conference on Machine Learning, 359–66. Stanford: Morgan Kaufmann.
  • Han, J., and M. Kamber. 2011. Data mining: Concepts and techniques, 3rd ed. San Francisco: Morgan Kaufmann.
  • He, X., D. Cai, and P. Niyogi. 2005. Laplacian score for feature selection. In Advances in Neural Information Processing Systems, 507–14. Cambridge, United States: MIT Press.
  • Kitchenham, B. A., and S. Charters. 2007. Guidelines for performing systematic literature reviews in software engineering. Technical report. Evidence-based Software Engineering, United Kingdom.
  • Kohavi, R., and G. H. John. 1997. Wrappers for feature subset selection. Artificial Intelligence 97 (1–2):273–324. doi:10.1016/S0004-3702(97)00043-X.
  • Lee, H. D., M. C. Monard, and F. C. Wu. 2006. A simple evaluation model for feature subset selection algorithms. Inteligência Artificial 10 (32):9–17.
  • Liu, C., and H. Shum. 2003. Kullback-Leibler boosting. In Computer Vision and Pattern Recognition, 587–94. Madison, United States: IEEE.
  • Liu, H., and H. Motoda. 1998. Feature selection for knowledge discovery and data mining. Norwell: Kluwer Academic Publishers.
  • Liu, H., and H. Motoda. 2007. Computational methods of feature selection. Boca Raton: Chapman & Hall/CRC.
  • Liu, H., and R. Setiono. 1996. A probabilistic approach to feature selection - a filter solution. In International Conference on Machine Learning, 319–27. Bari, Italy: Morgan Kaufmann.
  • Liu, H., and L. Yu. 2002. Feature selection for data mining. Arizona State University, Ira A. Fulton Schools of Engineering. Accessed June 1, 2010. http://www.public.asu.edu/~huanliu/sur-fs02.ps.
  • McCallum, A., and K. Nigam. 1998. A comparison of event models for naïve Bayes text classification. In Workshop on Learning for Text Categorization, 41–48. Madison, United States: AAAI Press.
  • Mitra, P., C. A. Murthy, and S. K. Pal. 2002. Unsupervised feature selection using feature similarity. IEEE Transactions on Pattern Analysis and Machine Intelligence 24 (3):301–12. doi:10.1109/34.990133.
  • Nahook, H. N., and M. Eftekhari. 2013. A feature selection method based on ∩-fuzzy similarity measures using multi objective genetic algorithm. International Journal of Soft Computing and Engineering 3 (2):37–41.
  • Prati, R. C. 2012. Combining feature ranking algorithms through rank aggregation. In International Joint Conference on Neural Networks, 1–8. Brisbane, Australia: IEEE.
  • Press, W. H., S. A. Teukolsky, W. T. Vetterling, and B. P. Flannery. 1992. Numerical recipes in C: The art of scientific computing. Cambridge: Cambridge University Press.
  • Robnik-Sikonja, M., and I. Kononenko. 2003. Theoretical and empirical analysis of ReliefF and RReliefF. Machine Learning 53 (1–2):23–69. doi:10.1023/A:1025667309714.
  • Santana, L. E. A., L. Silva, and A. M. P. Canuto. 2009. Feature selection in heterogeneous structure of ensembles: A genetic algorithm approach. In International Joint Conference on Neural Networks, 1491–98. Atlanta, United States: IEEE.
  • Saroj, J. 2014. Multi-objective genetic algorithm approach to feature subset optimization. In IEEE International Advance Computing Conference, 544–48. Gurgaon, India: IEEE.
  • Schölkopf, B., and A. J. Smola. 2001. Learning with kernels: Support vector machines, regularization, optimization, and beyond. Cambridge, United States: MIT Press.
  • Spolaôr, N., A. C. Lorena, and H. D. Lee. 2010a. A systematic review of applications of multiobjective metaheuristics in feature selection (in Portuguese). Technical report, Federal University of ABC, Brazil. Note: an English version of the technical report can be requested from the authors.
  • Spolaôr, N., A. C. Lorena, and H. D. Lee. 2010b. Use of multiobjective genetic algorithms in feature selection. In IEEE Brazilian Symposium on Artificial Neural Network, 146–51. São Bernardo do Campo, Brazil: IEEE.
  • Spolaôr, N., A. C. Lorena, and H. D. Lee. 2011a. Multi-objective genetic algorithm evaluation in feature selection. In Evolutionary multi-criterion optimization, ed R. Takahashi, K. Deb, E. Wanner, and S. Greco, 462–76. Berlin: Springer Berlin Heidelberg.
  • Spolaôr, N., A. C. Lorena, and H. D. Lee. 2011b. Multiobjective genetic algorithms for feature selection (in Portuguese). In Encontro Nacional de Inteligência Artificial, 938–49. Natal, Brazil: Brazilian Computer Society.
  • Wang, C., and Y. Huang. 2009. Evolutionary-based feature selection approaches with new criteria for data mining: A case study of credit approval data. Expert Systems with Applications 36 (3):5900–08. doi:10.1016/j.eswa.2008.07.026.
  • Wilson, D. R., and T. R. Martinez. 1997. Improved heterogeneous distance functions. Journal of Artificial Intelligence Research 6:1–34.
  • Witten, I. H., and E. Frank. 2011. Data mining: Practical machine learning tools and techniques, 3rd ed. San Francisco: Morgan Kaufmann.
  • Xue, B., L. Cervante, L. Shang, W. N. Browne, and M. Zhang. 2013. Multi objective evolutionary algorithms for filter based feature selection in classification. International Journal on Artificial Intelligence Tools 22 (4):1350024-1–1350024-31. doi:10.1142/S0218213013500243.
  • Yang, J., and V. Honavar. 1998. Feature subset selection using a genetic algorithm. IEEE Intelligent Systems and Their Applications 13 (2):44–49. doi:10.1109/5254.671091.
  • Zaharie, D., S. Holban, D. Lungeanu, and D. Navolan. 2007. A computational intelligence approach for ranking risk factors in preterm birth. In International Symposium on Applied Computational Intelligence and Informatics, 135–40. Timisoara, Romania: IEEE.
  • Zeleny, M. 1973. An introduction to multiobjective optimization. In Multiple criteria decision making, ed. J. L. Cochrane and M. Zeleny, 262–301. Columbia, United States: University of South Carolina Press.
