
A conditional opposition-based particle swarm optimisation for feature selection

Pages 339-361 | Received 08 Jul 2020, Accepted 29 Oct 2021, Published online: 22 Nov 2021

References

  • Abd Elaziz, M., Oliva, D., & Xiong, S. (2017). An improved opposition-based sine cosine algorithm for global optimization. Expert Systems with Applications, 90, 484–500. https://doi.org/10.1016/j.eswa.2017.07.043
  • AbdEl-Fattah Sayed, S., Nabil, E., & Badr, A. (2016). A binary clonal flower pollination algorithm for feature selection. Pattern Recognition Letters, 77, 21–27. https://doi.org/10.1016/j.patrec.2016.03.014
  • Aghdam, M. H., Ghasem-Aghaee, N., & Basiri, M. E. (2009). Text feature selection using ant colony optimization. Expert Systems with Applications, 36(3, Part 2), 6843–6853. https://doi.org/10.1016/j.eswa.2008.08.022
  • Agrawal, P., Abutarboush, H. F., Ganesh, T., & Mohamed, A. W. (2021). Metaheuristic algorithms on feature selection: A survey of one decade of research (2009–2019). IEEE Access, 9, 26766–26791. https://doi.org/10.1109/ACCESS.2021.3056407
  • Aladeemy, M., Adwan, L., Booth, A., Khasawneh, M. T., & Poranki, S. (2020). New feature selection methods based on opposition-based learning and self-adaptive cohort intelligence for predicting patient no-shows. Applied Soft Computing, 86, 105866/1-19. https://doi.org/10.1016/j.asoc.2019.105866
  • Cheng, R., & Jin, Y. (2015a). A social learning particle swarm optimization algorithm for scalable optimization. Information Sciences, 291, 43–60. https://doi.org/10.1016/j.ins.2014.08.039
  • Cheng, R., & Jin, Y. (2015b). A competitive swarm optimizer for large scale optimization. IEEE Transactions on Cybernetics, 45(2), 191–204. https://doi.org/10.1109/TCYB.2014.2322602
  • Chuang, L.-Y., Chang, H.-W., Tu, C.-J., & Yang, C.-H. (2008). Improved binary PSO for feature selection using gene expression data. Computational Biology and Chemistry, 32(1), 29–38. https://doi.org/10.1016/j.compbiolchem.2007.09.005
  • Datasets | Feature Selection @ ASU. (n.d.). Retrieved October 3, 2019, from http://featureselection.asu.edu/datasets.php
  • Emary, E., Zawbaa, H. M., & Hassanien, A. E. (2016). Binary grey wolf optimization approaches for feature selection. Neurocomputing, 172, 371–381. https://doi.org/10.1016/j.neucom.2015.06.083
  • Ewees, A. A., Abd Elaziz, M., & Houssein, E. H. (2018). Improved grasshopper optimization algorithm using opposition-based learning. Expert Systems with Applications, 112, 156–172. https://doi.org/10.1016/j.eswa.2018.06.023
  • Faris, H., Heidari, A. A., Al-Zoubi, A. M., Mafarja, M., Aljarah, I., Eshtay, M., & Mirjalili, S. (2020). Time-varying hierarchical chains of salps with random weight networks for feature selection. Expert Systems with Applications, 140, 112898/1-17. https://doi.org/10.1016/j.eswa.2019.112898
  • Gokalp, O., Tasci, E., & Ugur, A. (2020). A novel wrapper feature selection algorithm based on iterated greedy metaheuristic for sentiment classification. Expert Systems with Applications, 146, 113176/1-10. https://doi.org/10.1016/j.eswa.2020.113176
  • Gou, J., Lei, Y.-X., Guo, W.-P., Wang, C., Cai, Y.-Q., & Luo, W. (2017). A novel improved particle swarm optimization algorithm based on individual difference evolution. Applied Soft Computing, 57, 468–481. https://doi.org/10.1016/j.asoc.2017.04.025
  • Gunasundari, S., Janakiraman, S., & Meenambal, S. (2016). Velocity bounded boolean particle swarm optimization for improved feature selection in liver and kidney disease diagnosis. Expert Systems with Applications, 56, 28–47. https://doi.org/10.1016/j.eswa.2016.02.042
  • Hancer, E. (2019). Fuzzy kernel feature selection with multi-objective differential evolution algorithm. Connection Science, 31(4), 323–341. https://doi.org/10.1080/09540091.2019.1639624
  • Holland, J. H. (1992). Genetic algorithms. Scientific American, 267(1), 66–72. https://doi.org/10.1038/scientificamerican0792-66
  • Huang, C.-L., & Wang, C.-J. (2006). A GA-based feature selection and parameters optimization for support vector machines. Expert Systems with Applications, 31(2), 231–240. https://doi.org/10.1016/j.eswa.2005.09.024
  • Ibrahim, R. A., Abd Elaziz, M., Ewees, A. A., El-Abd, M., & Lu, S. (2021). New feature selection paradigm based on hyper-heuristic technique. Applied Mathematical Modelling, 98, 14–37. https://doi.org/10.1016/j.apm.2021.04.018
  • Jabeen, H., Jalil, Z., & Baig, A. R. (2009). Opposition based initialization in particle swarm optimization (O-PSO). In Proceedings of the 11th Annual Conference Companion on Genetic and Evolutionary Computation Conference: Late Breaking Papers (pp. 2047–2052). https://doi.org/10.1145/1570256.1570274
  • Jensi, R., & Jiji, G. W. (2016). An enhanced particle swarm optimization with levy flight for global optimization. Applied Soft Computing, 43, 248–261. https://doi.org/10.1016/j.asoc.2016.02.018
  • Ji, B., Lu, X., Sun, G., Zhang, W., Li, J., & Xiao, Y. (2020). Bio-inspired feature selection: An improved binary particle swarm optimization approach. IEEE Access, 8, 85989–86002. https://doi.org/10.1109/ACCESS.2020.2992752
  • Jiao, B., Lian, Z., & Gu, X. (2008). A dynamic inertia weight particle swarm optimization algorithm. Chaos, Solitons & Fractals, 37(3), 698–705. https://doi.org/10.1016/j.chaos.2006.09.063
  • Jude Hemanth, D., & Anitha, J. (2019). Modified genetic algorithm approaches for classification of abnormal magnetic resonance brain tumour images. Applied Soft Computing, 75, 21–28. https://doi.org/10.1016/j.asoc.2018.10.054
  • Kang, Q., Xiong, C., Zhou, M., & Meng, L. (2018). Opposition-based hybrid strategy for particle swarm optimization in noisy environments. IEEE Access, 6, 21888–21900. https://doi.org/10.1109/ACCESS.2018.2809457
  • Karaboga, D., & Basturk, B. (2007). A powerful and efficient algorithm for numerical function optimization: Artificial bee colony (ABC) algorithm. Journal of Global Optimization, 39(3), 459–471. https://doi.org/10.1007/s10898-007-9149-x
  • Kashef, S., & Nezamabadi-pour, H. (2015). An advanced ACO algorithm for feature subset selection. Neurocomputing, 147, 271–279. https://doi.org/10.1016/j.neucom.2014.06.067
  • Kennedy, J. (2011). Particle swarm optimization. In Claude Sammut & Geoffrey I. Webb (Eds.), Encyclopedia of machine learning (pp. 760–766). Springer. https://doi.org/10.1007/978-0-387-30164-8_630
  • Kennedy, J., & Eberhart, R. C. (1997). A discrete binary version of the particle swarm algorithm. In 1997 IEEE International Conference on Systems, Man, and Cybernetics: Computational Cybernetics and Simulation (Vol. 5, pp. 4104–4108). https://doi.org/10.1109/ICSMC.1997.637339
  • Kumar, A., Kumar, Y., & Kukkar, A. (2020). A feature selection model for prediction of software defects. International Journal of Embedded Systems, 13(1), 28–39. https://doi.org/10.1504/IJES.2020.108279
  • Lin, K.-C., Hung, J. C., & Wei, J. (2018). Feature selection with modified lion’s algorithms and support vector machine for high-dimensional data. Applied Soft Computing, 68, 669–676. https://doi.org/10.1016/j.asoc.2018.01.011
  • Lu, Y., Liang, M., Ye, Z., & Cao, L. (2015). Improved particle swarm optimization algorithm and its application in text feature selection. Applied Soft Computing, 35, 629–636. https://doi.org/10.1016/j.asoc.2015.07.005
  • Mafarja, M. M., & Mirjalili, S. (2017). Hybrid whale optimization algorithm with simulated annealing for feature selection. Neurocomputing, 260, 302–312. https://doi.org/10.1016/j.neucom.2017.04.053
  • Mazaheri, V., & Khodadadi, H. (2020). Heart arrhythmia diagnosis based on the combination of morphological, frequency and nonlinear features of ECG signals and metaheuristic feature selection algorithm. Expert Systems with Applications, 161, 113697/1-14. https://doi.org/10.1016/j.eswa.2020.113697
  • Mirjalili, S. (2015). Moth-flame optimization algorithm: A novel nature-inspired heuristic paradigm. Knowledge-Based Systems, 89, 228–249. https://doi.org/10.1016/j.knosys.2015.07.006
  • Mirjalili, S., Gandomi, A. H., Mirjalili, S. Z., Saremi, S., Faris, H., & Mirjalili, S. M. (2017). Salp swarm algorithm: A bio-inspired optimizer for engineering design problems. Advances in Engineering Software, 114, 163–191. https://doi.org/10.1016/j.advengsoft.2017.07.002
  • Mirjalili, S., Mirjalili, S. M., & Hatamlou, A. (2016). Multi-verse optimizer: A nature-inspired algorithm for global optimization. Neural Computing and Applications, 27(2), 495–513. https://doi.org/10.1007/s00521-015-1870-7
  • Mirjalili, S., Mirjalili, S. M., & Lewis, A. (2014). Grey wolf optimizer. Advances in Engineering Software, 69, 46–61. https://doi.org/10.1016/j.advengsoft.2013.12.007
  • Nagra, A. A., Han, F., Ling, Q. H., Abubaker, M., Ahmad, F., Mehta, S., & Apasiba, A. T. (2020). Hybrid self-inertia weight adaptive particle swarm optimisation with local search using C4.5 decision tree classifier for feature selection problems. Connection Science, 32(1), 16–36. https://doi.org/10.1080/09540091.2019.1609419
  • Nayak, S. K., Rout, P. K., Jagadev, A. K., & Swarnkar, T. (2018). Elitism-based multi-objective differential evolution with extreme learning machine for feature selection: A novel searching technique. Connection Science, 30(4), 362–387. https://doi.org/10.1080/09540091.2018.1487384
  • Nemati, S., Basiri, M. E., Ghasem-Aghaee, N., & Aghdam, M. H. (2009). A novel ACO–GA hybrid algorithm for feature selection in protein function prediction. Expert Systems with Applications, 36(10), 12086–12094. https://doi.org/10.1016/j.eswa.2009.04.023
  • Nguyen, B. H., Xue, B., & Zhang, M. (2020). A survey on swarm intelligence approaches to feature selection in data mining. Swarm and Evolutionary Computation, 54, 100663/1-16. https://doi.org/10.1016/j.swevo.2020.100663
  • Nguyen, T.-K., Ly, V. D., & Hwang, S. O. (2020). Effective feature selection based on MANOVA. International Journal of Internet Technology and Secured Transactions, 10(4), 383–395. https://doi.org/10.1504/IJITST.2020.108133
  • Ouadfel, S., & Abd Elaziz, M. (2020). Enhanced crow search algorithm for feature selection. Expert Systems with Applications, 159, 113572/1-16. https://doi.org/10.1016/j.eswa.2020.113572
  • Rashedi, E., Nezamabadi-pour, H., & Saryazdi, S. (2009). GSA: A gravitational search algorithm. Information Sciences, 179(13), 2232–2248. https://doi.org/10.1016/j.ins.2009.03.004
  • Rashno, A., Nazari, B., Sadri, S., & Saraee, M. (2017). Effective pixel classification of Mars images based on ant colony optimization feature selection and extreme learning machine. Neurocomputing, 226, 66–79. https://doi.org/10.1016/j.neucom.2016.11.030
  • Siedlecki, W., & Sklansky, J. (1989). A note on genetic algorithms for large-scale feature selection. Pattern Recognition Letters, 10(5), 335–347. https://doi.org/10.1016/0167-8655(89)90037-8
  • Sreeja, N. K. (2019). A weighted pattern matching approach for classification of imbalanced data with a fireworks-based algorithm for feature selection. Connection Science, 31(2), 143–168. https://doi.org/10.1080/09540091.2018.1512558
  • Too, J., & Abdullah, A. R. (2020a). Binary atom search optimisation approaches for feature selection. Connection Science, 32(4), 406–430. https://doi.org/10.1080/09540091.2020.1741515
  • Too, J., & Abdullah, A. R. (2020b). A new and fast rival genetic algorithm for feature selection. The Journal of Supercomputing, 1–31. https://doi.org/10.1007/s11227-020-03378-9
  • Too, J., Abdullah, A. R., Mohd Saad, N., & Tee, W. (2019). EMG feature selection and classification using a pbest-guide binary particle swarm optimization. Computation, 7(1), 12/1-20. https://doi.org/10.3390/computation7010012
  • Tubishat, M., Idris, N., Shuib, L., Abushariah, M. A. M., & Mirjalili, S. (2020). Improved salp swarm algorithm based on opposition based learning and novel local search algorithm for feature selection. Expert Systems with Applications, 145, 113122/1-10. https://doi.org/10.1016/j.eswa.2019.113122
  • UCI Machine Learning Repository. (n.d.). Retrieved March 24, 2019, from https://archive.ics.uci.edu/ml/index.php
  • Wolpert, D. H., & Macready, W. G. (1997). No free lunch theorems for optimization. IEEE Transactions on Evolutionary Computation, 1(1), 67–82. https://doi.org/10.1109/4235.585893
  • Xia, X., Xing, Y., Wei, B., Zhang, Y., Li, X., Deng, X., & Gui, L. (2019). A fitness-based multi-role particle swarm optimization. Swarm and Evolutionary Computation, 44, 349–364. https://doi.org/10.1016/j.swevo.2018.04.006
  • Xue, B., Zhang, M., & Browne, W. N. (2014). Particle swarm optimisation for feature selection in classification: Novel initialisation and updating mechanisms. Applied Soft Computing, 18, 261–276. https://doi.org/10.1016/j.asoc.2013.09.018
  • Yapici, H., & Cetinkaya, N. (2019). A new meta-heuristic optimizer: Pathfinder algorithm. Applied Soft Computing, 78, 545–568. https://doi.org/10.1016/j.asoc.2019.03.012
  • Zhang, Y., Jin, Z., & Mirjalili, S. (2020). Generalized normal distribution optimization and its applications in parameter extraction of photovoltaic models. Energy Conversion and Management, 224, 113301. https://doi.org/10.1016/j.enconman.2020.113301
  • Zhang, Y.-D., Zhang, Y., Lv, Y.-D., Hou, X.-X., Liu, F.-Y., Jia, W.-J., Yang, M.-M., Phillips, P., & Wang, S.-H. (2017). Alcoholism detection by medical robots based on Hu moment invariants and predator–prey adaptive-inertia chaotic particle swarm optimization. Computers & Electrical Engineering, 63, 126–138. https://doi.org/10.1016/j.compeleceng.2017.04.009