
A new and fast rival genetic algorithm for feature selection

The Journal of Supercomputing

Abstract

Feature selection is one of the significant steps in classification tasks. It is a pre-processing step that selects a small subset of significant features that can contribute the most to the classification process. Recently, many metaheuristic optimization algorithms have been successfully applied to feature selection. The genetic algorithm (GA), as a fundamental optimization tool, has been widely used in feature selection tasks. However, GA suffers from hyperparameter sensitivity, high computational complexity, and the randomness of its selection operation. Therefore, we propose a new rival genetic algorithm, as well as a fast version of the rival genetic algorithm, to enhance the performance of GA in feature selection. The proposed approaches utilize a competition strategy that combines new selection and crossover schemes, which aims to improve the global search capability. Moreover, a dynamic mutation rate is proposed to enhance the search behaviour of the algorithm in the mutation process. The proposed approaches are validated on 23 benchmark datasets collected from the UCI machine learning repository and Arizona State University. In comparison with other competitors, the proposed approaches provide highly competitive results and outperform other algorithms in feature selection.
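The exact selection and crossover schemes of the rival genetic algorithm are defined in the full paper; the following is only a minimal Python sketch of the general idea described in the abstract, namely that individuals compete pairwise, winners survive, losers are replaced by mutated offspring of the winner and loser, and the mutation rate decays as the run progresses. All names here (fitness, rival_step, the linear decay schedule, the toy objective) are illustrative assumptions, not the authors' implementation; a real wrapper fitness would evaluate a classifier such as KNN on the selected feature subset.

```python
# Illustrative sketch (not the authors' exact RGA): pairwise "rival" competition,
# uniform crossover between each winner and loser, and a dynamic mutation rate
# that decays over generations. The fitness below is a toy stand-in for the
# usual wrapper fitness (e.g. classification error plus a feature-ratio penalty).
import random

N_FEATURES = 20
POP_SIZE = 20          # must be even so the population can be paired
MAX_GEN = 50
RELEVANT = set(range(5))   # hypothetical "ground-truth" useful features

def fitness(chrom):
    """Toy objective: reward selecting relevant features, penalise subset size."""
    selected = {i for i, bit in enumerate(chrom) if bit}
    return len(selected & RELEVANT) - 0.05 * len(selected)   # higher is better

def dynamic_mutation_rate(gen, max_gen, mr_max=0.2, mr_min=0.01):
    """Linearly decay the mutation rate from mr_max to mr_min over the run."""
    return mr_max - (mr_max - mr_min) * gen / max_gen

def mutate(chrom, rate):
    """Flip each bit independently with the given probability."""
    return [1 - b if random.random() < rate else b for b in chrom]

def crossover(parent_a, parent_b):
    """Uniform crossover producing a single offspring."""
    return [a if random.random() < 0.5 else b for a, b in zip(parent_a, parent_b)]

def rival_step(pop, gen):
    """One generation: pair rivals, keep winners, replace losers by offspring."""
    random.shuffle(pop)
    rate = dynamic_mutation_rate(gen, MAX_GEN)
    next_pop = []
    for a, b in zip(pop[0::2], pop[1::2]):
        winner, loser = (a, b) if fitness(a) >= fitness(b) else (b, a)
        child = mutate(crossover(winner, loser), rate)
        next_pop.extend([winner, child])
    return next_pop

random.seed(0)
population = [[random.randint(0, 1) for _ in range(N_FEATURES)]
              for _ in range(POP_SIZE)]
for g in range(MAX_GEN):
    population = rival_step(population, g)
best = max(population, key=fitness)
print("best subset:", [i for i, bit in enumerate(best) if bit])
```

Keeping the winner of every pair unchanged gives an implicit form of elitism, while the decaying mutation rate shifts the search from exploration toward exploitation, a common intuition behind dynamic mutation rates in GA-based feature selection.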



Acknowledgements

The authors would like to thank the Skim Zamalah UTeM for supporting this research.

Author information


Contributions

JT contributed to conceptualization, methodology, formal analysis and investigation, writing of the original draft, and resources; JT and ARA were involved in writing, review, and editing; ARA provided supervision.

Corresponding author

Correspondence to Jingwei Too.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Ethical approval

This article does not contain any studies with human participants or animals performed by any of the authors.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Too, J., Abdullah, A.R. A new and fast rival genetic algorithm for feature selection. J Supercomput 77, 2844–2874 (2021). https://doi.org/10.1007/s11227-020-03378-9

