An Evolutionary Approach to Feature Selection and Classification

  • Conference paper
  • First Online:
Machine Learning, Optimization, and Data Science (LOD 2023)

Abstract

The feature selection problem has become a key undertaking within machine learning. For classification problems, it is known to reduce the computational complexity of parameter estimation, and it also contributes to the explainability of the results. An evolution strategy for feature selection is proposed in this paper. Feature weights are evolved with decision trees that use the Nash equilibrium concept to split node data. Trees are maintained until the variation in the probabilities induced by the feature weights stagnates. Predictions are made based on the information provided by all the trees. Numerical experiments illustrate the performance of the approach compared with other classification methods.
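To make the evolutionary idea concrete, the sketch below shows a minimal (1+1) evolution strategy that evolves per-feature weights, scored by the accuracy of a weighted nearest-centroid classifier on synthetic data. This is an illustrative assumption, not the paper's method: the Nash-equilibrium tree splits and the multi-tree prediction scheme are not reproduced, and all names (`make_data`, `evolve_weights`, the toy fitness function) are hypothetical.

```python
# Hedged sketch: a (1+1) evolution strategy over per-feature weights.
# Fitness here is a stand-in (weighted nearest-centroid accuracy); the
# paper instead evolves weights with Nash-equilibrium decision trees.
import random

def make_data(n=200, seed=0):
    # Synthetic 2-class data: feature 0 is informative, features 1-2 are noise.
    rng = random.Random(seed)
    X, y = [], []
    for _ in range(n):
        label = rng.randint(0, 1)
        X.append([rng.gauss(2.0 * label, 1.0),  # informative
                  rng.gauss(0.0, 1.0),          # noise
                  rng.gauss(0.0, 1.0)])         # noise
        y.append(label)
    return X, y

def accuracy(weights, X, y):
    # Classify each point by the nearest class centroid under a
    # feature-weighted squared Euclidean distance.
    cents = {}
    for label in (0, 1):
        pts = [x for x, t in zip(X, y) if t == label]
        cents[label] = [sum(col) / len(pts) for col in zip(*pts)]
    correct = 0
    for x, t in zip(X, y):
        d = {l: sum(w * (a - b) ** 2 for w, a, b in zip(weights, x, c))
             for l, c in cents.items()}
        correct += (min(d, key=d.get) == t)
    return correct / len(y)

def evolve_weights(X, y, n_feat=3, gens=100, sigma=0.3, seed=1):
    # (1+1)-ES: mutate the parent with Gaussian noise, keep the child
    # if it is at least as fit; weights are clipped to [0, 1].
    rng = random.Random(seed)
    best = [rng.random() for _ in range(n_feat)]
    best_fit = accuracy(best, X, y)
    for _ in range(gens):
        child = [min(1.0, max(0.0, w + rng.gauss(0.0, sigma))) for w in best]
        fit = accuracy(child, X, y)
        if fit >= best_fit:
            best, best_fit = child, fit
    return best, best_fit

if __name__ == "__main__":
    X, y = make_data()
    weights, fit = evolve_weights(X, y)
    print(weights, fit)
```

Under these assumptions the strategy tends to upweight the informative feature, since higher weight on feature 0 sharpens the class separation that the fitness function measures.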


Notes

  1. Version 1.1.1.

  2. https://datahelpdesk.worldbank.org/knowledgebase/articles/378832-what-is-the-world-bank-atlas-method, last accessed January 2023.

  3. https://blogs.worldbank.org/opendata/new-world-bank-country-classifications-income-level-2022-2023, last accessed January 2023.


Acknowledgements

This work was supported by a grant of the Ministry of Research, Innovation and Digitization, CNCS - UEFISCDI, project number PN-III-P1-1.1-TE-2021-1374, within PNCDI III.

Author information
Corresponding author

Correspondence to Mihai-Alexandru Suciu.



Copyright information

© 2024 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Lung, R.I., Suciu, M.A. (2024). An Evolutionary Approach to Feature Selection and Classification. In: Nicosia, G., Ojha, V., La Malfa, E., La Malfa, G., Pardalos, P.M., Umeton, R. (eds) Machine Learning, Optimization, and Data Science. LOD 2023. Lecture Notes in Computer Science, vol 14505. Springer, Cham. https://doi.org/10.1007/978-3-031-53969-5_25

Download citation

  • DOI: https://doi.org/10.1007/978-3-031-53969-5_25

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-53968-8

  • Online ISBN: 978-3-031-53969-5

  • eBook Packages: Computer Science, Computer Science (R0)
