DOI: 10.1145/3520304.3528794

Towards optimizing neural networks' connectivity and architecture simultaneously with feature selection

Published: 19 July 2022

Abstract

We propose an extension of NeuroEvolution of Augmenting Topologies (NEAT), called Heterogeneous Activation Feature Deselection NEAT (HA-FD-NEAT), which evolves the weights and the topography (architecture and activation functions) of Artificial Neural Networks (ANNs) while performing feature selection. The algorithm is evaluated against its ancestors: NEAT, which evolves the weights and the topology; FD-NEAT, which evolves the weights and the topology while performing feature selection; and HA-NEAT, which evolves the weights and the topography. Performance is described by (i) median classification accuracy, (ii) computational efficiency (number of generations), (iii) network complexity (number of nodes and connections) and (iv) the ability to select the relevant features. On the artificial 2-out-100 XOR problem, used as a benchmark for feature selection, HA-FD-NEAT reaches 100% accuracy within a few generations. It is significantly better than NEAT and HA-NEAT and exhibits the same performance as FD-NEAT. Even though HA-FD-NEAT must search a larger space that includes weights, activation functions, topology and inputs, it achieves the same performance as FD-NEAT. In conclusion, the proposed method reduces the burden on human designers by determining the network's inputs, weights, topology and activation functions while achieving very good performance.



        Published In

        GECCO '22: Proceedings of the Genetic and Evolutionary Computation Conference Companion
July 2022, 2395 pages
ISBN: 9781450392686
DOI: 10.1145/3520304

        Publisher

        Association for Computing Machinery

        New York, NY, United States


        Author Tags

        1. activation function
        2. feature selection
        3. neuroevolution
        4. supervised learning
        5. transfer function

        Qualifiers

        • Poster

        Conference

        GECCO '22

        Acceptance Rates

        Overall Acceptance Rate 1,669 of 4,410 submissions, 38%
