DOI: 10.1145/2576768.2598232

Evolving neural networks that are both modular and regular: HyperNEAT plus the connection cost technique

Published: 12 July 2014

Abstract

One of humanity's grand scientific challenges is to create artificially intelligent robots that rival natural animals in intelligence and agility. A key enabler of such animal complexity is the fact that animal brains are structurally organized in that they exhibit modularity and regularity, amongst other attributes. Modularity is the localization of function within an encapsulated unit. Regularity refers to the compressibility of the information describing a structure, and typically involves symmetries and repetition. These properties improve evolvability, but they rarely emerge in evolutionary algorithms without specific techniques to encourage them. It has been shown that (1) modularity can be evolved in neural networks by adding a cost for neural connections and, separately, (2) that the HyperNEAT algorithm produces neural networks with complex, functional regularities. In this paper we show that adding the connection cost technique to HyperNEAT produces neural networks that are significantly more modular, regular, and higher performing than HyperNEAT without a connection cost, even when compared to a variant of HyperNEAT that was specifically designed to encourage modularity. Our results represent a stepping stone towards the goal of producing artificial neural networks that share key organizational properties with the brains of natural animals.
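
As a rough, illustrative sketch of the two ideas the abstract combines (this is not the authors' implementation; helper names such as `evaluate_task` and the community partition are hypothetical): the connection cost technique adds a wiring-cost objective that is minimized alongside task performance in a multi-objective evolutionary algorithm such as NSGA-II, and the modularity of the resulting networks can be quantified with a structural measure such as Newman's Q.

```python
# Illustrative sketch only: a wiring-cost objective paired with task
# performance (as in connection-cost multi-objective setups), plus
# Newman's modularity Q for analyzing the evolved networks.
# `evaluate_task` and the community partition are hypothetical inputs.

import numpy as np


def connection_cost(weights: np.ndarray) -> float:
    """Wiring cost of a network: here simply the number of non-zero
    connections in its weight matrix (summed weight magnitudes or
    geometric wire length are common alternatives)."""
    return float(np.count_nonzero(weights))


def newman_modularity(adj: np.ndarray, communities: list[set[int]]) -> float:
    """Newman's modularity Q for an undirected, unweighted graph.

    Q = sum_c [ e_c / m - (d_c / (2m))^2 ], where e_c is the number of
    edges inside community c, d_c the summed degree of its nodes, and
    m the total number of edges.
    """
    m = adj.sum() / 2.0                           # total edges (adj is symmetric 0/1)
    if m == 0:
        return 0.0
    degrees = adj.sum(axis=1)
    q = 0.0
    for nodes in communities:
        idx = list(nodes)
        e_c = adj[np.ix_(idx, idx)].sum() / 2.0   # intra-community edges
        d_c = degrees[idx].sum()                  # summed degree of the community
        q += e_c / m - (d_c / (2.0 * m)) ** 2
    return q


def objectives(weights: np.ndarray, evaluate_task) -> tuple[float, float]:
    """The two objectives a multi-objective EA (e.g. NSGA-II) would trade
    off: maximize task performance, minimize connection cost (negated here
    so both objectives can be maximized)."""
    return evaluate_task(weights), -connection_cost(weights)
```

In such a setup, the pair returned by `objectives` would be handed to the multi-objective selection step (e.g. NSGA-II's non-dominated sorting), and `newman_modularity` would be evaluated on the neuron partition that maximizes Q when analyzing the evolved networks.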

    Information

    Published In

    GECCO '14: Proceedings of the 2014 Annual Conference on Genetic and Evolutionary Computation
    July 2014
    1478 pages
    ISBN:9781450326629
    DOI:10.1145/2576768
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org.

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Author Tags

    1. HyperNEAT
    2. NSGA-II
    3. artificial neural networks
    4. modularity
    5. regularity

    Qualifiers

    • Research-article

    Conference

    GECCO '14: Genetic and Evolutionary Computation Conference
    July 12 - 16, 2014
    Vancouver, BC, Canada

    Acceptance Rates

    GECCO '14 paper acceptance rate: 180 of 544 submissions (33%)
    Overall acceptance rate: 1,669 of 4,410 submissions (38%)

    Article Metrics

    • Downloads (last 12 months): 13
    • Downloads (last 6 weeks): 1

    Reflects downloads up to 04 Jan 2025

    Cited By

    • (2024) Evolving interpretable neural modularity in free-form multilayer perceptrons through connection costs. Neural Computing and Applications 36(3), 1459-1476. DOI: 10.1007/s00521-023-09117-4. Online publication date: 1-Jan-2024.
    • (2024) Constructing Game Agents Through Simulated Evolution. Encyclopedia of Computer Graphics and Games, 457-466. DOI: 10.1007/978-3-031-23161-2_15. Online publication date: 5-Jan-2024.
    • (2023) Seeing Is Believing: Brain-Inspired Modular Training for Mechanistic Interpretability. Entropy 26(1), 41. DOI: 10.3390/e26010041. Online publication date: 30-Dec-2023.
    • (2023) Emerging Modularity During the Evolution of Neural Networks. Journal of Artificial Intelligence and Soft Computing Research 13(2), 107-126. DOI: 10.2478/jaiscr-2023-0010. Online publication date: 11-Mar-2023.
    • (2023) Modularity in Deep Learning: A Survey. Intelligent Computing, 561-595. DOI: 10.1007/978-3-031-37963-5_40. Online publication date: 20-Aug-2023.
    • (2022) Confusion matrix-based modularity induction into pretrained CNN. Multimedia Tools and Applications 81(16), 23311-23337. DOI: 10.1007/s11042-022-12331-2. Online publication date: 18-Mar-2022.
    • (2021) A Systematic Literature Review of the Successors of "NeuroEvolution of Augmenting Topologies". Evolutionary Computation 29(1), 1-73. DOI: 10.1162/evco_a_00282. Online publication date: Mar-2021.
    • (2021) Modularized genotype combination to design multiobjective soft-bodied robots. 2021 IEEE 4th International Conference on Soft Robotics (RoboSoft), 295-301. DOI: 10.1109/RoboSoft51838.2021.9479428. Online publication date: 12-Apr-2021.
    • (2021) Hill Climb Modular Assembler Encoding. Knowledge-Based Systems 232:C. DOI: 10.1016/j.knosys.2021.107493. Online publication date: 28-Nov-2021.
    • (2021) Utilizing the Untapped Potential of Indirect Encoding for Neural Networks with Meta Learning. Applications of Evolutionary Computation, 537-551. DOI: 10.1007/978-3-030-72699-7_34. Online publication date: 1-Apr-2021.
    • Show More Cited By
