ABSTRACT
A significant problem for evolving artificial neural networks is that the physical arrangement of sensors and effectors is invisible to the evolutionary algorithm. For example, in this paper, directional sensors and effectors are placed around the circumference of a robot in analogous arrangements, which ensures a useful geometric correspondence between sensors and effectors. However, if the sensors are mapped to a single input layer and the effectors to a single output layer (as is typical), evolution has no means to exploit this fortuitous arrangement. To address this problem, this paper presents a novel generative encoding called connective Compositional Pattern Producing Networks (connective CPPNs) that can effectively detect and capitalize on geometric relationships among sensors and effectors. The key insight is that sensors and effectors with consistent geometric relationships can be exploited by a repeating motif in the neural architecture. Thus, an encoding that discovers such motifs as a function of network geometry makes it possible to exploit them. In this paper, a method for evolving connective CPPNs called Hypercube-based NeuroEvolution of Augmenting Topologies (HyperNEAT) discovers sensible repeating motifs that take advantage of two different placement schemes, demonstrating the utility of such an approach.
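The core mechanism can be illustrated concretely: neurons are assigned coordinates, and a CPPN is queried with the coordinates of each source-target pair; its output is interpreted as that connection's weight, with weak connections pruned. The sketch below is only illustrative of this querying scheme, not the paper's implementation: the CPPN here is a hand-composed function of the endpoint coordinates (HyperNEAT evolves the CPPN's structure and weights), and the circular placement, threshold value, and choice of component functions are all assumptions for the example.

```python
import math

# Hand-built stand-in for an evolved CPPN: maps the coordinates of a
# source neuron (x1, y1) and target neuron (x2, y2) to a weight.
# Because it depends only on the distance between endpoints, pairs with
# the same geometric relationship receive the same weight -- the kind of
# repeating motif the encoding is meant to express.
def cppn(x1, y1, x2, y2):
    d = math.sqrt((x1 - x2) ** 2 + (y1 - y2) ** 2)
    return math.sin(2.0 * math.pi * d) * math.exp(-d)

# Place sensors and effectors at analogous positions on a unit circle,
# mirroring the robot arrangement described in the abstract.
def circle(n, radius=1.0):
    return [(radius * math.cos(2 * math.pi * i / n),
             radius * math.sin(2 * math.pi * i / n)) for i in range(n)]

sensors = circle(8)
effectors = circle(8)

# Query the CPPN for every sensor-effector pair; prune weak connections.
threshold = 0.2
weights = {}
for i, (x1, y1) in enumerate(sensors):
    for j, (x2, y2) in enumerate(effectors):
        w = cppn(x1, y1, x2, y2)
        if abs(w) > threshold:
            weights[(i, j)] = w
```

Because the weight pattern is a function of geometry rather than of individual connections, rotating a sensor-effector pair around the circle leaves its weight unchanged, so one discovered motif is automatically reused at every analogous position.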
Index Terms
- A novel generative encoding for exploiting neural network sensor and output geometry