ABSTRACT
Generative representations allow the reuse of code and thus facilitate the evolution of repeated phenotypic themes or modules. It has been shown that generative representations perform well on highly regular problems. To date, however, generative representations have not been tested on irregular problems, and it is unknown how quickly their performance degrades as the regularity of the problem decreases. In this report, we test a generative representation on a problem in which one type of regularity can be scaled. The generative representation outperforms a direct encoding control when the regularity of the problem is high, but its advantage shrinks as regularity decreases, until it first matches and then underperforms the direct encoding. Importantly, this degradation is not linear: the boost provided by the generative encoding is significant only at very high levels of regularity.
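The paper does not reproduce its benchmark here, but the idea of scaling a problem's regularity can be illustrated with a hypothetical target-matching task: a target vector is built from a motif repeated across the genome, and a single parameter (the number of unique elements) moves the problem from maximally regular (one motif repeated everywhere) to fully irregular (every element independent). The function and parameter names below are illustrative, not the paper's actual setup.

```python
import random

def make_target(length, num_unique):
    """Build a target bit-vector whose regularity is tunable:
    num_unique == 1      -> a single motif repeated everywhere (maximal regularity)
    num_unique == length -> every element drawn independently (no regularity)."""
    motif = [random.randint(0, 1) for _ in range(num_unique)]
    return [motif[i % num_unique] for i in range(length)]

def fitness(candidate, target):
    """Fraction of elements matching the target (higher is better).
    A direct encoding must discover each element separately; a generative
    encoding that reuses code can exploit the repeated motif when
    num_unique is small."""
    return sum(c == t for c, t in zip(candidate, target)) / len(target)
```

Under this sketch, the regularity knob is `num_unique`: a generative encoding's reuse of code pays off only while the motif actually repeats, which is consistent with the report's finding that its benefit is confined to highly regular settings.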
How a Generative Encoding Fares as Problem-Regularity Decreases
In Proceedings of the 10th International Conference on Parallel Problem Solving from Nature (PPSN X), Volume 5199.