ABSTRACT
Evolutionary algorithms with computationally expensive fitness evaluations typically run with smaller evaluation budgets and population sizes. However, smaller populations and fewer evaluations mean the problem space may not be explored effectively. To compensate, an evolutionary algorithm can be combined with a machine learner, increasing the likelihood of finding viable solutions despite the reduced budget. The Learnable Evolution Model (LEM) is one such evolutionary algorithm (EA) and machine learner (ML) hybrid: it infers rules from the best- and least-fit individuals and then exploits those rules when creating offspring. This paper shows that LEM introduces a unique form of emergent selection pressure, separate from any pressure induced by parent or survivor selection. It further shows that this pressure can be attenuated both by how the best- and least-fit training subsets are chosen and by how long learned rules are kept. Practitioners need to be aware of this novel form of selection pressure, and of these means of adjusting it, so that their LEM implementations are adequately tuned: too much selection pressure may cause premature convergence to inferior solutions, while too little may mean no satisfactory solutions are found at all.
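The LEM loop the abstract describes — inferring rules from best- and least-fit subsets, then exploiting those rules to create offspring — can be sketched as below. This is a minimal illustration under stated assumptions, not the paper's implementation: the per-gene interval "rule" is a stand-in for a symbolic rule learner such as AQ or C4.5, and the function name `lem_step` and the `high_frac`/`low_frac` parameters (the training-set-size knobs the paper studies) are hypothetical.

```python
import random

def lem_step(population, fitness, high_frac=0.3, low_frac=0.3):
    """One generation of a minimal LEM-style loop (illustrative only)."""
    ranked = sorted(population, key=fitness, reverse=True)
    n = len(ranked)
    # Training-set size (high_frac/low_frac) is one knob on the
    # emergent selection pressure discussed in the abstract.
    high = ranked[: max(1, int(high_frac * n))]        # best-fit examples
    low = ranked[n - max(1, int(low_frac * n)):]       # least-fit examples

    # "Learn" a rule: per-gene bounds covering the high-fitness group.
    # A real LEM learner would use `low` as negative examples to induce
    # rules that discriminate the two groups; this sketch does not.
    dims = len(high[0])
    lo = [min(ind[d] for ind in high) for d in range(dims)]
    hi = [max(ind[d] for ind in high) for d in range(dims)]

    # Exploit the rule: instantiate offspring inside the learned region.
    # Keeping (lo, hi) across several generations instead of relearning
    # each step corresponds to the rule-lifetime knob in the abstract.
    offspring = [
        tuple(random.uniform(lo[d], hi[d]) for d in range(dims))
        for _ in range(n)
    ]
    return offspring, (lo, hi)
```

Because every offspring falls inside the bounding region of the current best individuals, this rule-exploitation step concentrates the search independently of any parent or survivor selection — the emergent pressure the paper analyzes.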