Abstract
The evolutionary algorithms that use probabilistic graphical models to represent properties of selected solutions are known as Distribution Estimation Algorithms (DEAs). Work on such algorithms has generally focused on the complexity of the models used. Here, the performance of two DEAs is investigated. One takes problem variables to be independent while the other uses pairwise conditional probabilities to generate a chain in which each variable conditions another. Three problems are considered that differ in the extent to which they impose a chain-like structure on variables. The more complex algorithm performs better on a function that exactly matches the structure of its model. However, on other problems, the selection mechanism is seen to be crucial, some previously reported gains for the more complex algorithm are shown to be unfounded and, with comparable mechanisms, the simpler algorithm gives better results. Some preliminary explanations of the dynamics of the algorithms are also offered.
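The two model classes compared in the abstract can be illustrated with a short sketch. The code below is not the authors' implementation; it is a minimal, UMDA/PBIL-style univariate DEA (independent per-variable marginals) plus a sampler for a MIMIC-style chain model in which each variable is conditioned on its predecessor. All function names, parameter values, and the OneMax test problem are illustrative assumptions.

```python
import random

def univariate_dea(fitness, n_bits, pop_size=100, n_select=50,
                   generations=50, seed=0):
    """Sketch of a DEA with an independent-variable model (UMDA-style).

    Each generation: sample a population from the per-bit marginal
    probabilities, truncation-select the best half, and re-estimate
    each bit's marginal frequency from the selected solutions.
    """
    rng = random.Random(seed)
    probs = [0.5] * n_bits  # one independent marginal per variable
    best = None
    for _ in range(generations):
        pop = [[1 if rng.random() < p else 0 for p in probs]
               for _ in range(pop_size)]
        pop.sort(key=fitness, reverse=True)
        selected = pop[:n_select]
        if best is None or fitness(selected[0]) > fitness(best):
            best = selected[0]
        # model estimation step: refit the marginals to the selected set
        probs = [sum(x[i] for x in selected) / n_select
                 for i in range(n_bits)]
    return best

def sample_chain(p0, cond, rng):
    """Sample from a chain model: P(x) = P(x_0) * prod_i P(x_i | x_{i-1}).

    cond[i][prev] is P(x_{i+1} = 1 | x_i = prev); this is the kind of
    pairwise conditional structure the more complex algorithm estimates.
    """
    x = [1 if rng.random() < p0 else 0]
    for c in cond:
        x.append(1 if rng.random() < c[x[-1]] else 0)
    return x

# OneMax (count of ones): a problem with no chain structure, so the
# independent model suffices.
solution = univariate_dea(sum, n_bits=20)
```

In this sketch the selection mechanism is plain truncation selection applied afresh each generation; as the abstract notes, comparisons between the two model classes are sensitive to exactly this choice.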
The first author is supported by a UK EPSRC Studentship.
© 2002 Springer-Verlag Berlin Heidelberg
Cite this paper
Johnson, A., Shapiro, J. (2002). The Importance of Selection Mechanisms in Distribution Estimation Algorithms. In: Collet, P., Fonlupt, C., Hao, JK., Lutton, E., Schoenauer, M. (eds) Artificial Evolution. EA 2001. Lecture Notes in Computer Science, vol 2310. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-46033-0_8
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-43544-0
Online ISBN: 978-3-540-46033-6