Effective Structure Learning in Bayesian Network Based EDAs

Part of the book series: Studies in Computational Intelligence (SCI, volume 500)


Abstract

Estimation of Distribution Algorithms (EDAs) are a high-impact area in evolutionary computation and global optimization. One of the main strengths of EDAs is the explicit encoding of variable dependencies. The search engine is a joint probability distribution (the search distribution), usually computed by fitting the best solutions in the current population. Although biasing the search with the best known solutions is a common rule in evolutionary computation, it is worth noting that most evolutionary algorithms (EAs) derive the new population directly from the selected set, whereas EDAs do not. Hence, a different bias can be introduced for EDAs. In this article we introduce the so-called Empirical Selection Distribution for biasing the search of an EDA based on a Bayesian network. Bayesian-network-based EDAs have shown impressive results on deceptive problems by estimating the adequate structure (dependencies) and parameters (conditional probabilities) needed to reach the optimum. In this work we show that a Bayesian network based EDA (BN-EDA) can be enhanced by using the empirical selection distribution instead of the standard selection method. We introduce weighted estimators for the K2 metric that detect variable correlations better than the original BN-EDA, and in addition we introduce formulas to compute the conditional probabilities (local probability distributions). By providing evidence and performing statistical comparisons, we show that the enhanced version: 1) detects more true variable correlations, 2) has a greater probability of finding the optimum, and 3) requires fewer evaluations and/or a smaller population size than the original BN-EDA to reach the optimum. Our results suggest that the Empirical Selection Distribution provides the algorithm with more useful information than the usual selection step.
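The core idea of weighting the K2 metric can be illustrated with a minimal sketch. The standard K2 score of a node given its candidate parents is built from integer counts N_ijk of (parent configuration, child value) co-occurrences in the selected set; replacing those integer counts with sums of per-sample weights drawn from a selection distribution yields a weighted variant. The sketch below is an illustration of this general construction, not the authors' exact estimators: the function name, the binary-variable restriction, and the toy data are all assumptions made for the example.

```python
import math
from itertools import product

def weighted_k2_log_score(data, weights, child, parents, arity=2):
    """Log K2 score of `child` given `parents`, with each sample
    contributing its selection weight to the counts N_ijk.
    With unit weights this reduces to the standard K2 metric."""
    r = arity  # number of states of the child variable
    score = 0.0
    for pa_config in product(range(arity), repeat=len(parents)):
        # weighted counts N_ijk for this parent configuration j
        n_jk = [0.0] * r
        for row, w in zip(data, weights):
            if all(row[p] == v for p, v in zip(parents, pa_config)):
                n_jk[row[child]] += w
        n_j = sum(n_jk)
        # log of (r-1)! / (N_ij + r - 1)! * prod_k N_ijk!,
        # written with lgamma so non-integer weighted counts are allowed
        score += math.lgamma(r) - math.lgamma(n_j + r)
        score += sum(math.lgamma(n + 1.0) for n in n_jk)
    return score

# toy data (x0, x1, x2): x1 copies x0, x2 is unrelated noise
data = [(0, 0, 1), (1, 1, 0), (1, 1, 1), (0, 0, 0), (1, 1, 1), (0, 0, 1)]
w = [1.0] * len(data)  # uniform weights recover the plain K2 score
# the true parent x0 scores higher for x1 than the spurious parent x2
assert weighted_k2_log_score(data, w, child=1, parents=(0,)) > \
       weighted_k2_log_score(data, w, child=1, parents=(2,))
```

A structure learner would evaluate this score over candidate parent sets and keep the edges that maximize it; the empirical selection distribution enters only through the weight vector, which is what lets every sampled individual contribute information instead of only the truncated best subset.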




Corresponding author

Correspondence to S. Ivvan Valdez .


Copyright information

© 2014 Springer International Publishing Switzerland

About this paper

Cite this paper

Valdez, S.I., Hernández, A., Botello, S. (2014). Effective Structure Learning in Bayesian Network Based EDAs. In: Schuetze, O., et al. EVOLVE - A Bridge between Probability, Set Oriented Numerics, and Evolutionary Computation III. Studies in Computational Intelligence, vol 500. Springer, Heidelberg. https://doi.org/10.1007/978-3-319-01460-9_1


  • DOI: https://doi.org/10.1007/978-3-319-01460-9_1

  • Publisher Name: Springer, Heidelberg

  • Print ISBN: 978-3-319-01459-3

  • Online ISBN: 978-3-319-01460-9

  • eBook Packages: Engineering, Engineering (R0)
