
Linking Entropy to Estimation of Distribution Algorithms

Chapter in Towards a New Evolutionary Computation

Part of the book series: Studies in Fuzziness and Soft Computing (STUDFUZZ, volume 192)

Summary

This chapter presents results on the application of the concept of entropy to estimation of distribution algorithms (EDAs). First, the Boltzmann mutual information curves are introduced; they are shown to carry substantial information about the difficulty of a function. Next, a method for designing discrete benchmark functions is presented. The new approach allows the construction of both single functions and random classes of functions that obey a given collection of probabilistic constraints. This application, together with the next one, the construction of low-cost search distributions, is based on the principle of maximum entropy. The final proposal is linear entropic mutation (LEM), an approach that measures the amount of mutation applied to a variable as the increase in its entropy. We argue that LEM is a natural operator for EDAs because it mutates distributions rather than single individuals.
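To make the last idea concrete, the sketch below mutates a single binary variable by raising its entropy rather than by flipping bits in an individual. This is a minimal sketch under stated assumptions, not the chapter's operator: the function name lem_mutate and the reading of the mutation strength alpha as the fraction of the remaining gap to maximum entropy (1 bit for a binary variable) that is closed are illustrative choices.

```python
import math

def binary_entropy(p: float) -> float:
    """Shannon entropy of a Bernoulli(p) variable, in bits."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)

def lem_mutate(p: float, alpha: float) -> float:
    """Entropy-based mutation of a binary marginal (illustrative assumption).

    The mutated probability p' satisfies
        H(p') = H(p) + alpha * (1 - H(p)),
    so alpha = 0 leaves the distribution unchanged and alpha = 1 drives it
    to the uniform (maximum entropy) distribution.
    """
    target = binary_entropy(p) + alpha * (1.0 - binary_entropy(p))
    # Work with q = min(p, 1 - p) in [0, 0.5], where the entropy is
    # monotonically increasing, and map back to the original side of 0.5.
    q = min(p, 1.0 - p)
    lo, hi = q, 0.5
    for _ in range(60):  # bisection to invert the entropy function
        mid = 0.5 * (lo + hi)
        if binary_entropy(mid) < target:
            lo = mid
        else:
            hi = mid
    q_new = 0.5 * (lo + hi)
    return q_new if p <= 0.5 else 1.0 - q_new

if __name__ == "__main__":
    p = 0.9
    for alpha in (0.0, 0.25, 0.5, 1.0):
        q = lem_mutate(p, alpha)
        print(f"alpha={alpha:.2f}  p'={q:.4f}  H(p')={binary_entropy(q):.4f} bits")
```

Because the entropy of a binary variable is monotone between the current probability and 0.5, the mutated probability can be recovered by bisection. Mutation defined this way always moves the marginal toward the uniform distribution and never across it, which is one way to read the claim that LEM mutates distributions rather than individuals.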

Copyright information

© 2006 Springer-Verlag Berlin Heidelberg

About this chapter

Cite this chapter

Ochoa, A., Soto, M. (2006). Linking Entropy to Estimation of Distribution Algorithms. In: Lozano, J.A., Larrañaga, P., Inza, I., Bengoetxea, E. (eds) Towards a New Evolutionary Computation. Studies in Fuzziness and Soft Computing, vol 192. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-32494-1_1

  • DOI: https://doi.org/10.1007/3-540-32494-1_1

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-29006-3

  • Online ISBN: 978-3-540-32494-2

  • eBook Packages: Engineering, Engineering (R0)
