A conditional-logical approach to minimum cross-entropy

  • Logic and Learning
  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 1200)

Abstract

The principle of minimum cross-entropy (ME-principle) is often used in the AI areas of knowledge representation and uncertain reasoning as an elegant and powerful tool for building up complete probability distributions when only partial knowledge is available. Applied to a prior distribution P and some new information R, it yields as its result the one distribution P* that satisfies R and is closest to P in an information-theoretic sense. More generally, it provides a "best" solution to the problem "How to adjust P to R?"
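
The information-theoretic closeness invoked here is, in the standard formulation of the ME-principle, measured by cross-entropy (relative entropy). The following display is a sketch of that usual definition over finitely many worlds ω, stated for orientation rather than quoted from the paper:

    P^{*} \;=\; \operatorname*{arg\,min}_{Q \,\models\, R} \; \sum_{\omega} Q(\omega)\,\log\frac{Q(\omega)}{P(\omega)}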

In this paper, we show in a rather direct and constructive manner that adjusting P to R by means of this principle follows a simple and intelligible conditional-logical pattern. The scheme that underlies ME-adjustment is made explicit, and in a generalized form it provides a straightforward conditional-logical approach to the adaptation problem. We introduce the idea of a functional concept and show how the demands for logical consistency and representation invariance constrain the functions involved. Finally, the ME-distribution arises as the only solution that follows the simple adaptation scheme given and satisfies these three assumptions. This yields a characterization of the ME-principle within a conditional-logical framework and clearly reveals its logical mechanisms.
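
To make the adjustment step concrete, here is a minimal numeric sketch in Python. The helper name me_adjust, the use of scipy, and the encoding of the new information R as linear expectation constraints are illustrative assumptions for this sketch, not the paper's own construction, which treats conditionals in a more general functional setting.

    # Minimal sketch of ME-adjustment: given a prior P over finitely many
    # worlds and new information R encoded as linear expectation constraints
    # A @ q = b, find the distribution q = P* that satisfies R and minimizes
    # the cross-entropy (KL divergence) to the prior.
    import numpy as np
    from scipy.optimize import minimize

    def me_adjust(prior, A, b):
        n = len(prior)

        def kl(q):
            q = np.clip(q, 1e-12, 1.0)  # guard against log(0)
            return float(np.sum(q * np.log(q / prior)))

        # Normalization plus one equality constraint per row of A.
        constraints = [{"type": "eq", "fun": lambda q: q.sum() - 1.0}]
        for row, target in zip(A, b):
            constraints.append({"type": "eq",
                                "fun": lambda q, r=row, t=target: r @ q - t})

        res = minimize(kl, x0=prior, bounds=[(0.0, 1.0)] * n,
                       constraints=constraints, method="SLSQP")
        return res.x

    # Example: uniform prior over four worlds; R says the event consisting
    # of the first two worlds has probability 0.7.
    prior = np.full(4, 0.25)
    A = np.array([[1.0, 1.0, 0.0, 0.0]])
    b = np.array([0.7])
    print(me_adjust(prior, A, b))  # approximately [0.35, 0.35, 0.15, 0.15]

For a single event constraint the minimization reproduces proportional rescaling inside and outside the event, consistent with the simple conditional-logical adaptation pattern the abstract refers to; the paper analyses this behaviour in far greater generality.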

Editor information

Rüdiger Reischuk, Michel Morvan

Copyright information

© 1997 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Kern-Isberner, G. (1997). A conditional-logical approach to minimum cross-entropy. In: Reischuk, R., Morvan, M. (eds) STACS 97. STACS 1997. Lecture Notes in Computer Science, vol 1200. Springer, Berlin, Heidelberg. https://doi.org/10.1007/BFb0023463

  • DOI: https://doi.org/10.1007/BFb0023463

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-62616-9

  • Online ISBN: 978-3-540-68342-1

  • eBook Packages: Springer Book Archive
