
Fusing Probabilistic Information on Maximum Entropy

  • Conference paper
KI 2003: Advances in Artificial Intelligence (KI 2003)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 2821)


Abstract

We present a method, based on information-theoretic optimization techniques, for fusing pieces of probabilistic information stemming from different sources. We use the well-known principle of maximum entropy to process information most faithfully while precluding interactions between the different knowledge bases. The fusion operator so defined satisfies basic demands such as commutativity and the Pareto principle, and a detailed analysis shows that it merges the corresponding epistemic states. Furthermore, it induces a numerical fusion operator that computes the information-theoretic mean of probabilities.
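
Since the chapter itself is available here only as a preview, the following is a minimal sketch rather than the authors' construction. It illustrates the maximum-entropy principle the abstract invokes (among all distributions satisfying the given probabilistic constraints, select the one of maximal Shannon entropy) together with one plausible reading of the induced numerical fusion operator, namely the normalized geometric mean of point probabilities. The function names (`max_entropy`, `fuse_probabilities`) and the example numbers are illustrative assumptions, not taken from the paper.

```python
# Sketch only: maximum-entropy selection under linear probability constraints,
# plus a hypothetical numerical fusion operator for point probabilities.
import numpy as np
from scipy.optimize import minimize

def max_entropy(n_worlds, constraints):
    """Return the maximum-entropy distribution over n_worlds worlds.

    constraints: list of (indicator_vector, target_probability) pairs,
    each demanding sum(P[w] for w with indicator[w] == 1) == target.
    """
    def neg_entropy(p):
        p = np.clip(p, 1e-12, 1.0)          # avoid log(0)
        return float(np.sum(p * np.log(p)))  # minimizing this maximizes entropy

    cons = [{"type": "eq", "fun": lambda p: p.sum() - 1.0}]  # normalization
    for ind, target in constraints:
        ind = np.asarray(ind, dtype=float)
        cons.append({"type": "eq",
                     "fun": lambda p, ind=ind, t=target: float(ind @ p - t)})

    p0 = np.full(n_worlds, 1.0 / n_worlds)   # start from the uniform distribution
    res = minimize(neg_entropy, p0, method="SLSQP",
                   bounds=[(0.0, 1.0)] * n_worlds, constraints=cons)
    return res.x

def fuse_probabilities(probs):
    """Normalized geometric mean of several sources' probabilities for one
    event -- a hypothetical instance of an 'information-theoretic mean'."""
    probs = np.asarray(probs, dtype=float)
    g_true = np.prod(probs) ** (1.0 / len(probs))
    g_false = np.prod(1.0 - probs) ** (1.0 / len(probs))
    return g_true / (g_true + g_false)

if __name__ == "__main__":
    # Two binary variables A, B -> four worlds ordered (AB, A¬B, ¬AB, ¬A¬B).
    # One source asserts P(A) = 0.7, another asserts P(B) = 0.4.
    p = max_entropy(4, [([1, 1, 0, 0], 0.7), ([1, 0, 1, 0], 0.4)])
    print("ME distribution:", np.round(p, 3))
    print("Fused P(A):", round(fuse_probabilities([0.7, 0.8]), 3))
```

Running the sketch, the constraints P(A) = 0.7 and P(B) = 0.4 yield the product distribution (0.28, 0.42, 0.12, 0.18): maximum entropy introduces no dependence beyond what the constraints demand, which is the sense in which it processes the given information "most faithfully".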




Copyright information

© 2003 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Kern-Isberner, G., Rödder, W. (2003). Fusing Probabilistic Information on Maximum Entropy. In: Günter, A., Kruse, R., Neumann, B. (eds) KI 2003: Advances in Artificial Intelligence. KI 2003. Lecture Notes in Computer Science (LNAI), vol 2821. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-39451-8_30


  • DOI: https://doi.org/10.1007/978-3-540-39451-8_30

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-20059-8

  • Online ISBN: 978-3-540-39451-8

