
Towards a More Efficient Computation of Weighted Conditional Impacts for Relational Probabilistic Knowledge Bases Under Maximum Entropy Semantics

  • Conference paper
KI 2015: Advances in Artificial Intelligence (KI 2015)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 9324)

Abstract

While the complexity of the optimization problem to be solved when computing the Maximum Entropy distribution \(P^{*}_{\mathcal {R}}\) of a knowledge base \(\mathcal {R}\) grows dramatically when moving to the relational case, it has been shown that, once the weighted conditional impacts (WCI) of \(\mathcal {R}\) are available, \(P^{*}_{\mathcal {R}}\) can be computed much faster. Computing the WCI in a straightforward manner quickly becomes infeasible due to the size of the set \(\varOmega \) of possible worlds. In this paper, we propose a new approach for computing the WCI without considering the worlds in \(\varOmega \) at all. We introduce the notion of sat-pairs and show how to determine the set \(\mathcal {CSP}\) of all possible combinations of sat-pairs by combinatorial means. Using \(\mathcal {CSP}\) instead of \(\varOmega \) for computing the WCI yields a significant performance gain, since \(\mathcal {CSP}\) is typically much smaller than \(\varOmega \). For a start, we focus on simple knowledge bases consisting of a single conditional. First evaluation results of an implemented algorithm illustrate the benefits of our approach.
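To make the idea behind sat-pairs concrete, the following Python sketch contrasts a naive world-based computation of WCI with a purely combinatorial one in a deliberately simplified setting. This is not the paper's construction: it assumes a single conditional (B(x) | A(x)) whose ground instances are mutually independent because each domain element c gets its own atoms A(c) and B(c), and the function names wci_brute_force and wci_combinatorial are illustrative only. A sat-pair (v, f) records how many ground instances a world verifies and how many it falsifies, and the WCI assigns to each sat-pair the number of worlds realizing it.

```python
# Minimal sketch (not the authors' implementation): weighted conditional
# impacts for a single conditional (B(x) | A(x)) grounded over a domain of
# size n, under the simplifying assumption that every ground instance uses
# its own pair of atoms A(c), B(c), so instances are independent.
#
# A sat-pair (v, f) records how many ground instances a world verifies
# (A and B true) and how many it falsifies (A true, B false).  The weighted
# conditional impact of (v, f) is the number of worlds sharing that pair.

from itertools import product
from math import comb
from collections import Counter

def wci_brute_force(n):
    """Enumerate all worlds: 4 truth assignments per ground instance."""
    impacts = Counter()
    for world in product(range(4), repeat=n):   # 0: A&B, 1: A&~B, 2/3: ~A
        v = sum(1 for g in world if g == 0)     # verified groundings
        f = sum(1 for g in world if g == 1)     # falsified groundings
        impacts[(v, f)] += 1
    return dict(impacts)

def wci_combinatorial(n):
    """Enumerate only the sat-pairs and count worlds per pair directly."""
    impacts = {}
    for v in range(n + 1):
        for f in range(n - v + 1):
            rest = n - v - f                    # non-applicable groundings
            # choose which groundings are verified / falsified; each
            # non-applicable grounding still has 2 assignments for B
            impacts[(v, f)] = comb(n, v) * comb(n - v, f) * 2 ** rest
    return impacts

if __name__ == "__main__":
    n = 6
    assert wci_brute_force(n) == wci_combinatorial(n)
    # (n+1)(n+2)/2 sat-pairs vs. 4**n worlds: 28 vs. 4096 for n = 6
    print(sorted(wci_combinatorial(n).items())[:5])
```

Already for a domain of size 6 the combinatorial variant touches 28 sat-pairs instead of 4096 worlds, and the gap widens rapidly with the domain size; this is the effect the abstract describes when it contrasts \(\mathcal {CSP}\) with \(\varOmega \).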




Author information

Correspondence to Marc Finthammer.


Copyright information

© 2015 Springer International Publishing Switzerland

About this paper

Cite this paper

Finthammer, M., Beierle, C. (2015). Towards a More Efficient Computation of Weighted Conditional Impacts for Relational Probabilistic Knowledge Bases Under Maximum Entropy Semantics. In: Hölldobler, S., Peñaloza, R., Rudolph, S. (eds) KI 2015: Advances in Artificial Intelligence. KI 2015. Lecture Notes in Computer Science (LNAI), vol 9324. Springer, Cham. https://doi.org/10.1007/978-3-319-24489-1_6

Download citation

  • DOI: https://doi.org/10.1007/978-3-319-24489-1_6

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-24488-4

  • Online ISBN: 978-3-319-24489-1

  • eBook Packages: Computer Science (R0)
