
Parameter Learning Using Approximate Model Counting

  • Conference paper
  • First Online:
Neural-Symbolic Learning and Reasoning (NeSy 2024)

Abstract

An emerging class of neurosymbolic methods relies on neural networks to determine the parameters of symbolic probabilistic models. To train these hybrid models, such methods use a knowledge compiler to turn the symbolic model into a differentiable arithmetic circuit, on which gradient descent can then be performed. However, these methods require compiling a reasonably sized circuit, which is not always possible, since for many symbolic probabilistic models computing the gradient with respect to the parameters is \(\#P\)-hard. We introduce a new approach for learning parameters using partially compiled circuits that contain approximation nodes. We show that if the errors made in the approximation nodes are bounded, then the error on the gradient of the partially compiled circuits is also bounded. We evaluate the impact of various approximation guarantees on this approach’s learning and generalization performance. Approximation allows more complex queries to be compiled, and our experiments show that adding such partially compiled circuits helps reduce the training loss. However, we observe a limit beyond which adding more partial circuits yields no further improvement.
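To make the idea concrete, the sketch below shows, under assumptions of ours rather than anything taken from the paper, how a circuit containing an approximation node can still be trained by gradient descent. In the toy circuit f(θ) = p1·p2 + (1 − p1)·a, the constant a stands in for an uncompiled subcircuit whose true value is only known to lie in [0.4, 0.5]; since a enters f linearly through the coefficient (1 − p1), the gradient error equals the gradient of that coefficient times |a − h| (at most half the interval width when a is the midpoint), which illustrates the flavor of the bounded-gradient claim. The query, the bounds, the target probability, and the use of PyTorch with a squared-error loss are all illustrative choices, not the authors' method.

    # Minimal sketch (assumed setup, not the paper's implementation):
    # train the parameters of a toy arithmetic circuit in which one
    # subcircuit is replaced by an approximation node with known bounds.
    import torch

    theta = torch.zeros(2, requires_grad=True)  # trainable logits

    def approx_node(lower: float, upper: float) -> torch.Tensor:
        # Stand-in for a partially compiled subcircuit: any constant
        # within the guaranteed bounds (midpoint chosen here). A bounded
        # error here induces a bounded error on the gradient.
        return torch.tensor((lower + upper) / 2.0)

    optimizer = torch.optim.Adam([theta], lr=0.1)
    target = torch.tensor(0.7)  # assumed observed query probability

    for step in range(200):
        p = torch.sigmoid(theta)           # parameters as probabilities
        exact = p[0] * p[1]                # fully compiled part
        approx = (1 - p[0]) * approx_node(0.4, 0.5)  # approximated part
        circuit_value = exact + approx     # value of the whole query
        loss = (circuit_value - target) ** 2
        optimizer.zero_grad()
        loss.backward()                    # autodiff through the
        optimizer.step()                   # compiled structure

    print(f"learned query probability: {circuit_value.item():.3f}")

Because the approximation node is a constant, autodiff differentiates only the compiled structure around it; tightening the bounds shrinks both the value error and the gradient error.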


Notes

  1. We dropped the \(\mathcal{W}\) and q notation for conciseness.

  2. Proof available in Appendix A of https://hdl.handle.net/2078.1/288827.

  3. The modifications are available on the Schlandals GitHub repository: https://github.com/aia-uclouvain/schlandals.

  4. Method detailed in Appendix B of https://hdl.handle.net/2078.1/288827.

  5. The observed behaviors also apply to the other data sets, but we omit them for conciseness.


Acknowledgments

This work was supported by Service Public de Wallonie Recherche under grant n°2010235, ARIAC by DIGITALWALLONIA4.AI.

Author information


Corresponding author

Correspondence to Lucile Dierckx.



Copyright information

© 2024 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Dierckx, L., Dubray, A., Nijssen, S. (2024). Parameter Learning Using Approximate Model Counting. In: Besold, T.R., d’Avila Garcez, A., Jimenez-Ruiz, E., Confalonieri, R., Madhyastha, P., Wagner, B. (eds) Neural-Symbolic Learning and Reasoning. NeSy 2024. Lecture Notes in Computer Science, vol. 14980. Springer, Cham. https://doi.org/10.1007/978-3-031-71170-1_9


  • DOI: https://doi.org/10.1007/978-3-031-71170-1_9

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-71169-5

  • Online ISBN: 978-3-031-71170-1

  • eBook Packages: Computer Science, Computer Science (R0)
