Abstract
An emerging class of neurosymbolic methods uses neural networks to determine the parameters of symbolic probabilistic models. To train these hybrid models, a knowledge compiler turns the symbolic model into a differentiable arithmetic circuit, on which gradient descent can then be performed. However, these methods require compiling a circuit of reasonable size, which is not always possible: for many symbolic probabilistic models, computing the gradient with respect to the parameters is \(\#P\)-hard. We introduce a new approach for learning parameters using partially compiled circuits that contain approximation nodes. We show that, if the errors made at the approximation nodes are bounded, the error on the gradient of a partially compiled circuit can also be bounded. We evaluate the impact of various approximation guarantees on the learning and generalization performance of this approach. Approximation allows more complex queries to be compiled, and our experiments show that adding partially compiled circuits helps reduce the training loss. However, we observe a limit beyond which adding further partial circuits yields no additional improvement.
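The core idea of bounded approximation nodes can be illustrated with a small sketch. This is not the paper's actual algorithm or implementation; all names and the circuit structure are illustrative. It propagates interval bounds bottom-up through a sum/product arithmetic circuit in which one sub-circuit has been replaced by an approximation node carrying precomputed lower and upper bounds, so the circuit's exact value is guaranteed to lie inside the returned interval.

```python
# Hedged sketch: interval propagation through an arithmetic circuit with an
# approximation node. Nodes are tuples; all non-negative, as in weighted
# model counting, so products of bounds are themselves valid bounds.

def eval_bounds(node, params):
    """Return (lo, hi) bounds on the circuit value at `node`."""
    kind = node[0]
    if kind == "param":                 # learnable parameter leaf
        v = params[node[1]]
        return (v, v)
    if kind == "const":                 # fixed probability leaf
        return (node[1], node[1])
    if kind == "approx":                # partially compiled sub-circuit:
        return node[1]                  # precomputed (lo, hi) bounds
    bounds = [eval_bounds(child, params) for child in node[1:]]
    if kind == "sum":
        return (sum(l for l, _ in bounds), sum(h for _, h in bounds))
    if kind == "prod":
        lo = hi = 1.0
        for l, h in bounds:
            lo *= l
            hi *= h
        return (lo, hi)
    raise ValueError(f"unknown node kind: {kind}")

# Illustrative circuit: p * (0.3 + [approximated sub-circuit in [0.2, 0.25]])
circuit = ("prod",
           ("param", "p"),
           ("sum", ("const", 0.3), ("approx", (0.2, 0.25))))
lo, hi = eval_bounds(circuit, {"p": 0.5})   # -> (0.25, 0.275)
```

Because sum and product nodes are monotone in their non-negative inputs, bounded errors at the approximation nodes translate into bounded errors on the circuit value, and by the same propagation argument on the partial derivatives, into bounded errors on the gradient, which is the property the paper exploits during learning.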
Notes
1. We dropped the \(\mathcal{W}\) and \(q\) notation for conciseness.
2. Proof available in Appendix A of https://hdl.handle.net/2078.1/288827.
3. The modifications are available on the Schlandals GitHub: https://github.com/aia-uclouvain/schlandals.
4. Method detailed in Appendix B of https://hdl.handle.net/2078.1/288827.
5. The observed behaviors also apply to the other data sets, but we do not include them for conciseness reasons.
Acknowledgments
This work was supported by Service Public de Wallonie Recherche under grant n°2010235 - ARIAC by DIGITALWALLONIA4.AI.
Copyright information
© 2024 The Author(s), under exclusive license to Springer Nature Switzerland AG
About this paper
Cite this paper
Dierckx, L., Dubray, A., Nijssen, S. (2024). Parameter Learning Using Approximate Model Counting. In: Besold, T.R., d’Avila Garcez, A., Jimenez-Ruiz, E., Confalonieri, R., Madhyastha, P., Wagner, B. (eds) Neural-Symbolic Learning and Reasoning. NeSy 2024. Lecture Notes in Computer Science(), vol 14980. Springer, Cham. https://doi.org/10.1007/978-3-031-71170-1_9
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-71169-5
Online ISBN: 978-3-031-71170-1