Abstract
Sum-Product Networks (SPNs) are recent deep probabilistic models providing exact and tractable inference. SPNs have been successfully employed as density estimators in several application domains. However, learning an SPN from high-dimensional data still poses a challenge in terms of time complexity, due to the high cost of determining independencies among random variables (RVs) and sub-populations among samples, two operations that are repeated many times. Even LearnSPN, one of the simplest greedy structure learners, scales quadratically in the number of variables to determine RV independencies. In this work we investigate approximate but fast procedures to determine independencies among RVs whose complexity scales in sub-quadratic time. We propose two procedures: a random subspace approach and one that adopts entropy as a criterion to split RVs in linear time. Experimental results show that LearnSPN equipped with our splitting procedures reduces learning and/or inference times while preserving comparable inference accuracy.
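To make the entropy-based splitting idea concrete, the following is a minimal Python sketch, not the authors' implementation (their code is linked in note 5 below). It assumes discrete, integer-coded data, computes a Laplace-smoothed entropy for each RV, and splits the variables around a threshold derived from a hyperparameter \(\eta\); the grouping rule, the function names, and the use of \(\eta\) as a fraction of the maximum observed entropy are illustrative assumptions.

```python
import numpy as np

def smoothed_entropy(column, n_values, alpha=0.1):
    """Laplace-smoothed empirical entropy (in nats) of one discrete column.
    alpha plays the role of the smoothing hyperparameter mentioned in note 4."""
    counts = np.bincount(column, minlength=n_values).astype(float)
    probs = (counts + alpha) / (counts.sum() + alpha * n_values)
    return float(-np.sum(probs * np.log(probs)))

def entropy_split(data, n_values, eta=0.5, alpha=0.1):
    """Partition the column indices of `data` into two groups by comparing
    each RV's entropy with a threshold (here: eta times the largest observed
    entropy). A single pass over the data costs O(m * n) for m samples and
    n RVs, instead of a quadratic number of pairwise independence tests."""
    entropies = np.array([smoothed_entropy(data[:, j], n_values[j], alpha)
                          for j in range(data.shape[1])])
    threshold = eta * entropies.max()
    low = np.flatnonzero(entropies <= threshold)
    high = np.flatnonzero(entropies > threshold)
    return low, high
```

For binary data, for instance, `entropy_split(data, n_values=[2] * data.shape[1])` separates the indices of near-deterministic RVs from the remaining ones; the cost stays linear in the number of RVs.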
Notes
- 1.
Note that it is always possible to transform an SPN with adjacent nodes of the same type into an equivalent one with alternating types [24].
- 2.
We set k to be at least 2 when \(n = |\varvec{X}| < 4\). A sketch of how such a random subset of RVs could be drawn appears after these notes.
- 3.
For a discrete RV X, having values in \(\mathcal{X}\), we consider its discrete entropy as \(H(X)=-\sum_{x\in \mathcal{X}}p(x)\log(p(x))\).
- 4.
For convenience, and to avoid adding a new hyperparameter, the Laplacian smoothing parameter is set to the same value as the hyperparameter \(\alpha\) of LearnSPN, which is used to smooth the univariate distributions at the leaves. Note that \(\eta\) now replaces the hyperparameter \(\rho\), which is no longer needed.
- 5.
Code is available at https://github.com/fabriziov/alt-vs-spyn.
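As anticipated in note 2, a random subspace of RVs can be drawn before running the pairwise independence tests. The snippet below is only a sketch of that idea, not the authors' procedure: the function name, the seeding, and the pairing scheme are assumptions made for illustration.

```python
import numpy as np

def random_subspace_pairs(n_vars, k, seed=0):
    """Draw a random subset of k variable indices (k >= 2, cf. note 2) and
    return the candidate pairs on which independence tests would be run,
    so that only O(k^2) instead of O(n^2) tests are needed."""
    k = min(max(k, 2), n_vars)          # note 2: never fewer than 2 RVs
    rng = np.random.default_rng(seed)
    subset = rng.choice(n_vars, size=k, replace=False)
    return [(int(a), int(b))
            for i, a in enumerate(subset)
            for b in subset[i + 1:]]
```

For instance, `random_subspace_pairs(n_vars=10, k=4)` yields 6 candidate pairs instead of the 45 produced by exhaustive pairing.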
References
Adel, T., Balduzzi, D., Ghodsi, A.: Learning the structure of sum-product networks via an SVD-based algorithm. In: UAI (2015)
Breiman, L.: Random forests. Mach. Learn. 45(1), 5–32 (2001)
Cheng, W., Kok, S., Pham, H.V., Chieu, H.L., Chai, K.M.A.: Language modeling with sum-product networks. In: INTERSPEECH 2014, pp. 2098–2102 (2014)
Darwiche, A.: A differential approach to inference in Bayesian networks. JACM 50(3), 280–305 (2003)
Dennis, A., Ventura, D.: Learning the architecture of sum-product networks using clustering on variables. In: NIPS 25, pp. 2033–2041 (2012)
Di Mauro, N., Vergari, A., Basile, T.M.A.: Learning Bayesian random cutset forests. In: Esposito, F., Pivert, O., Hacid, M.-S., Raś, Z.W., Ferilli, S. (eds.) ISMIS 2015. LNCS (LNAI), vol. 9384, pp. 122–132. Springer, Cham (2015). https://doi.org/10.1007/978-3-319-25252-0_13
Di Mauro, N., Vergari, A., Esposito, F.: Learning accurate cutset networks by exploiting decomposability. In: Gavanelli, M., Lamma, E., Riguzzi, F. (eds.) AI*IA 2015. LNCS, vol. 9336, pp. 221–232. Springer, Cham (2015). https://doi.org/10.1007/978-3-319-24309-2_17
Friesen, A., Domingos, P.: The sum-product theorem: a foundation for learning tractable models. In: ICML, pp. 1909–1918 (2016)
Gens, R., Domingos, P.: Learning the structure of sum-product networks. In: ICML, pp. 873–880 (2013)
Haaren, J.V., Davis, J.: Markov network structure learning: a randomized feature generation approach. In: AAAI. AAAI Press (2012)
Koller, D., Friedman, N.: Probabilistic Graphical Models: Principles and Techniques. MIT Press, Cambridge (2009)
Lowd, D., Davis, J.: Learning Markov network structure with decision trees. In: ICDM, pp. 334–343. IEEE Computer Society Press (2010)
MacKay, D.J.C.: Information Theory, Inference & Learning Algorithms. Cambridge University Press, New York (2002)
Martens, J., Medabalimi, V.: On the expressive efficiency of sum product networks. CoRR abs/1411.7717 (2014)
Molina, A., Natarajan, S., Kersting, K.: Poisson sum-product networks: a deep architecture for tractable multivariate Poisson distributions. In: AAAI (2017)
Peharz, R.: Foundations of sum-product networks for probabilistic modeling. Ph.D. thesis, Graz University of Technology, SPSC (2015)
Peharz, R., Kapeller, G., Mowlaee, P., Pernkopf, F.: Modeling speech with sum-product networks: application to bandwidth extension. In: ICASSP (2014)
Poon, H., Domingos, P.: Sum-product networks: a new deep architecture. In: UAI 2011 (2011)
Queyranne, M.: Minimizing symmetric submodular functions. Math. Program. 82(1–2), 3–12 (1998)
Rahman, T., Kothalkar, P., Gogate, V.: Cutset networks: a simple, tractable, and scalable approach for improving the accuracy of Chow-Liu trees. In: Calders, T., Esposito, F., Hüllermeier, E., Meo, R. (eds.) ECML PKDD 2014. LNCS, vol. 8725, pp. 630–645. Springer, Heidelberg (2014). https://doi.org/10.1007/978-3-662-44851-9_40
Rooshenas, A., Lowd, D.: Learning sum-product networks with direct and indirect variable interactions. In: ICML (2014)
Roth, D.: On the hardness of approximate reasoning. Artif. Intell. 82(1–2), 273–302 (1996)
Vergari, A., Di Mauro, N., Esposito, F.: Visualizing and understanding sum-product networks. CoRR abs/1608.08266 (2016)
Vergari, A., Di Mauro, N., Esposito, F.: Simplifying, regularizing and strengthening sum-product network structure learning. In: Appice, A., Rodrigues, P.P., Santos Costa, V., Gama, J., Jorge, A., Soares, C. (eds.) ECML PKDD 2015. LNCS (LNAI), vol. 9285, pp. 343–358. Springer, Cham (2015). https://doi.org/10.1007/978-3-319-23525-7_21
Yuan, Z., Wang, H., Wang, L., Lu, T., Palaiahnakote, S., Tan, C.L.: Modeling spatial layout for scene image understanding via a novel multiscale sum-product network. Expert Syst. Appl. 63, 231–240 (2016)
Copyright information
© 2017 Springer International Publishing AG
About this paper
Cite this paper
Di Mauro, N., Esposito, F., Ventola, F.G., Vergari, A. (2017). Alternative Variable Splitting Methods to Learn Sum-Product Networks. In: Esposito, F., Basili, R., Ferilli, S., Lisi, F. (eds) AI*IA 2017 Advances in Artificial Intelligence. AI*IA 2017. Lecture Notes in Computer Science, vol 10640. Springer, Cham. https://doi.org/10.1007/978-3-319-70169-1_25
DOI: https://doi.org/10.1007/978-3-319-70169-1_25
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-70168-4
Online ISBN: 978-3-319-70169-1