Abstract
The previous chapter introduced inference in discrete-variable Bayesian networks, using evidence propagation on the junction tree to find marginal distributions of interest. This chapter presents a tutorial introduction to some further types of calculation that can also be performed with the junction tree, specifically:
- Sampling.
- Most likely configurations (a small illustrative sketch follows this list).
- Fast retraction.
- Gaussian and conditional Gaussian models.
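To illustrate the second item, here is a minimal sketch, not taken from the chapter, of the max-propagation idea on a toy junction tree with two cliques {A, B} and {B, C} joined by the separator {B}. The potentials, variable names, and numbers are illustrative assumptions; the point is only that max-marginalizing during the collect phase and then backtracking recovers the most probable joint configuration.

```python
# A minimal sketch (illustrative, not the chapter's code) of max-propagation
# on a tiny two-clique junction tree: cliques {A, B} and {B, C}, separator {B}.
import numpy as np

# Clique potentials: phi1[a, b] and phi2[b, c]; the joint is
# p(a, b, c) proportional to phi1[a, b] * phi2[b, c].
phi1 = np.array([[0.9, 0.1],
                 [0.4, 0.6]])
phi2 = np.array([[0.7, 0.3],
                 [0.2, 0.8]])

# Collect phase: max-marginalize clique {B, C} onto the separator B,
# then absorb the separator message into clique {A, B}.
msg_B = phi2.max(axis=1)           # message over B from clique {B, C}
phi1_star = phi1 * msg_B[None, :]  # updated potential on {A, B}

# Decode the most probable configuration by backtracking.
a_hat, b_hat = np.unravel_index(phi1_star.argmax(), phi1_star.shape)
c_hat = phi2[b_hat, :].argmax()
print("Most probable (A, B, C):", (a_hat, b_hat, c_hat))

# Sanity check against brute force over the full joint table.
joint = phi1[:, :, None] * phi2[None, :, :]
print("Brute force:", np.unravel_index(joint.argmax(), joint.shape))
```

Replacing sums by maxima in this way is the standard max-product variant of junction-tree propagation; the chapter itself treats the general case, including the M most probable configurations.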
Copyright information
© 1998 Springer Science+Business Media Dordrecht
About this chapter
Cite this chapter
Cowell, R. (1998). Advanced Inference in Bayesian Networks. In: Jordan, M.I. (eds) Learning in Graphical Models. NATO ASI Series, vol 89. Springer, Dordrecht. https://doi.org/10.1007/978-94-011-5014-9_2
Publisher Name: Springer, Dordrecht
Print ISBN: 978-94-010-6104-9
Online ISBN: 978-94-011-5014-9