A Study of Conditional Independence Change in Learning Probabilistic Network

  • Conference paper
Rough Sets and Current Trends in Computing (RSCTC 2000)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 2005)

Abstract

This paper discusses changes to the conditional independence set in learning probabilistic networks based on the Markov property. These changes are generalized into several cases that cover all possible changes, and we show that this set of cases is sound and complete. Any structure-learning method for decomposable Markov networks and Bayesian networks falls into one of these cases. The study indicates which kinds of domain models can be learned and which cannot, and suggests that prior knowledge about the problem domain determines the basic frame for subsequent learning.
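The abstract centers on conditional independence statements of the form X ⫫ Y | Z, which structure-learning algorithms add to or remove from a model's independence set. As an illustrative sketch only (not taken from the paper), the following checks whether such a statement holds in a small binary joint distribution by testing the factorization P(x, y | z) = P(x | z) · P(y | z):

```python
from itertools import product

def conditionally_independent(joint, tol=1e-9):
    """Return True iff X and Y are conditionally independent given Z.

    `joint` maps (x, y, z) triples over {0, 1} to probabilities summing to 1.
    The test checks P(x, y | z) = P(x | z) * P(y | z) wherever P(z) > 0.
    """
    for z in (0, 1):
        pz = sum(joint[(x, y, z)] for x, y in product((0, 1), repeat=2))
        if pz <= tol:
            continue  # conditioning event has zero probability; nothing to check
        for x, y in product((0, 1), repeat=2):
            px_z = sum(joint[(x, yy, z)] for yy in (0, 1)) / pz
            py_z = sum(joint[(xx, y, z)] for xx in (0, 1)) / pz
            pxy_z = joint[(x, y, z)] / pz
            if abs(pxy_z - px_z * py_z) > tol:
                return False
    return True

# A joint where X and Y are independent coins whose biases depend on Z,
# so X ⫫ Y | Z holds by construction.
indep = {}
for z, pz in ((0, 0.5), (1, 0.5)):
    px1 = 0.3 if z == 0 else 0.8   # P(X=1 | z)
    py1 = 0.6 if z == 0 else 0.2   # P(Y=1 | z)
    for x, y in product((0, 1), repeat=2):
        px = px1 if x == 1 else 1 - px1
        py = py1 if y == 1 else 1 - py1
        indep[(x, y, z)] = pz * px * py

# A joint where X = Y deterministically given either z, so X ⫫ Y | Z fails.
dep = {(x, y, z): (0.25 if x == y else 0.0)
       for x, y, z in product((0, 1), repeat=3)}
```

This is the kind of membership test that underlies the independence-set changes the paper classifies; real structure-learning algorithms estimate such statements from data rather than from an exact joint.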


Copyright information

© 2001 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Lin, T., Huang, Y. (2001). A Study of Conditional Independence Change in Learning Probabilistic Network. In: Ziarko, W., Yao, Y. (eds) Rough Sets and Current Trends in Computing. RSCTC 2000. Lecture Notes in Computer Science, vol 2005. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-45554-X_56

  • DOI: https://doi.org/10.1007/3-540-45554-X_56

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-43074-2

  • Online ISBN: 978-3-540-45554-7

  • eBook Packages: Springer Book Archive
