
Learning Bayesian-Neural Network from Mixed-Mode Data

  • Conference paper
Neural Information Processing (ICONIP 2006)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 4232)


Abstract

The use of probability theory for reasoning with uncertain knowledge has been broadly investigated. This paper proposes a novel probabilistic network named the Bayesian-Neural Network (BNN). BNN reduces computational complexity by dividing the input attribute set into two parts, modelled by a Bayesian network and a neural network respectively. The outputs produced by the two classifiers are then combined in the output space by estimating class-conditional structural mixtures. Empirical studies on a set of natural domains show that BNN has clear advantages in generalization ability.
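The abstract's idea can be illustrated with a minimal sketch. This is not the authors' implementation: it assumes scikit-learn stand-ins (a categorical naive Bayes for the discrete attributes, a small MLP for the continuous ones), a synthetic mixed-mode dataset, and a fixed equal-weight mixture of the two class posteriors in place of the paper's estimated class-conditional structural mixtures.

```python
# Hypothetical sketch of a mixed-mode hybrid classifier (not the paper's BNN):
# split the attributes into a discrete part, modelled by a Bayesian-style
# classifier, and a continuous part, modelled by a neural network, then mix
# the two class-posterior estimates in the output space.
import numpy as np
from sklearn.naive_bayes import CategoricalNB
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
n = 200
X_disc = rng.integers(0, 3, size=(n, 2))   # two categorical attributes
X_cont = rng.normal(size=(n, 2))           # two continuous attributes
y = ((X_disc[:, 0] == 1) | (X_cont[:, 0] > 0.5)).astype(int)

# Discrete part -> probabilistic (Bayesian-style) model.
nb = CategoricalNB().fit(X_disc, y)
# Continuous part -> neural model.
mlp = MLPClassifier(hidden_layer_sizes=(8,), max_iter=1000,
                    random_state=0).fit(X_cont, y)

# Combine in the output space. Here w is a fixed equal weight; the paper
# instead estimates class-conditional mixture weights from the data.
w = 0.5
posterior = w * nb.predict_proba(X_disc) + (1 - w) * mlp.predict_proba(X_cont)
y_pred = posterior.argmax(axis=1)
print("training accuracy:", (y_pred == y).mean())
```

Because each model sees only its own attribute subset, each is fitted over a smaller space than a single model over all attributes would be, which is the source of the complexity reduction the abstract claims.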


References

  1. Lin, W.M., Lin, C.H., Tsay, M.X.: Transformer-fault diagnosis by integrating field data and standard codes with training enhancible adaptive probabilistic network. IEE Proceedings - Generation, Transmission and Distribution 152, 335–341 (2005)

  2. Tseng, C.L., Chen, Y.H., Xu, Y.Y., Pao, H.T., Fu, H.-C.: A self-growing probabilistic decision-based neural network with automatic data clustering. Neurocomputing 61, 21–38 (2004)

  3. Steinder, M., Sethi, A.S.: Probabilistic fault localization in communication systems using belief networks. IEEE/ACM Transactions on Networking 12, 809–822 (2004)

  4. Specht, D.F.: Probabilistic neural networks. Neural Networks 3, 109–118 (1990)

  5. Kononenko, I.: Semi-naive Bayesian classifier. In: Kodratoff, Y. (ed.) EWSL 1991. LNCS, vol. 482, pp. 206–219. Springer, Heidelberg (1991)

  6. Langley, P., Iba, W., Thompson, K.: An analysis of Bayesian classifiers. In: Proceedings of AAAI 1992, pp. 223–228 (1992)

  7. Friedman, N., Geiger, D., Goldszmidt, M.: Bayesian network classifiers. Machine Learning 29, 131–163 (1997)

  8. Pazzani, M.J., Keogh, E.J.: Learning augmented Bayesian classifiers: a comparison of distribution-based and classification-based approaches. In: Proceedings of the Seventh International Workshop on Artificial Intelligence and Statistics, pp. 225–230 (1999)

  9. Hendler, J.: Developing hybrid symbolic/connectionist models. In: Advances in Connectionist and Neural Computation Theory, pp. 165–179 (1991)

  10. Sun, R., Soolanan, S.L.A.: Working Notes of the AAAI Workshop on Integrating Neural and Symbolic Processes, pp. 205–217 (1992)

  11. Friedman, N., Goldszmidt, M., Lee, T.J.: Bayesian network classification with continuous attributes: getting the best of both discretization and parametric fitting. In: Proceedings of the International Conference on Machine Learning, pp. 179–187 (1998)

  12. Dougherty, J.: Supervised and unsupervised discretization of continuous features. In: Proceedings of the 12th International Conference on Machine Learning, pp. 194–201 (1995)

  13. Chow, C.K., Liu, C.N.: Approximating discrete probability distributions with dependence trees. IEEE Transactions on Information Theory 14, 462–467 (1968)

  14. Hunter, A.: Feature selection using probabilistic neural networks. Neural Computing and Applications 9, 124–132 (2000)



Copyright information

© 2006 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Limin, W. (2006). Learning Bayesian-Neural Network from Mixed-Mode Data. In: King, I., Wang, J., Chan, LW., Wang, D. (eds) Neural Information Processing. ICONIP 2006. Lecture Notes in Computer Science, vol 4232. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11893028_76


  • DOI: https://doi.org/10.1007/11893028_76

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-46479-2

  • Online ISBN: 978-3-540-46480-8

  • eBook Packages: Computer Science (R0)
