
A Bayesian Classifier by Using the Adaptive Construct Algorithm of the RBF Networks

  • Conference paper
Advances in Neural Networks – ISNN 2004 (ISNN 2004)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 3173)


Abstract

In this paper we propose a Bayesian classifier for multiclass problems based on merging RBF networks. The probability density function (PDF) is estimated with a Gaussian mixture model whose parameters are updated by the expectation-maximization (EM) algorithm. The centers and variances of the RBF networks are gradually updated, and basis units are merged, by supervised gradient descent on the error energy function. These algorithms construct the RBF networks and reduce the number of basis units. Experimental results show the validity of our method, which requires fewer basis units and clearly outperforms the conventional RBF learning technique.
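
The procedure described in the abstract can be pictured with a short sketch. The Python code below is a minimal illustration under stated assumptions, not the authors' algorithm: it fits one Gaussian mixture per class by EM (via scikit-learn's GaussianMixture), classifies with Bayes' rule, and reduces the number of basis units by merging components whose centers lie within a threshold. The class name GMMBayesClassifier and the parameters n_components and merge_tol are hypothetical; the paper's supervised gradient-descent update of centers and variances is replaced here by a simpler distance-based merge.

import numpy as np
from sklearn.mixture import GaussianMixture

class GMMBayesClassifier:
    """Per-class Gaussian mixtures fitted by EM, combined with Bayes' rule."""

    def __init__(self, n_components=5, merge_tol=0.5):
        self.n_components = n_components  # initial basis units per class (assumed value)
        self.merge_tol = merge_tol        # center-distance threshold for merging (assumed value)

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.priors_, self.models_ = [], []
        for c in self.classes_:
            Xc = X[y == c]
            gmm = GaussianMixture(n_components=self.n_components,
                                  covariance_type='diag').fit(Xc)
            self._merge_close_units(gmm)           # crude stand-in for the paper's merging step
            self.models_.append(gmm)
            self.priors_.append(len(Xc) / len(X))  # class prior P(c)
        return self

    def _merge_close_units(self, gmm):
        # Merge Gaussian units whose centers are closer than merge_tol by pooling
        # their mixing weights and taking weight-averaged means and variances.
        w, mu, var = gmm.weights_, gmm.means_, gmm.covariances_
        keep = np.ones(len(w), dtype=bool)
        for i in range(len(w)):
            if not keep[i]:
                continue
            for j in range(i + 1, len(w)):
                if keep[j] and np.linalg.norm(mu[i] - mu[j]) < self.merge_tol:
                    tot = w[i] + w[j]
                    mu[i] = (w[i] * mu[i] + w[j] * mu[j]) / tot
                    var[i] = (w[i] * var[i] + w[j] * var[j]) / tot
                    w[i], keep[j] = tot, False
        gmm.weights_, gmm.means_, gmm.covariances_ = w[keep], mu[keep], var[keep]
        gmm.weights_ = gmm.weights_ / gmm.weights_.sum()
        gmm.n_components = int(keep.sum())
        # keep the cached Cholesky precisions consistent with the merged diagonal covariances
        gmm.precisions_cholesky_ = 1.0 / np.sqrt(gmm.covariances_)

    def predict(self, X):
        # Bayes' rule: choose the class maximizing log p(x | c) + log P(c).
        log_post = np.column_stack([m.score_samples(X) + np.log(p)
                                    for m, p in zip(self.models_, self.priors_)])
        return self.classes_[np.argmax(log_post, axis=1)]

A call such as GMMBayesClassifier().fit(X_train, y_train).predict(X_test) then yields Bayes-rule predictions; note that the paper merges units through supervised gradient descent on the error energy function, whereas this sketch uses a fixed distance threshold purely for illustration.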




Copyright information

© 2004 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Jiang, M., Liu, D., Deng, B., Gielen, G. (2004). A Bayesian Classifier by Using the Adaptive Construct Algorithm of the RBF Networks. In: Yin, FL., Wang, J., Guo, C. (eds) Advances in Neural Networks – ISNN 2004. ISNN 2004. Lecture Notes in Computer Science, vol 3173. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-28647-9_144


  • DOI: https://doi.org/10.1007/978-3-540-28647-9_144

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-22841-7

  • Online ISBN: 978-3-540-28647-9

