Minimal RBF Networks by Gaussian Mixture Model

  • Conference paper
Advances in Intelligent Computing (ICIC 2005)

Part of the book series: Lecture Notes in Computer Science ((LNTCS,volume 3644))


Abstract

Radial basis function (RBF) networks have been successfully applied to function interpolation and classification problems, among others. In this paper, we propose a basis function optimization method using a mixture density model. We generalize the Gaussian radial basis functions to arbitrary covariance matrices in order to fully utilize the Gaussian probability density function, and we pursue a parsimonious network topology through a systematic procedure. According to experimental results, the proposed method achieved correct classification rates fairly comparable to the conventional approach while using fewer hidden-layer nodes.
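The core idea in the abstract can be sketched in a few lines: each hidden unit is a Gaussian basis function with a full (rather than spherical) covariance matrix, and the output layer is fit by linear least squares on the hidden activations. The sketch below is illustrative only, not the paper's algorithm: it uses toy data, and it estimates each unit's mean and covariance from per-class sample statistics as a stand-in for the EM-fitted Gaussian mixture the paper uses; all variable names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-class data: one Gaussian blob per class (hypothetical example).
X0 = rng.normal([0.0, 0.0], 0.5, size=(50, 2))
X1 = rng.normal([2.0, 2.0], 0.5, size=(50, 2))
X = np.vstack([X0, X1])
y = np.hstack([np.zeros(50), np.ones(50)])

def make_unit(data):
    """One hidden unit: mean and inverse full covariance of its data.
    In the paper's setting these parameters would instead come from
    EM on a Gaussian mixture model."""
    mu = data.mean(axis=0)
    cov = np.cov(data.T) + 1e-6 * np.eye(data.shape[1])  # regularize
    return mu, np.linalg.inv(cov)

units = [make_unit(X0), make_unit(X1)]

def hidden(X):
    """Activations phi_j(x) = exp(-0.5 (x-mu_j)^T Sigma_j^{-1} (x-mu_j)),
    i.e. Gaussian RBFs generalized to arbitrary covariance matrices."""
    cols = []
    for mu, prec in units:
        d = X - mu
        cols.append(np.exp(-0.5 * np.einsum("ni,ij,nj->n", d, prec, d)))
    return np.column_stack(cols)

# Output weights by linear least squares on hidden activations (+ bias).
H = np.hstack([hidden(X), np.ones((len(X), 1))])
w, *_ = np.linalg.lstsq(H, y, rcond=None)
pred = (H @ w > 0.5).astype(int)
acc = (pred == y).mean()
print(f"training accuracy: {acc:.2f}")
```

Using full covariance matrices lets a single unit cover an elongated, rotated cluster that would otherwise require several spherical units, which is what makes the smaller ("minimal") hidden layer possible.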



Copyright information

© 2005 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Ahn, S.M., Baik, S. (2005). Minimal RBF Networks by Gaussian Mixture Model. In: Huang, D.-S., Zhang, X.-P., Huang, G.-B. (eds) Advances in Intelligent Computing. ICIC 2005. Lecture Notes in Computer Science, vol 3644. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11538059_95

  • DOI: https://doi.org/10.1007/11538059_95

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-28226-6

  • Online ISBN: 978-3-540-31902-3

