Abstract
Radial basis function (RBF) networks have been successfully applied to function interpolation and classification problems, among others. In this paper, we propose a basis function optimization method using a mixture density model. We generalize the Gaussian radial basis functions to arbitrary covariance matrices in order to fully exploit the Gaussian probability density function. We also pursue a parsimonious network topology through a systematic procedure. In our experiments, the proposed method achieved correct classification rates comparable to the conventional approach while using fewer hidden-layer nodes.
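The idea described in the abstract — using the components of a fitted Gaussian mixture, with full (arbitrary) covariance matrices, as the hidden-layer basis functions of an RBF classifier — can be sketched as follows. This is a minimal numpy illustration under assumed settings (toy two-blob data, two mixture components, output weights by linear least squares), not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy data: two anisotropic Gaussian blobs, one per class.
X0 = rng.multivariate_normal([0, 0], [[1.0, 0.8], [0.8, 1.0]], 100)
X1 = rng.multivariate_normal([3, 3], [[1.0, -0.6], [-0.6, 1.0]], 100)
X = np.vstack([X0, X1])
T = np.vstack([np.tile([1, 0], (100, 1)), np.tile([0, 1], (100, 1))])  # one-hot

def gauss(X, m, S):
    """Multivariate Gaussian density with full covariance S."""
    d = X.shape[1]
    diff = X - m
    expo = -0.5 * np.einsum('ij,jk,ik->i', diff, np.linalg.inv(S), diff)
    return np.exp(expo) / np.sqrt((2 * np.pi) ** d * np.linalg.det(S))

def em_gmm(X, k, iters=50):
    """Fit a full-covariance Gaussian mixture by EM."""
    n, d = X.shape
    mu = X[rng.choice(n, k, replace=False)]
    cov = np.array([np.cov(X.T) + 1e-3 * np.eye(d)] * k)
    pi = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: responsibilities of each component for each point
        r = np.stack([pi[j] * gauss(X, mu[j], cov[j]) for j in range(k)], axis=1)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: update weights, means, full covariances
        nk = r.sum(axis=0)
        pi = nk / n
        mu = (r.T @ X) / nk[:, None]
        for j in range(k):
            diff = X - mu[j]
            cov[j] = (r[:, j, None] * diff).T @ diff / nk[j] + 1e-6 * np.eye(d)
    return pi, mu, cov

# Hidden layer: each mixture component is one (non-spherical) basis function.
pi, mu, cov = em_gmm(X, k=2)
Phi = np.stack([gauss(X, mu[j], cov[j]) for j in range(2)], axis=1)

# Output layer: weights solved by linear least squares on one-hot targets.
W, *_ = np.linalg.lstsq(Phi, T, rcond=None)
pred = (Phi @ W).argmax(axis=1)
acc = (pred == np.r_[np.zeros(100), np.ones(100)]).mean()
```

Because each basis function carries its own covariance matrix, a single component can model an elongated or rotated cluster that would otherwise require several spherical Gaussians — which is what allows the smaller hidden layer.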
© 2005 Springer-Verlag Berlin Heidelberg
Cite this paper
Ahn, S.M., Baik, S. (2005). Minimal RBF Networks by Gaussian Mixture Model. In: Huang, DS., Zhang, XP., Huang, GB. (eds) Advances in Intelligent Computing. ICIC 2005. Lecture Notes in Computer Science, vol 3644. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11538059_95
Print ISBN: 978-3-540-28226-6
Online ISBN: 978-3-540-31902-3