Abstract
We address the problem of architecture selection for an RBF network designed for classification. Given a training set, the RBF network produces an estimate of the Probability Density Function (PDF) as a mixture of l uncorrelated Gaussian functions, where l is the number of hidden neurons. Using uncorrelated Gaussians alleviates the heavy computational burden of estimating full covariance matrices. However, the simplicity of such building blocks comes at a price: a relatively large number of units is needed to approximate the density of correlated data. We define two scalar parameters that describe the complexity of the data to be modelled and study the relationship between the complexity of the data and the complexity of the best approximating network.
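The density model described above can be sketched in a few lines. The following is a minimal, hypothetical illustration (not the authors' implementation) of a PDF estimate built as a weighted mixture of l Gaussians with diagonal (uncorrelated) covariance, which is what each hidden unit of such an RBF network contributes; the function and parameter names are assumptions for exposition.

```python
import math

def diag_gaussian(x, mu, var):
    """Density of a multivariate Gaussian with diagonal covariance:
    since the dimensions are uncorrelated, the density factorises
    over coordinates, so only d variances are stored per unit
    instead of a full d-by-d covariance matrix."""
    norm = math.prod(1.0 / math.sqrt(2.0 * math.pi * v) for v in var)
    expo = -0.5 * sum((xi - mi) ** 2 / v for xi, mi, v in zip(x, mu, var))
    return norm * math.exp(expo)

def rbf_mixture_pdf(x, weights, means, variances):
    """PDF estimate as a mixture of l uncorrelated Gaussians,
    one per hidden unit; the mixing weights should sum to 1."""
    return sum(w * diag_gaussian(x, mu, var)
               for w, mu, var in zip(weights, means, variances))
```

For correlated data, no single diagonal Gaussian can align with the principal axes of the density, so the sum above needs many components to tile the correlated region; this trade-off between per-unit simplicity and the number of units l is the subject of the paper.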
Copyright information
© 1996 Springer-Verlag Berlin Heidelberg
Cite this paper
Sardo, L., Kittler, J. (1996). Asymptotic complexity of an RBF NN for correlated data representation. In: von der Malsburg, C., von Seelen, W., Vorbrüggen, J.C., Sendhoff, B. (eds) Artificial Neural Networks — ICANN 96. ICANN 1996. Lecture Notes in Computer Science, vol 1112. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-61510-5_16
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-61510-1
Online ISBN: 978-3-540-68684-2