
Asymptotic complexity of an RBF NN for correlated data representation

  • Oral Presentations: Theory III: Generalization I
  • Conference paper
Artificial Neural Networks — ICANN 96 (ICANN 1996)

Part of the book series: Lecture Notes in Computer Science ((LNCS,volume 1112))


Abstract

We address here the problem of architecture selection for an RBF network designed for classification purposes. Given a training set, the RBF network produces an estimate of the Probability Density Function (PDF) in terms of a mixture of l uncorrelated Gaussian functions, where l is the number of hidden neurons. Using uncorrelated Gaussians alleviates the heavy computational burden of estimating the full covariance matrix. However, the simplicity of such building blocks is paid for with the relatively large number of units needed to approximate the density of correlated data. We define two scalar parameters describing the complexity of the data to be modelled, and study the relationship between the complexity of the data and the complexity of the best approximating network.
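The density model the abstract describes can be sketched as follows. This is not the authors' implementation, only a minimal NumPy illustration of a PDF estimate built from a mixture of diagonal-covariance (uncorrelated) Gaussians, one per hidden RBF unit; the function names (`diag_gaussian_pdf`, `rbf_density`) and the toy parameters are illustrative, not from the paper.

```python
import numpy as np

def diag_gaussian_pdf(x, mu, var):
    """Gaussian density with a diagonal covariance (uncorrelated dimensions).

    With uncorrelated Gaussians each unit needs only O(d) parameters and
    arithmetic, versus O(d^2) for a full covariance matrix -- the trade-off
    the paper studies.
    """
    norm = np.prod(2.0 * np.pi * var) ** -0.5
    return norm * np.exp(-0.5 * np.sum((x - mu) ** 2 / var, axis=-1))

def rbf_density(x, weights, mus, vars_):
    """PDF estimate of the RBF network: a mixture of l diagonal Gaussians,
    one per hidden unit, with mixing weights summing to 1."""
    return sum(w * diag_gaussian_pdf(x, m, v)
               for w, m, v in zip(weights, mus, vars_))

# Toy example: two hidden units covering correlated 2-D data. Because each
# unit is axis-aligned, several units must be strung along the correlation
# direction -- hence correlated data inflates the required number of units l.
mus = [np.array([-1.0, -1.0]), np.array([1.0, 1.0])]
vars_ = [np.array([0.5, 0.5]), np.array([0.5, 0.5])]
weights = [0.5, 0.5]
p = rbf_density(np.array([0.0, 0.0]), weights, mus, vars_)
```

Note the design point: each diagonal-Gaussian unit is cheap, but a single full-covariance Gaussian aligned with the data's correlation could replace several of them, which is exactly the data-complexity versus network-complexity trade-off the paper quantifies.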




Editor information

Christoph von der Malsburg, Werner von Seelen, Jan C. Vorbrüggen, Bernhard Sendhoff


Copyright information

© 1996 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Sardo, L., Kittler, J. (1996). Asymptotic complexity of an RBF NN for correlated data representation. In: von der Malsburg, C., von Seelen, W., Vorbrüggen, J.C., Sendhoff, B. (eds) Artificial Neural Networks — ICANN 96. ICANN 1996. Lecture Notes in Computer Science, vol 1112. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-61510-5_16


  • DOI: https://doi.org/10.1007/3-540-61510-5_16

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-61510-1

  • Online ISBN: 978-3-540-68684-2

  • eBook Packages: Springer Book Archive
