
Nonlinear Function Learning Using Radial Basis Function Networks: Convergence and Rates

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 5097)

Abstract

We apply normalized radial basis function (RBF) networks to the problem of learning nonlinear regression functions. The network parameters are learned by empirical risk minimization and complexity regularization. We study the convergence of these networks for various radial kernels as the number of training samples increases, and we examine the rates of convergence.
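For readers unfamiliar with the model, a normalized RBF network computes f(x) = sum_i w_i K(||x - c_i|| / sigma) / sum_j K(||x - c_j|| / sigma), so the kernel responses form a partition of unity. The sketch below is illustrative only, not the authors' procedure: it fits the output weights of such a network by empirical risk minimization (a linear least-squares problem) with a Gaussian kernel on synthetic data. The helper normalized_rbf, the center grid, and the width value are assumptions made for the example, and the paper's complexity regularization over network size is omitted.

    import numpy as np

    def normalized_rbf(x, centers, width):
        # Squared distances between each input and each center.
        r2 = ((x[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
        k = np.exp(-r2 / (2.0 * width ** 2))     # Gaussian radial kernel (illustrative choice)
        return k / k.sum(axis=1, keepdims=True)  # normalization: each row sums to 1

    rng = np.random.default_rng(0)
    X = rng.uniform(-1.0, 1.0, size=(200, 1))                      # synthetic training inputs
    Y = np.sin(3.0 * X[:, 0]) + 0.1 * rng.standard_normal(200)     # noisy regression targets

    centers = np.linspace(-1.0, 1.0, 10)[:, None]  # illustrative grid of centers
    Phi = normalized_rbf(X, centers, width=0.3)

    # Empirical risk minimization over the output weights w:
    # minimize (1/n) * sum_i (f(X_i) - Y_i)^2, a linear least-squares problem.
    w, *_ = np.linalg.lstsq(Phi, Y, rcond=None)

    X_test = np.linspace(-1.0, 1.0, 100)[:, None]
    f_hat = normalized_rbf(X_test, centers, width=0.3) @ w         # fitted regression estimate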




Editor information

Leszek Rutkowski, Ryszard Tadeusiewicz, Lotfi A. Zadeh, Jacek M. Zurada


Copyright information

© 2008 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Krzyżak, A., Schäfer, D. (2008). Nonlinear Function Learning Using Radial Basis Function Networks: Convergence and Rates. In: Rutkowski, L., Tadeusiewicz, R., Zadeh, L.A., Zurada, J.M. (eds) Artificial Intelligence and Soft Computing – ICAISC 2008. Lecture Notes in Computer Science (LNAI), vol. 5097. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-69731-2_11


  • DOI: https://doi.org/10.1007/978-3-540-69731-2_11

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-69572-1

  • Online ISBN: 978-3-540-69731-2

  • eBook Packages: Computer Science (R0)
