Abstract
We study strong universal consistency and rates of convergence of nonlinear regression function learning algorithms based on normalized radial basis function networks. The network parameters, including the centers, covariance matrices, and synaptic weights, are trained by empirical risk minimization. We derive convergence rates for networks whose parameters are learned by complexity regularization.
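The estimator described above can be illustrated with a minimal sketch. In the paper all parameters (centers, covariance matrices, synaptic weights) are trained jointly by empirical risk minimization; as a simplification, the sketch below fixes Gaussian centers and widths on a grid and minimizes the empirical squared risk over the synaptic weights only, which reduces to a linear least-squares problem. All function names and the toy data here are our own, not from the paper.

```python
import numpy as np

def nrbf_basis(x, centers, widths):
    """Normalized Gaussian radial basis functions.

    Returns the n-by-m matrix with (j, i) entry
    K_i(x_j) / sum_l K_l(x_j), where K_i(x) = exp(-||x - c_i||^2 / (2 s_i^2)).
    Each row sums to one, which is the defining property of the
    normalized RBF network.
    """
    d2 = ((x[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    k = np.exp(-d2 / (2.0 * widths[None, :] ** 2))
    return k / k.sum(axis=1, keepdims=True)

def fit_nrbf(x, y, centers, widths):
    """Empirical risk minimization over the synaptic weights only:
    minimize the average squared error (1/n) sum_j (f(x_j) - y_j)^2,
    solved exactly by linear least squares."""
    b = nrbf_basis(x, centers, widths)
    w, *_ = np.linalg.lstsq(b, y, rcond=None)
    return w

def predict_nrbf(x, centers, widths, weights):
    """Network output f(x) = sum_i w_i K_i(x) / sum_l K_l(x)."""
    return nrbf_basis(x, centers, widths) @ weights

# Toy usage: recover a smooth 1-d regression function from noisy samples.
rng = np.random.default_rng(0)
x = rng.uniform(-3.0, 3.0, size=(200, 1))
y = np.sin(x[:, 0]) + 0.1 * rng.standard_normal(200)
centers = np.linspace(-3.0, 3.0, 15)[:, None]  # fixed grid of centers
widths = np.full(15, 0.6)                      # fixed common width
w = fit_nrbf(x, y, centers, widths)
mse = np.mean((predict_nrbf(x, centers, widths, w) - y) ** 2)
```

In the paper the centers and covariance matrices are also free parameters of the empirical risk minimization, and complexity regularization selects the network size; the grid-and-least-squares shortcut here is only meant to make the normalized-RBF form concrete.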
Copyright information
© 2006 Springer-Verlag Berlin Heidelberg
Cite this paper
Krzyżak, A., Schäfer, D. (2006). Nonlinear Function Learning by the Normalized Radial Basis Function Networks. In: Rutkowski, L., Tadeusiewicz, R., Zadeh, L.A., Żurada, J.M. (eds) Artificial Intelligence and Soft Computing – ICAISC 2006. Lecture Notes in Computer Science, vol 4029. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11785231_6
DOI: https://doi.org/10.1007/11785231_6
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-35748-3
Online ISBN: 978-3-540-35750-6
eBook Packages: Computer Science (R0)