Abstract
We discuss two kernel-based learning methods: Regularization Networks (RN) and Radial Basis Function (RBF) networks. RNs are derived from regularization theory, have been studied thoroughly from a function-approximation point of view, and possess a sound theoretical background. RBF networks represent a model of artificial neural networks with both neuro-physiological and mathematical motivation; in addition, they may be treated as a generalized form of Regularization Networks. We demonstrate the performance of both approaches in experiments on benchmark and real-life learning tasks. We claim that RN and RBF networks are comparable in terms of generalization error, but they differ with respect to model complexity: the RN approach usually leads to solutions with a higher number of basis units, so RBF networks can be used as a 'cheaper' alternative. This makes RBF networks suitable for modeling tasks with large amounts of data, such as time series prediction or semantic web classification.
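The Regularization Network solution described in the abstract can be sketched as follows (a minimal illustration under stated assumptions, not the authors' implementation: the Gaussian kernel, its width, and the regularization parameter lambda are choices made for the example). The RN places one kernel unit at every training point and obtains its coefficients from the regularized linear system (K + lambda*I)c = y.

```python
import numpy as np

def gaussian_kernel(X, Z, width=1.0):
    """Gaussian (RBF) kernel matrix between row-vector sets X and Z."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * width ** 2))

def fit_rn(X, y, lam=1e-6, width=0.2):
    """Regularization Network: one unit per training point,
    coefficients from (K + lam*I) c = y."""
    K = gaussian_kernel(X, X, width)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def predict_rn(X_train, c, X_new, width=0.2):
    """f(x) = sum_i c_i K(x, x_i) over all training points."""
    return gaussian_kernel(X_new, X_train, width) @ c

# Toy 1-D regression: note the RN needs as many units as data points.
X = np.linspace(0.0, 1.0, 20).reshape(-1, 1)
y = np.sin(2.0 * np.pi * X).ravel()
c = fit_rn(X, y)
y_hat = predict_rn(X, c, X)
```

An RBF network, by contrast, would use far fewer centers (chosen or learned from the data) and fit the same kind of model, which is the source of the 'cheaper' model complexity the abstract refers to.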
Copyright information
© 2005 Springer-Verlag Berlin Heidelberg
Cite this paper
Kudová, P., Neruda, R. (2005). Kernel Based Learning Methods: Regularization Networks and RBF Networks. In: Winkler, J., Niranjan, M., Lawrence, N. (eds) Deterministic and Statistical Methods in Machine Learning. DSMML 2004. Lecture Notes in Computer Science(), vol 3635. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11559887_8
Print ISBN: 978-3-540-29073-5
Online ISBN: 978-3-540-31728-9