Abstract
Radial basis function networks (RBFs) are efficient general function approximators. They show good generalization performance and are easy to train. Due to theoretical considerations, RBFs commonly use Gaussian activation functions. It has been shown that these tight restrictions on the choice of possible activation functions can be relaxed in practical applications. As an alternative, differences of sigmoidal functions (SRBFs) have been proposed. SRBFs have an additional parameter which increases the ability of a network node to adapt its shape to input patterns, even in cases where Gaussian functions fail.
In this paper we follow the idea of incorporating greater flexibility into radial basis functions. We propose to use splines as localized deformable radial basis functions (DRBFs). We present initial results which show that DRBFs can be evaluated more efficiently than SRBFs. We show that even with this enhanced flexibility the network remains easy to train and converges robustly towards smooth solutions.
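For context, the classical Gaussian RBF network that the paper takes as its baseline can be sketched as follows. This is an illustrative sketch only; the function names and the two-node toy configuration are our own and do not come from the paper.

```python
import math

def gaussian_rbf(x, center, width):
    """Gaussian basis function: phi(x) = exp(-||x - c||^2 / (2 * sigma^2))."""
    sq_dist = sum((xi - ci) ** 2 for xi, ci in zip(x, center))
    return math.exp(-sq_dist / (2.0 * width ** 2))

def rbf_network(x, centers, widths, weights, bias=0.0):
    """Classical RBF network output: a weighted sum of Gaussian nodes."""
    return bias + sum(w * gaussian_rbf(x, c, s)
                      for w, c, s in zip(weights, centers, widths))

# Toy example: a two-node network on a 1-D input.
centers = [[0.0], [1.0]]
widths = [0.5, 0.5]
weights = [1.0, -1.0]
y = rbf_network([0.0], centers, widths, weights)
```

Each node responds only near its center, which is the locality property that both SRBFs and the proposed spline-based DRBFs preserve while adding shape flexibility that a fixed Gaussian lacks.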
Copyright information
© 2007 Springer-Verlag Berlin Heidelberg
Cite this paper
Hübner, W., Mallot, H.A. (2007). Deformable Radial Basis Functions. In: de Sá, J.M., Alexandre, L.A., Duch, W., Mandic, D. (eds) Artificial Neural Networks – ICANN 2007. ICANN 2007. Lecture Notes in Computer Science, vol 4668. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-74690-4_42
DOI: https://doi.org/10.1007/978-3-540-74690-4_42
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-74689-8
Online ISBN: 978-3-540-74690-4