Abstract
Numerous studies have addressed nonlinear functional approximation by multilayer perceptrons (MLPs) and RBF networks as a special case of the more general mapping problem. The performance of both these supervised network models depends intimately on the efficiency of their learning process. This paper presents an unsupervised recurrent neural network, based on the recurrent Mean Field Theory (MFT) network model, that finds a least-squares approximation to an arbitrary L2 function, given a set of Gaussian radially symmetric basis functions (RBFs). Essential to this approach is the reformulation of RBF approximation as a constrained optimisation problem. A new concept of adiabatic network organisation is introduced. Together with an adaptive mechanism of temperature control, this allows the network to build a hierarchical multiresolution approximation while preserving the global optimisation characteristics. A revised problem mapping results in a position-invariant local interconnectivity pattern, which makes the network attractive for electronic implementation. The dynamics and performance of the network are illustrated by numerical simulation.
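The underlying problem the network solves can be illustrated directly: a least-squares fit of an L2 function by a fixed set of Gaussian RBFs. The sketch below is illustrative only, not the paper's network; the target function, centre positions, and width are hypothetical choices, and the weights are obtained by a standard linear least-squares solve rather than by the recurrent MFT dynamics described in the paper.

```python
import numpy as np

# Hypothetical 1-D illustration: least-squares approximation of an L2
# function (here sin(2*pi*x)) by fixed Gaussian radially symmetric
# basis functions. All numerical choices are assumptions for the demo.

x = np.linspace(0.0, 1.0, 200)          # sample points
y = np.sin(2.0 * np.pi * x)             # target function f in L2

centres = np.linspace(0.0, 1.0, 12)     # fixed RBF centres (assumed)
width = 0.1                             # common Gaussian width (assumed)

# Design matrix: Phi[i, j] = exp(-(x_i - c_j)^2 / (2 * width^2))
Phi = np.exp(-((x[:, None] - centres[None, :]) ** 2)
             / (2.0 * width ** 2))

# Least-squares weights: the solution a converged network would
# have to reproduce for this basis.
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)

# Quality of the approximation in the (discrete) L2 sense.
residual = np.linalg.norm(Phi @ w - y) / np.linalg.norm(y)
print(f"relative L2 residual: {residual:.2e}")
```

With a dozen basis functions of this width, the relative residual is small, which is the baseline any iterative or network-based solver of the same problem is measured against.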
Truyen, B., Langloh, N. & Cornelis, J. An adiabatic neural network for RBF approximation. Neural Comput & Applic 2, 69–88 (1994). https://doi.org/10.1007/BF01414351