Abstract
The paper considers Ito's results on the approximation capability of layered neural networks with sigmoid units in two layers. It first recalls one of Ito's main results, and then extends Ito's results concerning the Heaviside function as a sigmoid function to the signum function. For Heaviside functions, a layered neural network implementation is presented that is also valid for signum functions. The focus of the paper is the implementation of Ito's approximators as four-layer feed-forward neural networks.
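The flavour of such approximation results can be illustrated with a minimal sketch (not taken from the paper): a continuous function on an interval is approximated by a linear combination of shifted Heaviside units, i.e. a single hidden layer of step neurons. The helper `step_approximator`, the grid of thresholds, and the choice of test function are assumptions made here for illustration only; note that the signum function serves equally well, since sign(x) = 2·H(x) − 1.

```python
import numpy as np

def heaviside(x):
    # Step activation: 1 for x >= 0, else 0
    return np.where(x >= 0.0, 1.0, 0.0)

def step_approximator(f, a, b, n):
    """Illustrative helper (not from the paper): approximate f on [a, b] by
    g(x) = f(a) + sum_i c_i * H(x - t_i),
    a staircase built from the increments of f on a uniform grid."""
    points = np.linspace(a, b, n + 1)
    t = points[1:]                    # shift thresholds of the hidden units
    vals = f(points)
    c = np.diff(vals)                 # output weights = increments of f
    def g(x):
        x = np.atleast_1d(np.asarray(x, dtype=float))
        return vals[0] + heaviside(x[:, None] - t[None, :]) @ c
    return g

# Example: 200 step units approximating sin on [0, pi]
f = np.sin
g = step_approximator(f, 0.0, np.pi, 200)
xs = np.linspace(0.0, np.pi, 1000)
err = np.max(np.abs(g(xs) - f(xs)))   # bounded by the grid spacing here
```

The uniform error of this staircase is bounded by the modulus of continuity of `f` over one grid cell, which is the elementary mechanism behind density results of this kind; the paper's contribution concerns sharper statements (e.g. approximation without scaling) and their realisation as four-layer networks.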
References
Cardaliaguet P. and Euvrard G. 1992. Approximation of a function and its derivatives with a neural network. Neural Networks 5, 207–220.
Cybenko G. 1989. Approximation by superpositions of a sigmoidal function. Math. Control Signals Systems 2, 303–314.
Funahashi K. 1989. On the approximate realisation of continuous mappings by neural networks. Neural Networks 2, 183–192.
Girosi F. and Poggio T. 1989. Representation properties of networks: Kolmogorov's theorem is irrelevant. Neural Computation 1, 456–469.
Hecht-Nielsen R. 1987. Kolmogorov's mapping neural network existence theorem. Proc. IEEE First Int. Conf. on Neural Networks III, 11–13.
Hecht-Nielsen R. 1989. Theory of the back propagation neural network. Proc. IJCNN '89 I, 593–605.
Hornik K. 1991. Approximation capabilities of multilayer feedforward networks. Neural Networks 4, 251–257.
Hornik K., Stinchcombe M. and White H. 1989. Multilayer feedforward networks are universal approximators. Neural Networks 2, 359–366.
Hornik K., Stinchcombe M. and White H. 1990. Universal approximation of an unknown mapping and its derivatives using multilayer feedforward networks. Neural Networks 3, 551–560.
Ito Y. 1991a. Representation of functions by superposition of a step or sigmoid function and their applications to neural network theory. Neural Networks 4, 385–394.
Ito Y. 1991b. Approximation of functions on a compact set by finite sums of a sigmoid function without scaling. Neural Networks 4, 817–826.
Ito Y. 1992. Approximation of continuous functions of R by linear combinations of shifted rotations of a sigmoid function with and without scaling. Neural Networks 5, 105–115.
Ito Y. 1993. Approximations of differentiable functions and their derivatives on compact sets by neural networks. Math. Scient. 18, 11–19.
Ito Y. 1994. Approximation capability of layered neural networks with sigmoid units on two layers. Neural Computation 6, 1233–1243.
Kolmogorov A. N. 1957. On the representations of continuous functions of many variables by superpositions of continuous functions of one variable and addition. Dokl. Akad. Nauk USSR 114(5), 953–956.
Kurkova V. 1991. Kolmogorov's theorem is relevant. Neural Computation 3, 617–622.
Kurkova V. 1992. Kolmogorov's theorem and multilayer neural networks. Neural Networks 5, 501–506.
Stinchcombe M. and White H. 1989. Universal approximation using feedforward networks with non-sigmoid hidden layer activation functions. Proc. IJCNN '89 I, 613–617.
Copyright information
© 1997 Springer-Verlag Berlin Heidelberg
Cite this paper
Ciuca, I., Ware, J.A. (1997). Layered neural networks as universal approximators. In: Reusch, B. (eds) Computational Intelligence Theory and Applications. Fuzzy Days 1997. Lecture Notes in Computer Science, vol 1226. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-62868-1_133
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-62868-2
Online ISBN: 978-3-540-69031-3