Abstract
The aim of this contribution is to continue our investigations of a special family of hyperbolic-type linear operators (here, for compactly supported continuous functions on ℝⁿ) which can immediately be interpreted as concrete real-time realizations of three-layer feedforward neural networks with sigma-pi units in the hidden layer. To indicate how these results are connected with density results, we begin with some introductory theorems on this topic. Moreover, we take a detailed look at the complexity the generated neural networks require in order to achieve global ε-accuracy.
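For orientation, a sigma-pi unit applies a sigmoidal activation to a weighted sum of products of selected input coordinates, and a three-layer feedforward network of the kind mentioned above combines such hidden units through a linear output layer. The following minimal Python sketch illustrates only this standard sigma-pi definition; it is not the hyperbolic operators constructed in the paper, and all names and numbers are invented for the example.

    import numpy as np

    def sigmoid(t):
        # Standard logistic sigmoidal activation.
        return 1.0 / (1.0 + np.exp(-t))

    def sigma_pi_unit(x, weights, index_sets, threshold=0.0):
        # One sigma-pi unit: sigma( sum_k w_k * prod_{i in I_k} x_i - theta ).
        s = sum(w * np.prod(x[list(I)]) for w, I in zip(weights, index_sets))
        return sigmoid(s - threshold)

    def three_layer_net(x, hidden_params, output_weights):
        # Input layer -> sigma-pi hidden layer -> single linear output unit.
        h = [sigma_pi_unit(x, w, I, th) for (w, I, th) in hidden_params]
        return float(np.dot(output_weights, h))

    # Tiny usage example on a point in R^2 (all values illustrative):
    x = np.array([0.5, -1.0])
    hidden = [([1.0, 2.0], [(0,), (0, 1)], 0.1),   # unit with two product terms
              ([0.5],      [(1,)],         -0.2)]  # unit with one linear term
    print(three_layer_net(x, hidden, np.array([0.3, -0.7])))

The product terms over index sets I_k are what distinguish sigma-pi units from ordinary sigmoidal units, which use only a weighted sum of the raw inputs.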
Cite this article
Lenze, B. Hyperbolic sigma-pi neural network operators for compactly supported continuous functions. Adv Comput Math 5, 163–172 (1996). https://doi.org/10.1007/BF02124741