
Hyperbolic sigma-pi neural network operators for compactly supported continuous functions

Published in: Advances in Computational Mathematics

Abstract

The aim of this contribution is to continue our investigations of a special family of hyperbolic-type linear operators (here, for compactly supported continuous functions on ℝⁿ), which can be interpreted directly as concrete real-time realizations of three-layer feedforward neural networks with sigma-pi units in the hidden layer. To show how these results are connected with density results, we begin with some introductory theorems on this topic. Moreover, we take a detailed look at the complexity of the generated neural networks required to achieve global ε-accuracy.
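To make the architecture mentioned in the abstract concrete: a sigma-pi unit computes a weighted sum ("sigma") of products ("pi") of selected input coordinates, passed through an activation; a three-layer network then combines such hidden units linearly. The sketch below is a minimal illustration under these general definitions, with a hyperbolic tangent activation chosen to match the "hyperbolic" theme; all names, shapes, and weight values are hypothetical and not taken from the paper.

```python
import math

def sigma_pi_unit(x, weights, index_sets, activation=math.tanh):
    """Sigma-pi unit: weighted sum of products of selected inputs,
    passed through an activation (here tanh, a hyperbolic choice)."""
    s = sum(w * math.prod(x[i] for i in idx)
            for w, idx in zip(weights, index_sets))
    return activation(s)

def sigma_pi_network(x, hidden_units, output_weights):
    """Three-layer feedforward net: inputs -> sigma-pi hidden layer
    -> single linear output unit."""
    h = [sigma_pi_unit(x, w, idx) for (w, idx) in hidden_units]
    return sum(c * hj for c, hj in zip(output_weights, h))

# Illustrative example: two inputs; the first hidden unit uses the
# monomials x0 and x0*x1, the second uses x1 alone.
x = [0.5, -0.25]
hidden = [
    ([1.0, 0.5], [(0,), (0, 1)]),  # 1.0*x0 + 0.5*(x0*x1)
    ([0.8], [(1,)]),               # 0.8*x1
]
y = sigma_pi_network(x, hidden, output_weights=[1.0, -2.0])
```

The point of the sketch is only the structural correspondence: each hidden unit evaluates one hyperbolic-type term, and approximation results for such operators translate into size/complexity statements about the generated network.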



Cite this article

Lenze, B. Hyperbolic sigma-pi neural network operators for compactly supported continuous functions. Adv Comput Math 5, 163–172 (1996). https://doi.org/10.1007/BF02124741
