
Uniform Approximation Capabilities of Sum-of-Product and Sigma-Pi-Sigma Neural Networks

Conference paper, in: Advances in Neural Networks – ISNN 2007 (ISNN 2007)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 4491)

Included in the conference series: International Symposium on Neural Networks (ISNN)

Abstract

This paper investigates the uniform approximation capabilities of sum-of-product (SOPNN) and sigma-pi-sigma (SPSNN) neural networks. It is proved that the set of functions generated by an SOPNN with activation function in \(C(\mathbb{R})\) is dense in \(C(\mathbb{K})\) for any compact set \(\mathbb{K}\subset \mathbb{R}^N\) if and only if the activation function is not a polynomial. It is also shown that if the activation function of an SPSNN is in \(C(\mathbb{R})\), then the functions generated by the SPSNN are dense in \(C(\mathbb{K})\) if and only if the activation function is not a constant.
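To make the two architectures concrete, below is a minimal numerical sketch of the function classes named in the abstract, assuming the parameterizations commonly used in the SOPNN and SPSNN literature: an SOPNN output is taken as a sum of products of one-dimensional activations of the input coordinates, and an SPSNN output as a sum of products whose factors are themselves weighted sums of such activations. The names sopnn and spsnn, the sigmoid choice for the activation, and all weight shapes are illustrative assumptions rather than the paper's notation; the theorems only require the activation to be continuous and, for the SOPNN case, non-polynomial (non-constant for the SPSNN case).

```python
# Hypothetical sketch of the assumed network forms (not the authors' code).
import numpy as np


def sigma(t):
    # Example non-polynomial continuous activation (sigmoid); the density
    # results only require sigma in C(R) with the stated non-degeneracy.
    return 1.0 / (1.0 + np.exp(-t))


def sopnn(x, W, B):
    """Assumed sum-of-product form:
       y = sum_k prod_j sigma(W[k, j] * x[j] + B[k, j]),
    with x in R^N and W, B of shape (K, N): K product units,
    each multiplying N one-dimensional activations."""
    return np.sum(np.prod(sigma(W * x + B), axis=1))


def spsnn(x, W, B, C):
    """Assumed sigma-pi-sigma form: each product factor is itself a
    weighted sum (an inner sigma layer) of M activations,
       y = sum_k prod_j sum_i C[k, j, i] * sigma(W[k, j, i] * x[j] + B[k, j, i]),
    with W, B, C of shape (K, N, M)."""
    inner = np.sum(C * sigma(W * x[:, None] + B), axis=2)  # (K, N) inner sums
    return np.sum(np.prod(inner, axis=1))                  # pi, then outer sigma


# Tiny usage example at a point of the compact set [0, 1]^3.
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, size=3)
print(sopnn(x, rng.normal(size=(4, 3)), rng.normal(size=(4, 3))))
print(spsnn(x, rng.normal(size=(4, 3, 2)), rng.normal(size=(4, 3, 2)),
            rng.normal(size=(4, 3, 2))))
```

With a fixed non-polynomial sigma, the density statement says that finite sums of this product form can approximate any continuous target on a compact set to arbitrary uniform accuracy, given enough product units K (and, for the SPSNN, enough inner terms M).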





Copyright information

© 2007 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Long, J., Wu, W., Nan, D. (2007). Uniform Approximation Capabilities of Sum-of-Product and Sigma-Pi-Sigma Neural Networks. In: Liu, D., Fei, S., Hou, ZG., Zhang, H., Sun, C. (eds) Advances in Neural Networks – ISNN 2007. ISNN 2007. Lecture Notes in Computer Science, vol 4491. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-72383-7_130


  • DOI: https://doi.org/10.1007/978-3-540-72383-7_130

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-72382-0

  • Online ISBN: 978-3-540-72383-7

