Abstract
This paper investigates the uniform approximation capabilities of sum-of-product (SOPNN) and sigma-pi-sigma (SPSNN) neural networks. It is proved that the set of functions generated by an SOPNN with activation function in \(C(\mathbb{R})\) is dense in \(C(\mathbb{K})\) for any compact \(\mathbb{K}\subset \mathbb{R}^N\) if and only if the activation function is not a polynomial. It is also shown that if the activation function of an SPSNN is in \(C(\mathbb{R})\), then the functions generated by the SPSNN are dense in \(C(\mathbb{K})\) if and only if the activation function is not a constant.
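To make the class of functions under discussion concrete, the following is a minimal sketch of a generic sum-of-product evaluation, \(f(x)=\sum_k c_k \prod_{j=1}^{N} \sigma(w_{kj}x_j+\theta_{kj})\). The parameter names (`W`, `Theta`, `c`) and the choice of `tanh` as activation are illustrative assumptions, not the paper's notation or a specific network from the cited SOPNN literature.

```python
import numpy as np

def sopnn(x, W, Theta, c, sigma=np.tanh):
    """Evaluate a generic sum-of-product network at x in R^N.

    f(x) = sum_k c_k * prod_j sigma(W[k, j] * x[j] + Theta[k, j])

    Parameter names and the tanh default are illustrative assumptions.
    """
    x = np.asarray(x, dtype=float)
    # One activation per (product unit, coordinate) pair; x broadcasts
    # across the K rows of W and Theta, giving shape (K, N).
    a = sigma(W * x + Theta)
    # Multiply within each product unit, then take the weighted sum.
    return float(c @ np.prod(a, axis=1))

# Tiny usage example: K = 2 product units over N = 3 inputs.
rng = np.random.default_rng(0)
K, N = 2, 3
W = rng.standard_normal((K, N))
Theta = rng.standard_normal((K, N))
c = rng.standard_normal(K)
y = sopnn(rng.standard_normal(N), W, Theta, c)
```

The density result in the abstract says that, over a compact set, such finite sums of products can uniformly approximate any continuous target exactly when `sigma` is not a polynomial.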
Copyright information
© 2007 Springer-Verlag Berlin Heidelberg
Cite this paper
Long, J., Wu, W., Nan, D. (2007). Uniform Approximation Capabilities of Sum-of-Product and Sigma-Pi-Sigma Neural Networks. In: Liu, D., Fei, S., Hou, ZG., Zhang, H., Sun, C. (eds) Advances in Neural Networks – ISNN 2007. ISNN 2007. Lecture Notes in Computer Science, vol 4491. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-72383-7_130
Print ISBN: 978-3-540-72382-0
Online ISBN: 978-3-540-72383-7