Abstract
Real-valued functions of several Boolean variables can be represented by one-hidden-layer Heaviside perceptron networks with an exponential number of hidden units. We derive upper bounds on the error of approximation by networks with a given number n of hidden units. The bounds are of the form \(\frac{c}{\sqrt{n}}\), where c depends on certain norms of the function being approximated. We give examples of functions for which these norms grow polynomially, and examples for which they grow exponentially, with increasing input dimension.
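The exact representation mentioned above can be illustrated by one standard construction (a sketch, not taken from the paper): each of the \(2^d\) hidden Heaviside units is tuned to fire on exactly one vertex of the Boolean cube, and its output weight is the function value at that vertex.

```python
import itertools

def heaviside(t):
    # Heaviside step activation: 1 for t >= 0, else 0.
    return 1.0 if t >= 0 else 0.0

def exact_network(f, d):
    """Build a one-hidden-layer Heaviside perceptron network that
    represents f: {0,1}^d -> R exactly, using 2**d hidden units."""
    units = []
    for u in itertools.product([0, 1], repeat=d):
        # Input weights +1 where u_i = 1, -1 where u_i = 0:
        # the inner product is maximized (value = sum(u)) only at x = u.
        v = [2 * ui - 1 for ui in u]
        b = -(sum(u) - 0.5)          # unit fires iff x equals u
        units.append((v, b, f(u)))   # output weight = function value at u
    def net(x):
        return sum(w * heaviside(sum(vi * xi for vi, xi in zip(v, x)) + b)
                   for v, b, w in units)
    return net

# Example: a parity-like real-valued Boolean function on {0,1}^3.
f = lambda u: (-1) ** sum(u)
net = exact_network(f, 3)
```

The network reproduces f exactly on all 8 input vectors; the paper's bounds concern how well much smaller networks (n hidden units instead of \(2^d\)) can approximate such functions.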
© 1998 Springer-Verlag Wien
Hlaváčková, K., Kůrková, V., Savický, P. (1998). Upper Bounds on the Approximation Rates of Real-valued Boolean Functions by Neural Networks. In: Artificial Neural Nets and Genetic Algorithms. Springer, Vienna. https://doi.org/10.1007/978-3-7091-6492-1_109
Print ISBN: 978-3-211-83087-1
Online ISBN: 978-3-7091-6492-1