Abstract
In this paper we have described new results on the complexity of computing dichotomies on sets of examples, in particular bounds on the number of units in the hidden layers. Traditionally, the number of units is bounded by functions of the number of examples. We have introduced a new parameter: the distance between the classes. These two parameters are complementary, and it is still unknown whether other parameters could be used. The bounds that we derived are not tight and should be improved.
We have also shown that the use of a second hidden layer can reduce the total number of hidden units. What can be proved if more layers are added? More generally, the relationship between the capabilities of multilayer artificial neural networks and their numbers of layers and hidden units is still a completely open problem.
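To make the units-vs-examples theme concrete, here is a toy 1-D sketch (not the construction from the paper): a single-hidden-layer threshold network that realizes an arbitrary dichotomy of n labelled points on the line by placing one hidden unit at each label transition, so the hidden-layer size is bounded by n − 1 and, in favorable cases (few transitions, i.e. well-separated classes), is much smaller. The function name `build_dichotomy_net` and the midpoint-threshold placement are illustrative choices, not the authors' method.

```python
import numpy as np

def heaviside(z):
    # Threshold activation: 1 if z >= 0, else 0.
    return (z >= 0).astype(float)

def build_dichotomy_net(x, y):
    """Build a one-hidden-layer threshold network realizing the
    dichotomy y (labels in {0, 1}) on the 1-D sample x.

    One hidden unit is placed at the midpoint of each label
    transition, so the hidden-unit count is at most len(x) - 1.
    (Illustrative construction, not the one from the paper.)
    """
    order = np.argsort(x)
    xs, ys = x[order], y[order]
    thresholds, signs = [], []
    for i in range(len(xs) - 1):
        if ys[i] != ys[i + 1]:
            thresholds.append((xs[i] + xs[i + 1]) / 2.0)
            # +1 when the label switches to 1, -1 when it drops to 0.
            signs.append(1.0 if ys[i + 1] == 1 else -1.0)
    thresholds = np.array(thresholds)
    signs = np.array(signs)
    bias = float(ys[0])  # network output before the first threshold

    def net(t):
        # Hidden layer: one threshold unit per transition.
        h = heaviside(np.subtract.outer(np.atleast_1d(t), thresholds))
        # Output unit: running label = bias plus the signed flips crossed.
        return heaviside(h @ signs + bias - 0.5)

    return net, len(thresholds)
```

For five points labelled 0,1,1,0,1 there are three transitions, hence three hidden units instead of five; the count depends on how the classes interleave, which is one way to see why a separation-based parameter can beat a bound depending only on the number of examples.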
This article was processed using the LaTeX macro package with LMAMULT style.
© 1992 Springer-Verlag Berlin Heidelberg
Cosnard, M., Koiran, P., Paugam-Moisy, H. (1992). Complexity issues in neural network computations. In: Simon, I. (eds) LATIN '92. LATIN 1992. Lecture Notes in Computer Science, vol 583. Springer, Berlin, Heidelberg. https://doi.org/10.1007/BFb0023854
Print ISBN: 978-3-540-55284-0
Online ISBN: 978-3-540-47012-0