Abstract
Neural networks have attracted much attention recently as a powerful tool for automatic learning. Of particular interest is the class of recurrent networks, which allow loops and cycles and thus give rise to dynamical systems, to flexible behavior, and to computation. This paper reviews recent findings that mathematically quantify the computational power and dynamic capabilities of recurrent neural networks. The network's appeal as a possible standard model of analog computation is also discussed.
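As a minimal illustration of the "dynamical system" view (the weights and sizes below are my own, not taken from the chapter), a discrete-time recurrent network updates its state as x(t+1) = σ(W x(t) + V u(t) + b), where σ is a saturated-linear activation of the kind used in the Siegelmann–Sontag analog computation model:

```python
# Sketch of a recurrent network as a discrete-time dynamical system:
# x(t+1) = sigma(W @ x(t) + V * u(t) + b).
# All weights here are illustrative assumptions, not from the chapter.

def sigma(z):
    # Saturated-linear ("piecewise-linear") activation: clip z to [0, 1].
    return max(0.0, min(1.0, z))

def step(x, u, W, V, b):
    # One synchronous update of all neurons given scalar input u.
    n = len(x)
    return [sigma(sum(W[i][j] * x[j] for j in range(n)) + V[i] * u + b[i])
            for i in range(n)]

def run(x0, inputs, W, V, b):
    # Iterate the dynamics over an input sequence; return the trajectory.
    traj = [x0]
    x = x0
    for u in inputs:
        x = step(x, u, W, V, b)
        traj.append(x)
    return traj

if __name__ == "__main__":
    # Two neurons: the first accumulates (and eventually saturates on) the
    # input stream, the second copies the first neuron's previous value.
    W = [[1.0, 0.0],
         [1.0, 0.0]]
    V = [1.0, 0.0]
    b = [0.0, 0.0]
    traj = run([0.0, 0.0], [0.25, 0.25, 0.25], W, V, b)
    print(traj[-1])  # final state after three input steps
```

The feedback loop (the state x re-entering the update) is exactly what distinguishes recurrent networks from feedforward ones and makes them capable of flexible, history-dependent behavior.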
© 1995 Springer-Verlag Berlin Heidelberg
Siegelmann, H.T. (1995). Recurrent neural networks. In: van Leeuwen, J. (eds) Computer Science Today. Lecture Notes in Computer Science, vol 1000. Springer, Berlin, Heidelberg. https://doi.org/10.1007/BFb0015235
Print ISBN: 978-3-540-60105-0
Online ISBN: 978-3-540-49435-5