Abstract
We introduce a model of nondeterministic hybrid recurrent neural networks: networks composed of Boolean input and output cells together with internal sigmoid neurons, whose synaptic weights may evolve over time in a nondeterministic manner. When subjected to an infinite input stream and a specific synaptic evolution, such a network necessarily exhibits some attractor dynamics in its Boolean output cells, and accordingly recognizes a specific neural \(\omega \)-language. The expressive power of these networks is measured via the topological complexity of their underlying neural \(\omega \)-languages. In this context, we prove that the two models of rational-weighted and real-weighted nondeterministic hybrid neural networks are computationally equivalent, and recognize precisely the set of all analytic neural \(\omega \)-languages. They are therefore strictly more expressive than nondeterministic Büchi and Muller Turing machines.
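For concreteness, sigmoid recurrent networks in this line of work (in the style of Siegelmann and Sontag) are typically governed by a first-order update rule; the following is a sketch under that assumption, with time-dependent weights standing in for the evolving synapses described above (the precise form of the update and of the evolution set \(E\) is as defined in the paper):

```latex
% Sketch (assumption): first-order sigmoid update with evolving weights.
% x_i(t): activation of internal cell i; u_j(t): Boolean input;
% sigma: the sigmoid activation function.
\[
  x_i(t+1) \;=\; \sigma\!\Big( \sum_{j=1}^{N} a_{ij}(t)\, x_j(t)
      \;+\; \sum_{j=1}^{M} b_{ij}(t)\, u_j(t) \;+\; c_i(t) \Big),
  \qquad i = 1, \dots, N,
\]
```

where a nondeterministic evolution selects, over time, the sequence of weight assignments \(\big(a_{ij}(t), b_{ij}(t), c_i(t)\big)_{t \ge 0}\) from the set \(E\) of admissible evolutions; the deterministic case corresponds to \(E\) being a singleton.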
Notes
- 1.
By contrast, a deterministic Ev-RNN has only one possible evolution for its synaptic weights, and hence corresponds to a nondeterministic Ev-RNN where the set E is reduced to a singleton.
- 2.
The results of the paper hold equally true even with E taken as \(\mathbf {\Pi ^0_2}\).
- 3.
We recall that the preimage by a Baire class 1 function of a set in \(\mathbf {\Sigma ^0_n}\) (resp. \(\mathbf {\Pi ^0_n}\)) is in \(\mathbf {\Sigma ^0_{n+1}}\) (resp. \(\mathbf {\Pi ^0_{n+1}}\)).
Copyright information
© 2015 Springer International Publishing Switzerland
Cite this paper
Cabessa, J., Duparc, J. (2015). Expressive Power of Non-deterministic Evolving Recurrent Neural Networks in Terms of Their Attractor Dynamics. In: Calude, C., Dinneen, M. (eds) Unconventional Computation and Natural Computation. UCNC 2015. Lecture Notes in Computer Science(), vol 9252. Springer, Cham. https://doi.org/10.1007/978-3-319-21819-9_10
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-21818-2
Online ISBN: 978-3-319-21819-9
eBook Packages: Computer Science, Computer Science (R0)