Abstract
The echo-state condition names an upper limit for the hidden-layer connectivity in recurrent neural networks. If the network is below this limit, there is an injective, continuous mapping from the recent input history to the internal state of the network; above it, the network becomes chaotic, and the dependence on the initial state may never be washed out. I focus on the biological relevance of echo state networks with a critical connectivity strength at the separating line between these two regimes, and discuss some related biological findings; in particular, there is evidence that the neural connectivity in cortical slices is tuned to a critical level. In addition, I propose a model that makes use of a special learning mechanism within the recurrent layer and the input connectivity. Results show that, after adaptation, traces of single unexpected events indeed persist in the network over a longer-than-exponential time scale.
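The washing-out of the initial state below the echo-state limit can be checked numerically. The following is a minimal sketch, not the paper's model: it assumes a standard echo state network with tanh units whose recurrent weight matrix is scaled to a spectral radius below one (the usual sub-critical regime), drives two different initial states with the same input sequence, and observes that the state difference shrinks.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100                                   # reservoir size (arbitrary choice)
W = rng.normal(size=(N, N))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))  # scale spectral radius to 0.9 (< 1)
W_in = rng.normal(size=N)                 # input weights

def run(x, inputs):
    """Iterate the reservoir state under a given input sequence."""
    for u in inputs:
        x = np.tanh(W @ x + W_in * u)
    return x

inputs = rng.normal(size=200)             # one shared input sequence
x_a = run(rng.normal(size=N), inputs)     # two different initial states,
x_b = run(rng.normal(size=N), inputs)     # same input drive
diff = np.linalg.norm(x_a - x_b)          # should be small: the initial
                                          # state has been washed out
```

Scaling the spectral radius above one instead would typically leave `diff` of the same order as the initial state difference, illustrating the regime in which the initial condition is not forgotten.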
© 2014 Springer International Publishing Switzerland
Mayer, N.M. (2014). Adaptive Critical Reservoirs with Power Law Forgetting of Unexpected Input Sequences. In: Wermter, S., et al. Artificial Neural Networks and Machine Learning – ICANN 2014. ICANN 2014. Lecture Notes in Computer Science, vol 8681. Springer, Cham. https://doi.org/10.1007/978-3-319-11179-7_7
DOI: https://doi.org/10.1007/978-3-319-11179-7_7
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-11178-0
Online ISBN: 978-3-319-11179-7