Adaptive Critical Reservoirs with Power Law Forgetting of Unexpected Input Sequences

  • Conference paper
Artificial Neural Networks and Machine Learning – ICANN 2014 (ICANN 2014)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 8681)

Abstract

The echo-state condition defines an upper limit for the hidden-layer connectivity strength in recurrent neural networks. Below this limit there is an injective, continuous mapping from the recent input history to the internal state of the network. Above it the network becomes chaotic, and the dependence on its initial state may never be washed out. I focus on the biological relevance of echo state networks whose connectivity strength is tuned to the critical value at the separation line between these two regimes and discuss related biological findings; in particular, there is evidence that the neural connectivity in cortical slices is tuned to a critical level. In addition, I propose a model that uses a special learning mechanism acting on both the recurrent layer and the input connectivity. Results show that, after adaptation, traces of single unexpected events indeed persist in the network for longer than an exponential decay would allow.
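The two regimes named in the abstract can be illustrated with a toy echo state network. The sketch below is not the paper's adaptive critical model; all names and parameter values are illustrative assumptions. It builds a random reservoir, rescales the recurrent weight matrix to a chosen spectral radius (the commonly used connectivity-strength criterion), and checks empirically whether two different initial states driven by the same input sequence converge (echo state regime, spectral radius below about 1) or keep diverging (chaotic regime, well above 1).

    import numpy as np

    # Minimal echo-state reservoir sketch (illustrative only, not the paper's
    # adaptive critical model). The recurrent weight matrix is rescaled to a
    # chosen spectral radius; values below ~1 correspond to the echo state
    # regime, values well above ~1 to the chaotic regime.
    rng = np.random.default_rng(0)

    N = 200                                 # reservoir size (assumed)
    spectral_radius = 0.95                  # try e.g. 0.95 vs. 1.5
    W = rng.standard_normal((N, N)) / np.sqrt(N)
    W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))
    W_in = rng.standard_normal((N, 1))      # single scalar input channel

    def step(x, u):
        """One reservoir update with tanh nonlinearity."""
        return np.tanh(W @ x + (W_in * u).ravel())

    # Drive two different initial states with the same input sequence and
    # measure whether their difference is washed out.
    x_a = rng.standard_normal(N)
    x_b = rng.standard_normal(N)
    for t in range(500):
        u = np.sin(0.1 * t)
        x_a, x_b = step(x_a, u), step(x_b, u)

    print("state distance after 500 steps:", np.linalg.norm(x_a - x_b))

Near the critical value itself (spectral radius of roughly 1), the wash-out becomes very slow; this boundary regime is, loosely, the one the paper argues is biologically relevant.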

Copyright information

© 2014 Springer International Publishing Switzerland

About this paper

Cite this paper

Mayer, N.M. (2014). Adaptive Critical Reservoirs with Power Law Forgetting of Unexpected Input Sequences. In: Wermter, S., et al. Artificial Neural Networks and Machine Learning – ICANN 2014. ICANN 2014. Lecture Notes in Computer Science, vol 8681. Springer, Cham. https://doi.org/10.1007/978-3-319-11179-7_7

  • DOI: https://doi.org/10.1007/978-3-319-11179-7_7

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-11178-0

  • Online ISBN: 978-3-319-11179-7
