Abstract
Reservoir computing provides a promising approach to the efficient training of recurrent neural networks by exploiting the computational properties of a fixed reservoir. Various approaches have been proposed, ranging from suitable initialization to reservoir optimization by training. In this paper we take a closer look at short-term memory capacity, introduced by Jaeger in the case of echo state networks. Memory capacity has recently been investigated with respect to criticality, the so-called edge of chaos, where the network switches from a stable to an unstable dynamic regime. We calculate the memory capacity of the networks for various input data sets, both random and structured, and show how the data distribution affects network performance. We also investigate the effect of reservoir sparsity in this context.
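The quantity studied here, Jaeger's short-term memory capacity MC = Σ_k MC_k, can be sketched in a few lines of code. The following is an illustrative reconstruction, not the paper's actual setup: reservoir size, spectral radius, input scaling, washout length, and the ridge regularization constant are all assumed values, and the input is i.i.d. uniform noise. Each MC_k is the squared correlation between the input delayed by k steps and a linear readout trained to reconstruct it.

```python
# Hedged sketch of short-term memory capacity (MC) for an echo state
# network; all parameter values below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

N = 50             # reservoir size (assumed)
T = 5000           # training time steps after washout
washout = 200
max_delay = 2 * N  # delays k = 1 .. max_delay

# Random reservoir, rescaled to spectral radius 0.9 (stable regime)
W = rng.standard_normal((N, N))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))
w_in = rng.uniform(-0.1, 0.1, N)

u = rng.uniform(-1.0, 1.0, washout + T)  # i.i.d. input stream
X = np.zeros((T, N))
x = np.zeros(N)
for t in range(washout + T):
    x = np.tanh(W @ x + w_in * u[t])
    if t >= washout:
        X[t - washout] = x

# For each delay k, train a ridge-regression readout to reconstruct
# u(t - k); MC_k is the squared correlation between readout and target.
reg = 1e-8
A = X.T @ X + reg * np.eye(N)
mc = 0.0
for k in range(1, max_delay + 1):
    target = u[washout - k : washout - k + T]
    w_out = np.linalg.solve(A, X.T @ target)
    r = np.corrcoef(X @ w_out, target)[0, 1]
    if np.isfinite(r):
        mc += r * r

print(f"memory capacity ~ {mc:.1f} (theoretical bound N = {N})")
```

For a linear reservoir MC is bounded above by the reservoir size N; with the tanh nonlinearity used here, the measured value falls well below that bound, which is one reason the input distribution and operating regime matter.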
References
Bertschinger, N., Natschläger, T.: Real-time computation at the edge of chaos in recurrent neural networks. Neural Computation 16(7), 1413–1436 (2004)
Boedecker, J., Obst, O., Lizier, J., Mayer, N., Asada, M.: Information processing in echo state networks at the edge of chaos. Theory in Biosciences 131, 205–213 (2012)
Büsing, L., Schrauwen, B., Legenstein, R.: Connectivity, dynamics, and memory in reservoir computing with binary and analog neurons. Neural Computation 22(5), 1272–1311 (2010)
Hermans, M., Schrauwen, B.: Memory in linear recurrent neural networks in continuous time. Neural Networks 23, 341–355 (2010)
Huebner, U., Abraham, N., Weiss, C.: Dimensions and entropies of chaotic intensity pulsations in a single-mode far-infrared NH3 laser. Physical Review A 40(11), 6354–6365 (1989)
Jaeger, H.: Short term memory in echo state networks. Tech. Rep. GMD Report 152, German National Research Center for Information Technology (2002)
Jaeger, H.: Echo state network. Scholarpedia 2(9) (2007)
Legenstein, R., Maass, W.: What makes a dynamical system computationally powerful? In: New Directions in Statistical Signal Processing: From Systems to Brain, pp. 127–154. MIT Press (2007)
Lukoševičius, M., Jaeger, H.: Reservoir computing approaches to recurrent neural network training. Computer Science Review 3(3), 127–149 (2009)
Ozturk, M., Xu, C., Principe, J.: Analysis and design of echo state networks. Neural Computation 19, 111–138 (2006)
Rodan, A., Tiňo, P.: Minimum complexity echo state network. IEEE Transactions on Neural Networks 22(1), 131–144 (2011)
Schrauwen, B., Buesing, L., Legenstein, R.: On computational power and the order-chaos phase transition in reservoir computing. In: Advances in Neural Information Processing Systems, pp. 1425–1432 (2009)
Sprott, J.: Chaos and Time-Series Analysis. Oxford University Press (2003)
Verstraeten, D., Dambre, J., Dutoit, X., Schrauwen, B.: Memory versus non-linearity in reservoirs. In: International Joint Conference on Neural Networks, pp. 1–8 (2010)
White, O., Lee, D., Sompolinsky, H.: Short-term memory in orthogonal neural networks. Physical Review Letters 92(14), 148102 (2004)
Copyright information
© 2014 Springer International Publishing Switzerland
Cite this paper
Barančok, P., Farkaš, I. (2014). Memory Capacity of Input-Driven Echo State Networks at the Edge of Chaos. In: Wermter, S., et al. Artificial Neural Networks and Machine Learning – ICANN 2014. ICANN 2014. Lecture Notes in Computer Science, vol 8681. Springer, Cham. https://doi.org/10.1007/978-3-319-11179-7_6
Print ISBN: 978-3-319-11178-0
Online ISBN: 978-3-319-11179-7