
Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 11731)

Abstract

In this paper, we propose a model of ESNs that eliminates the critical dependence on hyper-parameters, resulting in networks that provably cannot enter a chaotic regime and, at the same time, exhibit nonlinear behaviour in phase space, characterised by a large memory of past inputs, comparable to that of linear networks. Our contribution is supported by experiments corroborating our theoretical findings, showing that the proposed model displays dynamics rich enough to approximate many common nonlinear systems used for benchmarking.
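The core idea the abstract describes can be sketched in a few lines: after each standard ESN update, the reservoir state is renormalised onto the unit hyper-sphere, so the state norm is fixed and the dynamics cannot diverge regardless of how the recurrent weights are scaled. The sketch below is illustrative only and assumes plain Gaussian weights and a scalar input; the paper's exact activation and scaling may differ.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 100                                         # reservoir size
W = rng.standard_normal((N, N)) / np.sqrt(N)    # recurrent weights (no spectral-radius tuning)
w_in = rng.standard_normal(N)                   # input weights

def update(x, u):
    """One reservoir step: linear pre-activation, then projection onto the unit hyper-sphere."""
    h = W @ x + w_in * u
    return h / np.linalg.norm(h)

# initial state on the sphere
x = rng.standard_normal(N)
x /= np.linalg.norm(x)

# drive the reservoir with a sinusoidal input
for u in np.sin(0.1 * np.arange(200)):
    x = update(x, u)

print(np.linalg.norm(x))  # 1.0 by construction: the state never leaves the sphere
```

Because the projection bounds the state for any weight scale, this kind of model sidesteps the usual spectral-radius hyper-parameter search while the direction of the state still evolves nonlinearly with the input history.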


Author information

Correspondence to Pietro Verzelli.

Copyright information

© 2019 Springer Nature Switzerland AG

About this paper

Cite this paper

Verzelli, P., Alippi, C., Livi, L. (2019). Hyper-spherical Reservoirs for Echo State Networks. In: Tetko, I., Kůrková, V., Karpov, P., Theis, F. (eds) Artificial Neural Networks and Machine Learning – ICANN 2019: Workshop and Special Sessions. ICANN 2019. Lecture Notes in Computer Science, vol 11731. Springer, Cham. https://doi.org/10.1007/978-3-030-30493-5_9

Download citation

  • DOI: https://doi.org/10.1007/978-3-030-30493-5_9

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-30492-8

  • Online ISBN: 978-3-030-30493-5

  • eBook Packages: Computer Science (R0)
