
Hierarchical Dynamics in Deep Echo State Networks

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 13531)

Abstract

Reservoir computing (RC) is a popular approach to the efficient design of recurrent neural networks (RNNs), in which the dynamical part of the model is initialized and left untrained. Deep echo state networks (ESNs) combine the deep learning approach with RC by structuring the reservoir in multiple layers, thus offering the striking advantage of encoding the input sequence on different time-scales. A key factor in the effectiveness of ESNs is the echo state property (ESP), which ensures the asymptotic stability of the reservoir dynamics. In this paper, we perform an in-depth theoretical analysis of the asymptotic dynamics of deep ESNs with different contractivity hierarchies, offering a more accurate sufficient condition for the ESP. We investigate how different contractivity hierarchies affect memory capacity and predictive performance in regression tasks, concluding that structuring the reservoir layers in decreasing contractivity is the best design choice. The results of this paper can potentially also be applied to the design of fully-trained RNNs.
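The layered design described in the abstract can be sketched in plain NumPy. This is an illustrative reconstruction, not the authors' code: the function names, the layer width, and the particular spectral radii `rhos=[0.9, 0.6, 0.3]` are assumptions, with the spectral radius of each recurrent matrix used as a simple proxy for that layer's contractivity and the decreasing ordering chosen as one reading of the hierarchies studied in the paper.

```python
import numpy as np

def init_reservoir(n_in, n_res, rho, scale_in=1.0, seed=0):
    """Initialize one untrained reservoir layer: random input weights and
    a recurrent matrix rescaled to spectral radius rho."""
    rng = np.random.default_rng(seed)
    W_in = rng.uniform(-scale_in, scale_in, (n_res, n_in))
    W = rng.uniform(-1.0, 1.0, (n_res, n_res))
    W *= rho / max(abs(np.linalg.eigvals(W)))  # rescale spectral radius
    return W_in, W

def deep_esn_states(u, rhos, n_res=50):
    """Drive an input sequence u (T x n_in) through stacked reservoirs;
    layer l > 1 receives the state of layer l-1 as its input."""
    layers, n_in = [], u.shape[1]
    for i, rho in enumerate(rhos):
        layers.append(init_reservoir(n_in, n_res, rho, seed=i))
        n_in = n_res
    states = [np.zeros(n_res) for _ in rhos]
    history = []
    for u_t in u:
        x_in = u_t
        for l, (W_in, W) in enumerate(layers):
            states[l] = np.tanh(W_in @ x_in + W @ states[l])
            x_in = states[l]
        history.append(np.concatenate(states))
    return np.asarray(history)

# Three layers with spectral radii decreasing along the hierarchy.
t = np.linspace(0, 10, 200)
u = np.sin(t).reshape(-1, 1)
X = deep_esn_states(u, rhos=[0.9, 0.6, 0.3])
print(X.shape)  # (200, 150): 200 time steps, 3 layers of 50 units
```

In an ESN the concatenated states `X` would then feed a linear readout trained by ridge regression; only that readout is trained, which is the efficiency advantage RC offers.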


Notes

  1. More details on uniform convergence requirements can be found in [24].

  2. The reservoir layers \(F^{(\ell )}\) defined in Sect. 2 have state space \(\mathcal {X}_F = \mathcal {X} = [-1,1]^{N_R}\), and input space either \(\mathcal {U}_F = \mathcal {U} \subset \mathbb {R}^{N_U}\) if \(\ell = 1\) or \(\mathcal {U}_F = \mathcal {X}\) if \(\ell > 1\).
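The state and input spaces in footnote 2 can be checked directly: with a tanh activation every reservoir state lies in \([-1,1]^{N_R}\), so the state of layer \(\ell\) is always a valid input for layer \(\ell+1\). A minimal sketch, with arbitrary (assumed) weights and dimensions:

```python
import numpy as np

# One reservoir layer F: U x X -> X. With tanh activation the state
# space is X = [-1, 1]^{N_R}; for layers l > 1 the input space is the
# previous layer's state space, i.e. U_F = X.
rng = np.random.default_rng(42)
N_U, N_R = 3, 20
W_in = rng.uniform(-1.0, 1.0, (N_R, N_U))
W = rng.uniform(-1.0, 1.0, (N_R, N_R))

def F(u, x):
    # tanh squashes every coordinate into (-1, 1)
    return np.tanh(W_in @ u + W @ x)

x = np.zeros(N_R)
for _ in range(100):
    x = F(rng.uniform(-1.0, 1.0, N_U), x)
print(np.all(np.abs(x) <= 1.0))  # True: the state never leaves X
```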

References

  1. Farmer, J.D.: Chaotic attractors of an infinite-dimensional dynamical system. Physica D Nonlinear Phenom. 4(3), 366–393 (1982). https://doi.org/10.1016/0167-2789(82)90042-2


  2. Gallicchio, C.: Short-term memory of deep RNN. In: Proceedings of the 26th European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning, ESANN 2018, pp. 633–638 (2018)


  3. Gallicchio, C., Micheli, A.: Architectural and Markovian factors of echo state networks. Neural Netw. 24(5), 440–456 (2011)


  4. Gallicchio, C., Micheli, A.: Echo state property of deep reservoir computing networks. Cogn. Comput. 9(3), 337–350 (2017). https://doi.org/10.1007/s12559-017-9461-9


  5. Gallicchio, C., Micheli, A.: Reservoir topology in deep echo state networks. In: Tetko, I.V., Kůrková, V., Karpov, P., Theis, F. (eds.) ICANN 2019. LNCS, vol. 11731, pp. 62–75. Springer, Cham (2019). https://doi.org/10.1007/978-3-030-30493-5_6


  6. Gallicchio, C., Micheli, A.: Richness of deep echo state network dynamics. In: Rojas, I., Joya, G., Catala, A. (eds.) IWANN 2019. LNCS, vol. 11506, pp. 480–491. Springer, Cham (2019). https://doi.org/10.1007/978-3-030-20521-8_40


  7. Gallicchio, C., Micheli, A., Pedrelli, L.: Deep reservoir computing: a critical experimental analysis. Neurocomputing 268, 87–99 (2017)


  8. Gallicchio, C., Micheli, A., Pedrelli, L.: Design of deep echo state networks. Neural Netw. 108, 33–47 (2018). https://doi.org/10.1016/j.neunet.2018.08.002


  9. Gallicchio, C., Micheli, A., Pedrelli, L.: Comparison between DeepESNs and gated RNNs on multivariate time-series prediction. In: Proceedings of the 27th European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning, ESANN 2019, pp. 619–624 (2019)


  10. Gallicchio, C., Micheli, A., Pedrelli, L.: Hierarchical temporal representation in linear reservoir computing. Smart Innov. Syst. Technol. 102, 119–129 (2019). https://doi.org/10.1007/978-3-319-95098-3_11

  11. Gallicchio, C., Micheli, A., Silvestri, L.: Local Lyapunov exponents of deep echo state networks. Neurocomputing 298, 34–45 (2018). https://doi.org/10.1016/j.neucom.2017.11.073


  12. Hammer, B., Tiňo, P.: Recurrent neural networks with small weights implement definite memory machines. Neural Comput. 15(8), 1897–1929 (2003). https://doi.org/10.1162/08997660360675080


  13. Jaeger, H.: The “echo state” approach to analysing and training recurrent neural networks, with an erratum note. Technical report 148, German National Research Institute for Computer Science (2001)


  14. Jaeger, H.: Short term memory in echo state networks. Technical report 152, German National Research Institute for Computer Science (2002)


  15. Jaeger, H., Haas, H.: Harnessing nonlinearity: predicting chaotic systems and saving energy in wireless communication. Science 304(5667), 78–80 (2004). https://doi.org/10.1126/science.1091277


  16. LeCun, Y., Bengio, Y., Hinton, G.: Deep learning. Nature 521(7553), 436–444 (2015). https://doi.org/10.1038/nature14539


  17. Lukoševičius, M.: A practical guide to applying echo state networks. In: Montavon, G., Orr, G.B., Müller, K.-R. (eds.) Neural Networks: Tricks of the Trade. LNCS, vol. 7700, pp. 659–686. Springer, Heidelberg (2012). https://doi.org/10.1007/978-3-642-35289-8_36


  18. Lukoševičius, M., Jaeger, H.: Reservoir computing approaches to recurrent neural network training. Comput. Sci. Rev. 3(3), 127–149 (2009). https://doi.org/10.1016/j.cosrev.2009.03.005


  19. Mackey, M.C., Glass, L.: Oscillation and chaos in physiological control systems. Science 197(4300), 287–289 (1977). https://doi.org/10.1126/science.267326


  20. Tanaka, G., et al.: Recent advances in physical reservoir computing: a review. Neural Netw. 115, 100–123 (2019)


  21. Verstraeten, D., Schrauwen, B., d’Haene, M., Stroobandt, D.: An experimental unification of reservoir computing methods. Neural Netw. 20(3), 391–403 (2007)


  22. Weigend, A.S.: Time Series Prediction: Forecasting the Future and Understanding the Past. Routledge (2018). https://doi.org/10.4324/9780429492648

  23. White, O.L., Lee, D.D., Sompolinsky, H.: Short-term memory in orthogonal neural networks. Phys. Rev. Lett. 92(14), 148102 (2004). https://doi.org/10.1103/physrevlett.92.148102


  24. Yildiz, I.B., Jaeger, H., Kiebel, S.J.: Re-visiting the echo state property. Neural Netw. 35, 1–9 (2012)



Acknowledgements

This work was partially funded by the project BrAID under the Bando Ricerca Salute 2018 - Regional public call for research and development projects aimed at supporting clinical and organizational innovation processes of the Regional Health Service - Regione Toscana.

Author information


Corresponding author

Correspondence to Domenico Tortorella.



Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Tortorella, D., Gallicchio, C., Micheli, A. (2022). Hierarchical Dynamics in Deep Echo State Networks. In: Pimenidis, E., Angelov, P., Jayne, C., Papaleonidas, A., Aydin, M. (eds.) Artificial Neural Networks and Machine Learning – ICANN 2022. Lecture Notes in Computer Science, vol 13531. Springer, Cham. https://doi.org/10.1007/978-3-031-15934-3_55

Download citation

  • DOI: https://doi.org/10.1007/978-3-031-15934-3_55


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-15933-6

  • Online ISBN: 978-3-031-15934-3

  • eBook Packages: Computer Science (R0)
