Continuously Deep Recurrent Neural Networks

  • Conference paper
  • In: Machine Learning and Knowledge Discovery in Databases. Research Track (ECML PKDD 2024)

Abstract

The architecture of multi-layer dynamic neural systems traditionally contains recurrent neurons organized in successive well-defined layers. In this paper, we introduce a new class of recurrent neural models based on a fundamentally different type of topological organization than the conventionally used deep recurrent networks, and directly inspired by the way cortical networks in the brain process information at multiple temporal scales. We explore the novel paradigm from the perspective of Reservoir Computing (RC), a popular approach to designing efficiently trainable recurrent neural networks, and introduce the Continuously-Deep Echo State Network (C-DESN). The proposed C-DESN architecture comprises a reservoir layer of untrained recurrent neurons connected in a biologically inspired exponentially decaying pattern based on distance. The depth of the resulting neural information processing system is modulated by a single depth hyperparameter that controls the extent of local connectivity. Mathematically, we analyze the dynamical stability properties of the continuously deep reservoir, providing a rigorous bound on the resulting eigenspectrum. Empirically, we show that the novel recurrent architecture is biased toward tunable temporal resolution processing in the same way as conventional deep recurrent neural networks. Additionally, our experiments on short-term memory capacity and real-world time-series reconstruction demonstrate how the depth hyperparameter of C-DESN can effectively regulate the temporal scale in the reservoir’s dynamic behavior.
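As a rough illustration of the idea described above, the following NumPy sketch builds a reservoir whose recurrent weights decay exponentially with the distance |i − j| between neurons, with a single depth hyperparameter controlling the extent of local connectivity. This is a minimal sketch under stated assumptions, not the paper's implementation: the function name `cdesn_reservoir`, the specific decay form `exp(-|i-j|/depth)`, and the spectral-radius rescaling are illustrative choices.

```python
import numpy as np

def cdesn_reservoir(n_units, depth, spectral_radius=0.9, seed=0):
    """Illustrative sketch (not the paper's code): an untrained reservoir
    whose weight magnitudes decay exponentially with neuron distance."""
    rng = np.random.default_rng(seed)
    base = rng.uniform(-1.0, 1.0, size=(n_units, n_units))
    idx = np.arange(n_units)
    dist = np.abs(idx[:, None] - idx[None, :])   # |i - j| distance matrix
    W = base * np.exp(-dist / depth)             # exponential decay with distance
    # Rescale so the largest eigenvalue modulus matches the target spectral radius,
    # a common stability heuristic in reservoir computing.
    rho = np.max(np.abs(np.linalg.eigvals(W)))
    return W * (spectral_radius / rho)

W = cdesn_reservoir(100, depth=5.0)
print(W.shape)  # (100, 100)
```

A small `depth` concentrates connectivity near the diagonal (shallow, fast dynamics), while a large `depth` approaches a dense reservoir, which is the sense in which a single hyperparameter can modulate effective depth.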

Notes

  1. The largest modulus of an eigenvalue of the matrix.
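The quantity defined in this note (the spectral radius) can be computed numerically; the following one-liner with NumPy is illustrative and not taken from the paper.

```python
import numpy as np

# Spectral radius = largest modulus of an eigenvalue.
A = np.array([[0.0, 1.0],
              [-2.0, 0.0]])
rho = np.max(np.abs(np.linalg.eigvals(A)))
print(rho)  # ≈ 1.4142, i.e. sqrt(2), since the eigenvalues are ±i·sqrt(2)
```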

Acknowledgments

This work has been supported by EU-EIC EMERGE (Grant No. 101070918), by NEURONE, a project funded by the Italian Ministry of University and Research (PRIN 20229JRTZA), by PNRR - M4C2 - Investimento 1.3, Partenariato Esteso PE00000013 - “FAIR - Future Artificial Intelligence Research” - Spoke 1 “Human-centered AI”, funded by the European Commission under the NextGeneration EU programme, and by the project BrAID under the Bando Ricerca Salute 2018 - Regional public call for research and development projects aimed at supporting clinical and organisational innovation processes of the Regional Health Service - Regione Toscana.

Author information

Corresponding author

Correspondence to Andrea Ceni.

Copyright information

© 2024 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Ceni, A., Dominey, P.F., Gallicchio, C., Micheli, A., Pedrelli, L., Tortorella, D. (2024). Continuously Deep Recurrent Neural Networks. In: Bifet, A., Davis, J., Krilavičius, T., Kull, M., Ntoutsi, E., Žliobaitė, I. (eds) Machine Learning and Knowledge Discovery in Databases. Research Track. ECML PKDD 2024. Lecture Notes in Computer Science, vol. 14947. Springer, Cham. https://doi.org/10.1007/978-3-031-70368-3_4

  • DOI: https://doi.org/10.1007/978-3-031-70368-3_4

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-70367-6

  • Online ISBN: 978-3-031-70368-3

  • eBook Packages: Computer Science (R0)
