Abstract
Reservoir computing is a framework that uses the non-linear internal dynamics of a recurrent neural network to perform complex non-linear transformations of its input. This enables reservoirs to carry out a variety of tasks involving time-dependent or sequential signals. Reservoirs are particularly suited to tasks that require memory or the handling of temporal sequences, which are common in areas such as speech recognition, time-series prediction, and signal processing. Learning is restricted to the output layer and can be thought of as “reading out” or “selecting from” the states of the reservoir. With all but the output weights fixed, reservoirs avoid the costly and difficult training associated with deep neural networks. However, while the reservoir computing framework shows considerable promise in terms of efficiency and capability, it can be unreliable: existing studies show that small changes in hyperparameters can markedly affect a network’s performance. Here we studied the role of network topology in reservoir computing on three conceptually different tasks: working memory, perceptual decision making, and chaotic time-series prediction. We implemented three network topologies (ring, lattice, and random) and tested reservoir performance on each task. We then used the algebraic-topological tools of directed simplicial cliques to study deeper connections between network topology and function, comparing performance across topologies and linking our findings with existing reservoir research.
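To make the architecture concrete, the following is a minimal sketch of a leaky echo state network in Python/NumPy with the three reservoir topologies named in the abstract (ring, lattice, and random) and a ridge-regression readout trained on a toy memory task. It is illustrative only: the function names, leak rate, spectral radius, connection density, and the toy task are assumptions of this sketch, not the authors' implementation or their benchmark tasks. The point it demonstrates is the one the abstract makes: only the readout weights are learned, while the reservoir and input weights stay fixed.

import numpy as np

def make_reservoir(n, topology="random", k=4, density=0.1, spectral_radius=0.9, seed=0):
    """Build a reservoir weight matrix with a given topology, rescaled to a target spectral radius."""
    rng = np.random.default_rng(seed)
    W = np.zeros((n, n))
    if topology == "ring":
        # each unit feeds its successor around a directed ring
        for i in range(n):
            W[(i + 1) % n, i] = rng.uniform(-1, 1)
    elif topology == "lattice":
        # one-dimensional lattice: connections to the k nearest neighbours on each side
        for i in range(n):
            for d in range(1, k + 1):
                W[(i + d) % n, i] = rng.uniform(-1, 1)
                W[(i - d) % n, i] = rng.uniform(-1, 1)
    else:
        # sparse random (Erdos-Renyi style) connectivity
        mask = rng.random((n, n)) < density
        W[mask] = rng.uniform(-1, 1, size=mask.sum())
    radius = np.max(np.abs(np.linalg.eigvals(W)))
    if radius > 0:
        W *= spectral_radius / radius
    return W

def run_reservoir(W, W_in, u, leak=0.3):
    """Drive the reservoir with input sequence u (T x d_in); return the state trajectory (T x n)."""
    n = W.shape[0]
    x = np.zeros(n)
    states = np.zeros((len(u), n))
    for t, u_t in enumerate(u):
        x = (1 - leak) * x + leak * np.tanh(W @ x + W_in @ u_t)
        states[t] = x
    return states

def train_readout(states, targets, ridge=1e-6):
    """Fit the linear readout by ridge regression; all other weights stay fixed."""
    n = states.shape[1]
    return np.linalg.solve(states.T @ states + ridge * np.eye(n), states.T @ targets)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    n, T = 200, 1000
    u = np.sin(0.2 * np.arange(T))[:, None]      # toy input signal
    target = np.roll(u, 5, axis=0)               # toy memory task: recall the input 5 steps back
    for topo in ("ring", "lattice", "random"):
        W = make_reservoir(n, topology=topo)
        W_in = rng.uniform(-0.5, 0.5, size=(n, 1))
        X = run_reservoir(W, W_in, u)
        W_out = train_readout(X[100:], target[100:])   # discard the initial transient
        mse = np.mean((X[100:] @ W_out - target[100:]) ** 2)
        print(f"{topo:>7s} reservoir, toy memory task MSE: {mse:.4f}")

Swapping the topology argument is the only change needed to compare architectures, which mirrors the experimental design described above; task-specific inputs and readout targets would replace the toy delayed-recall signal.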
Acknowledgements
The first author would like to thank the Faculty of Engineering, University of Bristol, for a visiting scholarship and the Northern Ireland Department for the Economy for a PhD studentship.
Copyright information
© 2024 The Author(s), under exclusive license to Springer Nature Switzerland AG
About this paper
Cite this paper
McAllister, J., Wade, J., Houghton, C., O’Donnell, C. (2024). Topological and Simplicial Features in Reservoir Computing Networks. In: Zheng, H., Glass, D., Mulvenna, M., Liu, J., Wang, H. (eds) Advances in Computational Intelligence Systems. UKCI 2024. Advances in Intelligent Systems and Computing, vol 1462. Springer, Cham. https://doi.org/10.1007/978-3-031-78857-4_5
DOI: https://doi.org/10.1007/978-3-031-78857-4_5
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-78856-7
Online ISBN: 978-3-031-78857-4
eBook Packages: Intelligent Technologies and Robotics (R0)