Abstract
In echo state networks, both the reservoir states and the network structure are essential to the performance of reservoir computing. In neuroscience, it has been confirmed that a single neuron can adaptively change its intrinsic excitability to fit varying synaptic inputs, a mechanism known in the literature as intrinsic plasticity (IP). This adaptive adjustment of the neuronal response to external inputs is believed to maximize input-output mutual information. Meanwhile, the existence of multi-clustered structures with small-world-like properties in the brain has been strongly supported by many neurophysiological experiments. It is therefore advisable to consider both intrinsic plasticity and a multi-clustered structure for the reservoir network, rather than a random network with a non-adaptive reservoir response. In this paper, reservoir models combining neuronal intrinsic plasticity with multi-clustered structures are investigated. The effects of two types of IP rule on several computational tasks are examined in detail. The first is Triesch's IP rule, which drives the output activities of neurons toward an exponential distribution; the second is Li's IP rule, which generates a Gaussian distribution of neuronal firing. Results show that both multi-clustered structures and IP rules can improve the computational accuracy of reservoir computing. However, without IP learning, the performance gain from multi-clustered reservoirs alone is minor. Both IP rules contribute to improved computational performance, with Li's rule proving more advantageous than Triesch's. The results indicate that combining multi-clustered reservoir structures with IP learning increases the dynamic diversity of the reservoir states, especially through IP learning. The adaptive tuning of reservoir states by IP enriches the dynamic complexity of neuronal activity, which in turn facilitates training of the output weights. This biologically inspired reservoir model may offer insights for the optimization of reservoir computing.
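As a concrete illustration of the mechanism described above, the following sketch applies a Triesch-type IP rule inside a small echo state reservoir. It is a minimal sketch under stated assumptions, not the authors' implementation: each logistic (fermi) neuron is given a gain a and bias b that are adapted online so that its output distribution approaches an exponential with mean mu. All parameter values, the random (non-clustered) reservoir topology, and the toy input are assumptions introduced only for illustration.

import numpy as np

# Minimal illustrative sketch of intrinsic plasticity (IP) inside an echo state
# reservoir, assuming logistic (fermi) neurons and a Triesch-type gradient rule
# toward an exponential firing-rate distribution with mean mu. Reservoir size,
# spectral radius, learning rate, target mean, and the toy input are assumed
# values, not values from the paper.

rng = np.random.default_rng(0)

N = 100                                      # reservoir size (assumed)
W_in = rng.uniform(-0.5, 0.5, (N, 1))        # input weights
W = rng.uniform(-0.5, 0.5, (N, N))           # recurrent weights (random, not multi-clustered)
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # rescale to spectral radius 0.9

a = np.ones(N)                               # per-neuron gains, adapted by IP
b = np.zeros(N)                              # per-neuron biases, adapted by IP
eta, mu = 1e-3, 0.2                          # IP learning rate and target mean firing rate

def ip_step(x_net, y):
    """Triesch-type IP gradient for fermi neurons and an exponential target."""
    db = eta * (1.0 - (2.0 + 1.0 / mu) * y + (y ** 2) / mu)
    da = eta / a + db * x_net
    return da, db

y = np.zeros(N)
u = np.sin(np.arange(500) / 10.0)            # toy one-dimensional input sequence
states = []
for t in range(len(u)):
    x_net = W_in[:, 0] * u[t] + W @ y        # net synaptic input to each neuron
    y = 1.0 / (1.0 + np.exp(-(a * x_net + b)))   # fermi activation with gain and bias
    da, db = ip_step(x_net, y)
    a, b = a + da, b + db                    # online IP update of gain and bias
    states.append(y.copy())

# After IP pre-training, a linear read-out would be fitted to `states`
# (e.g., by ridge regression), as is standard in reservoir computing.

Li's IP rule would replace the gradient in ip_step with one that drives the firing rates toward a Gaussian target, and the multi-clustered reservoirs studied in the paper would replace the uniform random W with a clustered, small-world-like connectivity matrix; neither variant is reproduced in this sketch.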

References
Jaeger H. A tutorial on training recurrent neural networks, covering BPTT, RTRL, EKF and the echo state network approach. Technical Report GMD Report 159, German National Research Center for Information Technology; 2002.
Jaeger H, Haas H. Harnessing nonlinearity: predicting chaotic systems and saving energy in wireless communication. Science 2004;304(5667):78–80.
Li D, Han M, Wang J. Chaotic time series prediction based on a novel robust echo state network. IEEE Trans Neural Netw Learn Syst 2012;23(5):787–99.
Jaeger H. Adaptive nonlinear system identification with echo state networks. Advances in neural information processing systems; 2004. p. 78–80.
Skowronski M D, Harris JG. Noise-robust automatic speech recognition using a predictive echo state network. IEEE Trans Audio Speech Lang Process 2007;15(5):1724–1730.
Skowronski M D, Harris JG. Minimum mean squared error time series classification using an echo state network prediction model. In: IEEE international symposium on circuits and systems; 2006. p. 3153–3156.
Scardapane S, Uncini A. Semi-supervised echo state networks for audio classification. Cogn Comput 2016:1–11.
Lin X, Yang Z, Song Y. Short-term stock price prediction based on echo state networks. Expert Syst Appl 2009;36(3):7313–7317.
Meftah B, Lézoray O, Benyettou A. A novel approach using echo state networks for microscopic cellular image segmentation. Cogn Comput 2016;8(2):1–9.
Tong M H, Bicket A D, Christiansen E M, Cottrell GW. Clustered complex echo state networks for traffic forecasting with prior knowledge. In: Instrumentation and Measurement Technology Conference (I2MTC); 2007. p. 1–5.
Jaeger H, Lukoševičius M, Popovici D, Siewert U. Optimization and applications of echo state networks with leaky-integrator neurons. Neural Netw 2007;20(3):335–52.
Najibi E, Rostami H. SCESN, SPESN, SWESN: three recurrent neural echo state networks with clustered reservoirs for prediction of nonlinear and chaotic time series. Appl Intell 2015;43(2):460–72.
Liebald B. Exploration of effects of different network topologies on the ESN signal cross-correlation matrix spectrum. International University Bremen; 2004.
Gallicchio C, Micheli A. Tree echo state networks. Neurocomputing 2013;101(3):319–37.
Song Q S, Feng Z R. Effects of connectivity structure of complex echo state network on its prediction performance for nonlinear time series. Neurocomputing 2010;73(10–12):2177–85.
Gao Z K, Jin ND. A directed weighted complex network for characterizing chaotic dynamics from time series. Nonlinear Anal Real World Appl 2012;13:947–52.
Gao Z K, Jin ND. Complex network from time series based on phase space reconstruction. Chaos Interdisc J Nonlinear Sci 2009;19(3):375–93.
Deng Z D, Zhang Y. Collective behavior of a small-world recurrent neural system with scale-free distribution. IEEE Trans Neural Netw 2007;18(5):1364–75.
Ma Q L, Chen W B. Modular state space of echo state network. Neurocomputing 2013;122:406–17.
Yang B, Deng ZD. An extended SHESN with leaky integrator neuron and inhibitory connection for Mackey-Glass prediction. Front Electr Electron Eng 2012;7(2):200–07.
Li X M, Zhong L, Xue FZ. A priori data-driven multi-clustered reservoir generation algorithm for echo state network. PLOS ONE 2015;10(4):e0120750.
Turrigiano G G, Nelson SB. Homeostatic plasticity in the developing nervous system. Nat Rev Neurosci 2004;5 (2):97–107.
Kourrich S, Calu D J, Bonci A. Intrinsic plasticity: an emerging player in addiction. Nat Rev Neurosci 2015;16(3):173–84.
Watt A J, Desai N S. Homeostatic plasticity and STDP: keeping a neuron's cool in a fluctuating world. Front Synaptic Neurosci 2010;2:5.
Triesch J. A gradient rule for the plasticity of a neuron's intrinsic excitability. In: Artificial neural networks: biological inspirations - ICANN 2005; 2005. p. 65–70.
Steil J J. Online reservoir adaptation by intrinsic plasticity for backpropagation-decorrelation and echo state learning. Neural Netw 2007;20(3):353–64.
Li C G. A model of neuronal intrinsic plasticity. IEEE Trans Auton Ment Dev 2011;3(4):277–84.
Koprinkova-Hristova P. On effects of IP improvement of ESN reservoirs for reflecting of data structure. In: International joint conference on neural networks. IEEE; 2015. p. 1–7.
Koprinkova-Hristova P. On-line training of ESN and IP tuning effect. Lect Notes Comput Sci 2014;8681:25–32.
Nisbach F, Kaiser M. Developmental time windows for spatial growth generate multiple-cluster small-world networks. Eur Phys J B 2007;58:185–91.
Maass W. Lower bounds for the computational power of networks of spiking neurons. Neural Comput 1996;8(1):1–40.
Baddeley R, Abbott L F, Booth M C, Sengpiel F, Freeman T. Responses of neurons in primary and inferior temporal visual cortices to natural scenes. Proc Biol Sci 1997;264:1775–83.
Jaeger H. Reservoir riddles: suggestions for echo state network research. In: Proceedings of the IEEE international joint conference on neural networks, IJCNN'05; 2005. vol 3, p. 1460–1462.
Schrauwen B, Wardermann M, Verstraeten D, et al. Improving reservoirs using intrinsic plasticity. Neurocomputing 2008;71(7–9):1159–71.
Ethics declarations
Funding
This work was funded by the National Natural Science Foundation of China (No. 61473051) and the Natural Science Foundation of Chongqing (No. cstc2016jcyjA0015).
Conflict of Interest
The authors declare that they have no conflict of interest.
Ethical Approval
This article does not contain any studies with human participants or animals performed by any of the authors.
Informed Consent
Informed consent was obtained from all individual participants included in the study.
About this article
Cite this article
Xue, F., Li, Q., Zhou, H. et al. Reservoir Computing with Both Neuronal Intrinsic Plasticity and Multi-Clustered Structure. Cogn Comput 9, 400–410 (2017). https://doi.org/10.1007/s12559-017-9467-3
Received:
Accepted:
Published:
Issue Date:
DOI: https://doi.org/10.1007/s12559-017-9467-3