Reservoir Computing with Both Neuronal Intrinsic Plasticity and Multi-Clustered Structure

Published in: Cognitive Computation

Abstract

In echo state networks, both the reservoir states and the network structure are essential to the performance of reservoir computing. In neuroscience, it has been confirmed that a single neuron can adaptively change its intrinsic excitability to fit varying synaptic input; this mechanism is known in the literature as intrinsic plasticity (IP). Such adaptive adjustment of the neuronal response to external inputs is believed to maximize input-output mutual information. Meanwhile, many neurophysiological experiments strongly support the existence of a multi-clustered structure with small-world-like properties in the brain. It is therefore advisable to consider both intrinsic plasticity and a multi-clustered reservoir structure, rather than a random network with a non-adaptive reservoir response. In this paper, reservoir models with neuronal intrinsic plasticity and multi-clustered structure are investigated. The effects of two types of IP rule on several computational tasks are examined in detail by combining neuronal IP with multi-clustered reservoir structures. The first is Triesch's IP rule, which drives the output activities of neurons toward an exponential distribution; the second is Li's IP rule, which generates a Gaussian distribution of neuronal firing. Results show that both multi-clustered structures and IP rules can improve the computational accuracy of reservoir computing. Without IP, however, the performance gain from multi-clustered reservoirs alone is minor. Both IP rules improve computational performance, with Li's rule proving more advantageous than Triesch's. These results indicate that combining multi-clustered reservoir structures with IP learning increases the dynamic diversity of reservoir states, especially through IP learning. The adaptive tuning of reservoir states by IP enriches the dynamic complexity of neuronal activity, which in turn facilitates training of the output weights. This biologically inspired reservoir model may offer insights for the optimization of reservoir computing.
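The two IP rules discussed in the abstract adapt each reservoir neuron's gain and bias online. As a minimal sketch (not the authors' implementation), the following applies Triesch's gradient rule to sigmoid reservoir neurons so that their firing rates approach an exponential distribution with a chosen target mean; the reservoir size, learning rate, target mean, and toy input signal are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

N = 100       # reservoir size (illustrative)
mu = 0.2      # target mean of the approximately exponential firing distribution
eta = 1e-3    # IP learning rate (illustrative)

# Random recurrent weights, rescaled so the spectral radius is below 1
# (a standard echo state property condition).
W = rng.normal(0.0, 1.0, (N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))
W_in = rng.uniform(-0.5, 0.5, N)

a = np.ones(N)     # per-neuron gain, adapted by IP
b = np.zeros(N)    # per-neuron bias, adapted by IP
y = np.zeros(N)    # reservoir state

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

for t in range(2000):
    u = np.sin(t / 10.0)          # toy scalar input signal
    x = W @ y + W_in * u          # net input to each neuron
    y = sigmoid(a * x + b)
    # Triesch's online gradient rule for sigmoid neurons: pushes the
    # firing-rate distribution toward an exponential with mean mu.
    db = eta * (1.0 - (2.0 + 1.0 / mu) * y + y * y / mu)
    a += eta / a + db * x
    b += db

print(f"mean firing rate after IP: {y.mean():.3f}")
```

Li's IP rule would replace the `db` update with a gradient step toward a Gaussian target distribution; the reservoir update loop is otherwise unchanged.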


References

  1. Jaeger H. A tutorial on training recurrent neural networks, covering BPTT, RTRL, EKF and the echo state network approach. Technical Report GMD Report 159, German National Research Center for Information Technology; 2002.

  2. Jaeger H, Haas H. Harnessing nonlinearity: predicting chaotic systems and saving energy in wireless communication. Science 2004;304(5667):78–80.

  3. Li D, Han M, Wang J. Chaotic time series prediction based on a novel robust echo state network. IEEE Trans Neural Netw Learn Syst 2012;23(5):787–99.

  4. Jaeger H. Adaptive nonlinear system identification with echo state networks. Advances in neural information processing systems; 2004. p. 78–80.

  5. Skowronski M D, Harris JG. Noise-robust automatic speech recognition using a predictive echo state network. IEEE Trans Audio Speech Lang Process 2007;15(5):1724–1730.

  6. Skowronski M D, Harris JG. Minimum mean squared error time series classification using an echo state network prediction model. In: IEEE International symposium on circuit system; 2006. p. 3153–3156.

  7. Scardapane S, Uncini A. Semi-supervised echo state networks for audio classification. Cogn Comput; 2016. p. 1–11.

  8. Lin X, Yang Z, Song Y. Short-term stock price prediction based on echo state networks. Expert Syst Appl 2009;36(3):7313–17.

  9. Meftah B, Lezoray O, Benyettou A. A novel approach using echo state networks for microscopic cellular image segmentation. Cogn Comput 2016;8(2):1–9.

  10. Tong M H, Bicket A D, Christiansen E M, Cottrell GW. Clustered complex echo state networks for traffic forecasting with prior knowledge, instrumentation and measurement technology conference, I2MTC; 2007. p. 1–5.

  11. Jaeger H, Lukoševičius M, Popovici D, Siewert U. Optimization and applications of echo state networks with leaky-integrator neurons. Neural Netw 2007;20(3):335–52.

  12. Najibi E, Rostami H. SCESN, SPESN, SWESN: three recurrent neural echo state networks with clustered reservoirs for prediction of nonlinear and chaotic time series. Appl Intell 2015;43(2):460–72.

  13. Liebald B. Exploration of effects of different network topologies on the ESN signal crosscorrelation matrix spectrum. International University Bremen. 2004.

  14. Gallicchio C, Micheli A. Tree echo state networks. Neurocomputing 2013;101(3):319–37.

  15. Song Q S, Feng ZR. Effects of connectivity structure of complex echo state network on its prediction performance for nonlinear time series. Neurocomputing 2010;73:2177–85.

  16. Gao Z K, Jin ND. A directed weighted complex network for characterizing chaotic dynamics from time series. Nonlinear Anal Real World Appl 2012;13:947–52.

  17. Gao Z K, Jin ND. Complex network from time series based on phase space reconstruction. Chaos Interdisc J Nonlinear Sci 2009;19(3):375–93.

  18. Deng Z D, Zhang Y. Collective behavior of a small-world recurrent neural system with scale-free distribution. IEEE Trans Neural Netw 2006;18(5):1364–75.

  19. Ma Q L, Chen WB. Modular state space of echo state network. Neurocomputing 2013;122(122):406–17.

  20. Deng Z D, Zhang Y. Collective behavior of a small-world recurrent neural system with scale-free distribution. IEEE Trans Neural Netw 2012;18(5):1364–75.

  21. Yang B, Deng ZD. An extended SHESN with leaky integrator neuron and inhibitory connection for Mackey-Glass prediction. Front Electr Electron Eng 2012;7(2):200–07.

  22. Song Q, Feng Z. Effects of connectivity structure of complex echo state network on its prediction performance for nonlinear time series. Neurocomputing 2010;73(10–12):2177–85.

  23. Li X M, Zhong L, Xue FZ. A priori data-driven multi-clustered reservoir generation algorithm for echo state network. PLOS ONE 2015;10(4):e0120750.

  24. Turrigiano GG, Nelson SB. Homeostatic plasticity in the developing nervous system. Nat Rev Neurosci 2004;5(2):97–107.

  25. Kourrich S, Calu D J, Bonci A. Intrinsic plasticity: an emerging player in addiction. Nat Rev Neurosci 2015;16(3):173–84.

  26. Watt A J, Han NS. Homeostatic plasticity and STDP: keeping a neuron’s cool in a fluctuating world. Front Synaptic Neurosci. 2010;2(5).

  27. Triesch J. A gradient rule for the plasticity of a neuron's intrinsic excitability. In: Artificial Neural Networks: Biological Inspirations, ICANN; 2005. p. 65–70.

  28. Steil JJ. Online reservoir adaptation by intrinsic plasticity for backpropagation-decorrelation and echo state learning. Neural Netw Off J Int Neural Netw Soc 2007;20(3):353–64.

  29. Li C G. A model of neuronal intrinsic plasticity. IEEE Trans Auton Ment Dev 2011;3(4):277–84.

  30. Koprinkova-Hristova P. On effects of IP improvement of ESN reservoirs for reflecting of data structure. In: International joint conference on neural networks. IEEE; 2015. p. 1–7.

  31. Koprinkova-Hristova P. On-line training of ESN and IP tuning effect. Lect Notes Comput Sci 2014;8681:25–32.

  32. Nisbach F, Kaiser M. Developmental time windows for spatial growth generate multiple-cluster small-world networks. Eur Phys J 2007;23:185–91.

  33. Maass W. Lower bounds for the computational power of networks of spiking neurons. Neural Comput 1996;8(1):1–40.

  34. Baddeley R, Abbott LF, Booth MC, Sengpiel F, Freeman T. Responses of neurons in primary and inferior temporal visual cortices to natural scenes. Proc Biol Sci 1997;264:1775–83.

  35. Jaeger H. Reservoir riddles: suggestions for echo state network research. In: Proceedings of the IEEE international joint conference on neural networks, IJCNN'05; 2005. vol 3, p. 1460–1462.

  36. Schrauwen B, Wardermann M, Verstraeten D, et al. Improving reservoirs using intrinsic plasticity. Neurocomputing 2008;71(7–9):1159–71.

Author information

Corresponding author

Correspondence to Xiumin Li.

Ethics declarations

Funding

This work was funded by the National Natural Science Foundation of China (No. 61473051) and the Natural Science Foundation of Chongqing (No. cstc2016jcyjA0015).

Conflict of Interest

The authors declare that they have no conflict of interest.

Ethical Approval

This article does not contain any studies with human participants or animals performed by any of the authors.

Informed Consent

Informed consent was obtained from all individual participants included in the study.

About this article

Cite this article

Xue, F., Li, Q., Zhou, H. et al. Reservoir Computing with Both Neuronal Intrinsic Plasticity and Multi-Clustered Structure. Cogn Comput 9, 400–410 (2017). https://doi.org/10.1007/s12559-017-9467-3
