Evolutionary Optimization of Liquid State Machines for Robust Learning

  • Conference paper
Advances in Neural Networks – ISNN 2019 (ISNN 2019)

Abstract

Liquid State Machines (LSMs) are a computational model of spiking neural networks with recurrent connections in a reservoir. Although they are believed to be more biologically plausible, LSMs have not yet been as successful as other artificial neural networks in solving real-world learning problems, mainly because their learning performance is highly sensitive to the type of stimuli. To address this issue, this paper adopts a covariance matrix adaptation evolution strategy to optimize the topology and parameters of the LSM, thereby sparing the arduous task of fine-tuning the LSM's parameters for each new task. The performance of the evolved LSM is demonstrated on three complex real-world pattern classification problems, including image recognition and spatio-temporal classification.
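The optimization loop the abstract describes can be sketched as follows. This is a minimal illustration, not the authors' implementation: it uses a simplified (μ, λ) evolution strategy with per-parameter step-size adaptation as a stand-in for full CMA-ES, and the fitness function, parameter names, and `TARGET` values are all hypothetical. In the paper, fitness would instead be the LSM's classification performance on a validation set.

```python
import random

# Hypothetical optimum for three illustrative LSM hyperparameters:
# (connection probability, synaptic weight scale, membrane time constant).
TARGET = [0.2, 1.5, 30.0]

def fitness(params):
    # Lower is better: squared distance from the hypothetical optimum.
    # Stands in for (negative) validation accuracy of the resulting LSM.
    return sum((p - t) ** 2 for p, t in zip(params, TARGET))

def evolve(generations=200, lam=20, mu=5, seed=0):
    rng = random.Random(seed)
    mean = [0.5, 0.5, 10.0]    # initial parameter guess
    sigma = [0.3, 0.5, 10.0]   # per-parameter mutation step sizes
    for _ in range(generations):
        # Sample lambda offspring around the current mean.
        pop = sorted(
            ([m + s * rng.gauss(0, 1) for m, s in zip(mean, sigma)]
             for _ in range(lam)),
            key=fitness,
        )
        elite = pop[:mu]
        # Recombination: the new mean is the average of the mu best.
        mean = [sum(e[i] for e in elite) / mu for i in range(len(mean))]
        # Crude step-size adaptation: track the elite's spread per
        # dimension (full CMA-ES adapts a complete covariance matrix,
        # capturing correlations between parameters as well).
        for i in range(len(sigma)):
            var = sum((e[i] - mean[i]) ** 2 for e in elite) / mu
            sigma[i] = max(0.9 * sigma[i], var ** 0.5)
    return mean

best = evolve()
print(fitness(best))  # should end up far below the initial guess's fitness
```

Each fitness evaluation in the real setting requires building and training an LSM with the candidate hyperparameters, which is why the paper's approach pays off: the strategy automates a search that would otherwise be done by hand for every new task.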

This work was supported by the National Natural Science Foundation of China under Grants 61525302 and 61590922, the National Key Research and Development Program of China under Grant 2018YFB1701104, the Project of Ministry of Industry and Information Technology of China under Grant 20171122-6, and the Fundamental Research Funds for the Central Universities under Grants N160801001 and N161608001.



Author information

Correspondence to Yaochu Jin.



Copyright information

© 2019 Springer Nature Switzerland AG

About this paper


Cite this paper

Zhou, Y., Jin, Y., Ding, J. (2019). Evolutionary Optimization of Liquid State Machines for Robust Learning. In: Lu, H., Tang, H., Wang, Z. (eds) Advances in Neural Networks – ISNN 2019. ISNN 2019. Lecture Notes in Computer Science, vol 11554. Springer, Cham. https://doi.org/10.1007/978-3-030-22796-8_41

Download citation

  • DOI: https://doi.org/10.1007/978-3-030-22796-8_41

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-22795-1

  • Online ISBN: 978-3-030-22796-8

  • eBook Packages: Computer Science (R0)
