Abstract
Liquid State Machines (LSMs) are a computational model of spiking neural networks with recurrent connections forming a reservoir. Although LSMs are believed to be more biologically plausible, they have not yet been as successful as other artificial neural networks on real-world learning problems, mainly because their learning performance is highly sensitive to different types of stimuli. To address this issue, this paper adopts a covariance matrix adaptation evolution strategy (CMA-ES) to optimize the topology and parameters of the LSM, thereby sparing the arduous task of fine-tuning the LSM's parameters for each new task. The performance of the evolved LSM is demonstrated on three complex real-world pattern classification problems, including image recognition and spatio-temporal classification.
This work was supported by the National Natural Science Foundation of China under Grants 61525302 and 61590922, the National Key Research and Development Program of China under Grant 2018YFB1701104, the Project of the Ministry of Industry and Information Technology of China under Grant 20171122-6, and the Fundamental Research Funds for the Central Universities under Grants N160801001 and N161608001.
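The evolutionary optimizer the abstract refers to is Hansen and Ostermeier's CMA-ES, which maintains a multivariate Gaussian search distribution and adapts its mean, covariance matrix, and step size from the best-ranked offspring. The sketch below is a minimal, self-contained CMA-ES in NumPy following the standard formulation, not the authors' exact implementation; the quadratic fitness is only a stand-in, since in the actual method each candidate vector would encode the LSM's topology and neuron/synapse parameters and be scored by training and evaluating the corresponding reservoir.

```python
import numpy as np

def cma_es(fitness, x0, sigma0=0.5, generations=200, seed=0):
    """Minimal CMA-ES minimizing `fitness` (after Hansen & Ostermeier, 2001)."""
    rng = np.random.default_rng(seed)
    n = len(x0)
    lam = 4 + int(3 * np.log(n))                  # offspring population size
    mu = lam // 2                                 # number of selected parents
    w = np.log(mu + 0.5) - np.log(np.arange(1, mu + 1))
    w /= w.sum()                                  # recombination weights
    mueff = 1.0 / np.sum(w**2)                    # variance-effective selection mass
    # Standard strategy-parameter settings.
    cc = (4 + mueff / n) / (n + 4 + 2 * mueff / n)
    cs = (mueff + 2) / (n + mueff + 5)
    c1 = 2 / ((n + 1.3) ** 2 + mueff)
    cmu = min(1 - c1, 2 * (mueff - 2 + 1 / mueff) / ((n + 2) ** 2 + mueff))
    ds = 1 + 2 * max(0.0, np.sqrt((mueff - 1) / (n + 1)) - 1) + cs
    chin = np.sqrt(n) * (1 - 1 / (4 * n) + 1 / (21 * n**2))

    m = np.asarray(x0, dtype=float)               # distribution mean
    sigma = sigma0                                # global step size
    C = np.eye(n)                                 # covariance matrix
    pc, ps = np.zeros(n), np.zeros(n)             # evolution paths for C and sigma
    for g in range(1, generations + 1):
        D2, B = np.linalg.eigh(C)
        D = np.sqrt(np.maximum(D2, 1e-20))
        z = rng.standard_normal((lam, n))
        y = (z * D) @ B.T                         # rows are samples from N(0, C)
        xs = m + sigma * y
        order = np.argsort([fitness(x) for x in xs])
        yw = w @ y[order[:mu]]                    # weighted recombination
        m = m + sigma * yw
        # Step-size control via the conjugate evolution path.
        ps = (1 - cs) * ps + np.sqrt(cs * (2 - cs) * mueff) * (B @ ((B.T @ yw) / D))
        hsig = (np.linalg.norm(ps) / np.sqrt(1 - (1 - cs) ** (2 * g)) / chin
                < 1.4 + 2 / (n + 1))
        pc = (1 - cc) * pc + hsig * np.sqrt(cc * (2 - cc) * mueff) * yw
        # Rank-one plus rank-mu covariance matrix update.
        ym = y[order[:mu]]
        C = ((1 - c1 - cmu) * C
             + c1 * (np.outer(pc, pc) + (1 - hsig) * cc * (2 - cc) * C)
             + cmu * (ym * w[:, None]).T @ ym)
        sigma *= np.exp((cs / ds) * (np.linalg.norm(ps) / chin - 1))
    return m

# Stand-in fitness: a shifted quadratic playing the role of the LSM's
# validation error over a 4-dimensional hyperparameter vector.
best = cma_es(lambda x: np.sum((x - 0.3) ** 2), x0=np.ones(4))
```

In the paper's setting, the decoded vector would configure the reservoir (e.g. connectivity and neuron parameters) before readout training; only the fitness function changes, the optimizer loop stays the same.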
Copyright information
© 2019 Springer Nature Switzerland AG
About this paper
Cite this paper
Zhou, Y., Jin, Y., Ding, J. (2019). Evolutionary Optimization of Liquid State Machines for Robust Learning. In: Lu, H., Tang, H., Wang, Z. (eds.) Advances in Neural Networks – ISNN 2019. Lecture Notes in Computer Science, vol. 11554. Springer, Cham. https://doi.org/10.1007/978-3-030-22796-8_41
DOI: https://doi.org/10.1007/978-3-030-22796-8_41
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-22795-1
Online ISBN: 978-3-030-22796-8
eBook Packages: Computer Science (R0)