Abstract
Developing models that match biological observations more closely has long been a central challenge in the field of artificial neural networks. Recently, an ionic model of reservoir networks containing spiking neurons (the ILS-based reservoir network) has been proposed that appears to replicate several biological processes observed to date. This paper presents a local learning rule for the ILS-based reservoir, inspired by the biological observation that each incoming stimulus triggers the formation of new dendritic spines, producing new synapses. This property can yield a higher degree of neuroplasticity and, in turn, a greater learning capacity. To evaluate the proposed learning rule, which takes structural plasticity into account, several experiments were designed that demonstrate its stability and its effectiveness with respect to the separation property, classification accuracy, and reliability. Several benchmark cases are also discussed; their results, measured by separation and classification evaluation metrics, demonstrate the superior performance of the ILS-based reservoir network trained with the proposed learning rule compared to the same network left untrained.
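The ILS-based reservoir model and the proposed learning rule are not reproduced here. As an illustration of the general setting the abstract describes, the sketch below runs a generic leaky integrate-and-fire (LIF) spiking reservoir and computes a Maass-style separation measure, i.e., the distance between the mean reservoir states evoked by two input classes. All sizes, weights, and parameters (`N_RES`, `tau`, `v_th`, the input firing rates) are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes and random weights -- placeholders, not values from the paper.
N_IN, N_RES, T = 4, 50, 100
w_in = rng.normal(0.0, 0.5, (N_RES, N_IN))    # input-to-reservoir weights
w_res = rng.normal(0.0, 0.1, (N_RES, N_RES))  # recurrent reservoir weights

def liquid_state(spikes_in, tau=20.0, v_th=1.0):
    """Run a simple LIF reservoir on an input spike train and return a
    low-pass-filtered spike trace as the readout ('liquid') state."""
    v = np.zeros(N_RES)       # membrane potentials
    prev = np.zeros(N_RES)    # spikes emitted at the previous time step
    trace = np.zeros(N_RES)   # filtered spiking activity
    decay = np.exp(-1.0 / tau)
    for t in range(spikes_in.shape[0]):
        v = decay * v + w_in @ spikes_in[t] + w_res @ prev
        prev = (v >= v_th).astype(float)
        v[prev > 0] = 0.0     # reset neurons that fired
        trace = decay * trace + prev
    return trace

def separation(states_a, states_b):
    """Separation between two classes: distance of their mean reservoir states."""
    return float(np.linalg.norm(states_a.mean(axis=0) - states_b.mean(axis=0)))

# Two synthetic input 'classes' distinguished only by firing rate.
class_a = [(rng.random((T, N_IN)) < 0.1).astype(float) for _ in range(5)]
class_b = [(rng.random((T, N_IN)) < 0.4).astype(float) for _ in range(5)]
sep = separation(np.array([liquid_state(s) for s in class_a]),
                 np.array([liquid_state(s) for s in class_b]))
```

A larger separation value indicates that the reservoir maps the two input classes to more distinguishable states, which is the property the paper's learning rule is evaluated against.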
Ethics declarations
Conflict of interest
The authors declare that no conflict of interest exists in the publication of this article.
Cite this article
Iranmehr, E., Shouraki, S.B. & Faraji, M. Developing a structural-based local learning rule for classification tasks using ionic liquid space-based reservoir. Neural Comput & Applic 34, 15075–15093 (2022). https://doi.org/10.1007/s00521-022-07345-8