
Developing a structural-based local learning rule for classification tasks using ionic liquid space-based reservoir

  • Original Article
  • Published in Neural Computing and Applications

Abstract

Developing models that match biological observations more closely has long been a central challenge in the field of artificial neural networks. Recently, an ionic model of reservoir networks of spiking neurons (the ILS-based reservoir network) was proposed that appears to replicate several biological processes observed to date. This paper presents a local learning rule for the ILS-based reservoir, inspired by the biological observation that each incoming stimulus triggers the formation of new dendritic spines and thereby new synapses. This property may yield a higher degree of neuroplasticity and, in turn, a greater learning capacity. To evaluate the proposed learning rule, which incorporates structural plasticity, several experiments were designed that demonstrate its stability and its effectiveness with respect to the separation property, classification accuracy, and reliability. Several benchmark cases are also discussed; their results, measured by separation and classification metrics, show that the ILS-based reservoir network trained with the proposed learning rule outperforms an otherwise identical untrained network.
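To make the structural-plasticity idea concrete, the following is a minimal, hypothetical sketch of a stimulus-driven synapse-growth rule: each active input has some probability of growing a new synapse (a new dendritic spine) onto reservoir neurons it does not yet contact. The constants `GROWTH_P` and `W_NEW`, and the rule itself, are illustrative assumptions and not the paper's actual learning rule.

```python
import numpy as np

rng = np.random.default_rng(0)

N_IN, N_RES = 4, 10   # input channels, reservoir neurons
GROWTH_P = 0.3        # chance a stimulus spawns a new spine (hypothetical)
W_NEW = 0.1           # initial weight of a newly formed synapse (hypothetical)

# Input-to-reservoir weight matrix; 0 means "no spine (no synapse) yet".
W = np.zeros((N_RES, N_IN))

def structural_update(W, input_spikes):
    """Hypothetical structural rule: each active input may grow a new
    synapse onto any reservoir neuron it does not yet contact."""
    for j in np.flatnonzero(input_spikes):       # active input channels
        for i in range(N_RES):
            if W[i, j] == 0.0 and rng.random() < GROWTH_P:
                W[i, j] = W_NEW                  # new spine -> new synapse
    return W

# Drive the network with a few random spike patterns; connectivity grows
# with stimulation, which is the intuition behind the proposed rule.
for _ in range(20):
    spikes = (rng.random(N_IN) < 0.5).astype(int)
    W = structural_update(W, spikes)

print(f"synapses formed: {np.count_nonzero(W)} / {W.size}")
```

In this toy version, repeated stimulation monotonically densifies the input connectivity; the paper's actual rule additionally ties the process to the ionic-liquid-space dynamics and to the classification objective.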




Author information

Corresponding author

Correspondence to Ensieh Iranmehr.

Ethics declarations

Conflict of interest

The authors declare that no conflict of interest exists in the publication of this article.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Iranmehr, E., Shouraki, S.B. & Faraji, M. Developing a structural-based local learning rule for classification tasks using ionic liquid space-based reservoir. Neural Comput & Applic 34, 15075–15093 (2022). https://doi.org/10.1007/s00521-022-07345-8

