
Quantized STDP-based online-learning spiking neural network

  • Original Article
  • Published in Neural Computing and Applications

Abstract

In this work, we report a spike-timing-dependent plasticity (STDP)-based weight-quantized/binarized online-learning spiking neural network (SNN). The SNN uses bio-plausible integrate-and-fire (IF) neurons and conductance-based synapses as its basic building blocks and realizes online learning through STDP combined with a winner-take-all (WTA) mechanism. Weight quantization/binarization is introduced into the online-learning SNN to reduce storage requirements and improve computing efficiency. After training with STDP and weight quantization on the MNIST training set, the quantized SNN with 4-bit weights achieves a recognition accuracy of 93.8% on the MNIST test set, showing little loss compared with the non-quantized 32-bit SNN (94.1%). The accuracy of the binarized SNN decreases slightly to 92.9%, which is cost-effective given that weight storage is reduced by a factor of about 32 and that the input-weight product in the binarized SNN can be realized by a computationally cheap 1-bit AND operation. The proposed quantized/binarized online-learning scheme therefore substantially reduces hardware cost: the areas of the quantized (8-bit and 4-bit) and binarized (1-bit) SNN hardware are evaluated to be 448,524, 179,263, and 162,129 μm², respectively, far smaller than that of the non-quantized 32-bit counterpart (~5.862 × 10⁸ μm²). The hardware resource evaluation also provides a guide for trading off computational cost against performance. Moreover, the quantized/binarized STDP training method can be extended to other types of SNN: a hybrid STDP SNN and a hybrid STDP convolutional SNN, trained by combining unsupervised quantized/binarized STDP with supervised backpropagation (BP), achieve high accuracy on facial expression recognition tasks.
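The interplay of conductance-based IF dynamics, pair-based STDP, and low-bit weight quantization described above can be illustrated with a short NumPy sketch. This is a minimal illustration only: the function names, time constants, learning rates, and quantization grid below are assumptions made for exposition, not the authors' published implementation or hyper-parameters.

```python
import numpy as np

# Illustrative constants; the paper's actual hyper-parameters are not reproduced here.
W_MIN, W_MAX = 0.0, 1.0          # weight bounds
A_PLUS, A_MINUS = 0.01, 0.012    # STDP potentiation / depression amplitudes
TAU_MEM, TAU_SYN = 20e-3, 5e-3   # membrane / synaptic time constants (s)
DT = 1e-3                        # simulation time step (s)
V_THRESH, V_RESET = 1.0, 0.0     # firing threshold and reset potential

def quantize(w, bits):
    """Snap weights in [W_MIN, W_MAX] onto a uniform grid with 2**bits levels (bits=1 gives binary weights)."""
    levels = 2 ** bits - 1
    w01 = (np.clip(w, W_MIN, W_MAX) - W_MIN) / (W_MAX - W_MIN)
    return W_MIN + np.round(w01 * levels) / levels * (W_MAX - W_MIN)

def if_step(v, g, spikes_in, w):
    """One Euler step of a simplified conductance-based integrate-and-fire layer."""
    g = g * np.exp(-DT / TAU_SYN) + w @ spikes_in      # synaptic conductance driven by input spikes
    v = v + DT / TAU_MEM * (-v + g)                    # leaky integration of the membrane potential
    fired = v >= V_THRESH
    v = np.where(fired, V_RESET, v)                    # reset neurons that crossed threshold
    return v, g, fired.astype(float)

def stdp_quantized(w, pre_trace, post_trace, pre_spikes, post_spikes, bits=4):
    """Pair-based STDP update followed by on-the-fly weight quantization."""
    # Potentiate synapses whose presynaptic trace is high when the postsynaptic neuron fires;
    # depress synapses whose postsynaptic trace is high when a presynaptic spike arrives.
    dw = A_PLUS * np.outer(post_spikes, pre_trace) - A_MINUS * np.outer(post_trace, pre_spikes)
    return quantize(w + dw, bits)

def binary_integration(spikes_in, w_bin):
    """With 1-bit weights, the per-neuron input-weight product reduces to a logical AND plus a count."""
    return np.logical_and(w_bin.astype(bool), spikes_in.astype(bool)).sum(axis=1)
```

Calling quantize with bits=1 corresponds to the binarized network, in which binary_integration replaces the multiply-accumulate with the 1-bit AND operation mentioned in the abstract.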



Acknowledgements

This work was supported in part by NSFC under Projects 61771097 and 61774028 and in part by the Fundamental Research Funds for the Central Universities under Project ZYGX2016Z007.

Author information


Corresponding author

Correspondence to Q. Yu.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Hu, S.G., Qiao, G.C., Chen, T.P. et al. Quantized STDP-based online-learning spiking neural network. Neural Comput & Applic 33, 12317–12332 (2021). https://doi.org/10.1007/s00521-021-05832-y

