
High-Accuracy Spiking Neural Network for Objective Recognition Based on Proportional Attenuating Neuron

Published in: Neural Processing Letters

Abstract

Spiking neural networks (SNNs) are among the most successful methods for imitating the behavior and learning potential of the biological brain; their information-processing mechanism combines temporal and spatial dynamics. To address the performance drop that occurs when converting an artificial neural network (ANN) to an SNN, this paper proposes a proportional-attenuation leaky integrate-and-fire (RA-LIF) neuron model that mitigates the membrane-potential loss incurred by standard leaky integrate-and-fire (LIF) neurons. An SNN built from RA-LIF neurons reaches 98.76% accuracy on MNIST. This paper also proposes a weight-normalization method that adjusts the network's spiking rate to further reduce conversion loss. We evaluate and compare SNNs based on LIF, RA-LIF, and AD-LIF neurons, and, by analyzing spike firing rates and convergence speed, we discuss how spike frequency and the neuron threshold affect network performance. Previously, SNNs converted from ANNs could at best match the performance of the original network. Here, for the first time, an SNN converted with the proposed RA-LIF neuron model and weight-normalization method outperforms its source network, reaching a test accuracy of 98.88% on MNIST; converting the LeNet network with the same method yields a test accuracy of 98.91% on MNIST. In conclusion, the RA-LIF neuron model and normalization method proposed in this paper are broadly applicable and combine high precision with fast computation, providing a reference for research on SNN frameworks.
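The abstract does not give the RA-LIF update equations, but the contrast it draws — a standard LIF neuron whose hard reset discards membrane potential, versus a proportionally attenuating neuron that preserves the sub-threshold residue — can be sketched as follows. The decay constant, threshold, and soft-reset rule here are illustrative assumptions for exposition, not the paper's exact formulation:

```python
import numpy as np

def lif_step(v, i_in, leak=0.2, v_th=1.0):
    """One step of a standard leaky integrate-and-fire (LIF) neuron.

    The membrane potential loses a fixed amount each step; on a spike
    it is reset to zero, discarding any potential above threshold.
    """
    v = v + i_in - leak           # integrate input, subtract constant leak
    spike = v >= v_th
    v = np.where(spike, 0.0, v)   # hard reset: residual potential is lost
    return v, spike

def ra_lif_step(v, i_in, decay=0.9, v_th=1.0):
    """One step of a proportional-attenuation LIF-style neuron (sketch).

    The potential decays proportionally (v *= decay) instead of by a
    constant, and on a spike the threshold is subtracted rather than
    resetting to zero, so the above-threshold residue carries over.
    """
    v = decay * v + i_in
    spike = v >= v_th
    v = np.where(spike, v - v_th, v)  # soft reset: keep residual charge
    return v, spike
```

Under a constant drive of 0.5, the sketched RA-LIF neuron spikes on the third step and retains a residual potential of about 0.355 rather than resetting to zero, which is the information the hard reset in plain LIF would throw away.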



Acknowledgements

This work is supported by the National Natural Science Foundation of China (Grant Nos. 61871106 and 61370152), the Key R&D Projects of Liaoning Province, China (Grant No. 2020JH2/10100029), and the Open Project Program Foundation of the Key Laboratory of Opto-Electronics Information Processing, Chinese Academy of Sciences (OEIP-O-202002).

Author information


Corresponding author

Correspondence to Ying Wei.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


Cite this article

Niu, LY., Wei, Y., Long, JY. et al. High-Accuracy Spiking Neural Network for Objective Recognition Based on Proportional Attenuating Neuron. Neural Process Lett 54, 1055–1073 (2022). https://doi.org/10.1007/s11063-021-10669-6
