
DNM-SNN: Spiking Neural Network Based on Dual Network Model

  • Conference paper
Intelligence Science IV (ICIS 2022)

Part of the book series: IFIP Advances in Information and Communication Technology (IFIPAICT, volume 659)

Abstract

In recent years, deep neural networks (DNNs) have shown excellent performance in many applications, but their large energy consumption raises serious concerns. Spiking neural networks (SNNs) have therefore attracted extensive research attention. As the third generation of neural networks, SNNs process complex spatiotemporal data and have become a hot research topic owing to their event-driven, low-power characteristics. However, the propagation function of spiking neurons is usually non-differentiable, which prevents backpropagation and makes SNNs difficult to train. This paper proposes an efficient supervised learning framework based on a dual-network-model spiking neural network (DNM-SNN), which is applicable to various supervised SNN learning algorithms and can effectively improve prediction accuracy. DNM-SNN comprises two key methods. First, a dual-model training method is used in the training stage, which requires an auxiliary network identical to the primary network. Training a single model easily falls into local optima; by maintaining two networks, the same problem is viewed from different perspectives, which alleviates this issue and improves training. Second, a multi-channel mix-module inference method is used in the prediction stage, in which multi-channel optimization of the mix module improves the model's prediction accuracy and the overall performance of the spiking neural network. Experimental results show that DNM-SNN outperforms the corresponding single-model algorithm on classification tasks, with a slight improvement on the MNIST dataset and a 3% improvement on the CIFAR-10 dataset.
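The abstract describes the two components only at a high level, so the sketch below is an illustrative reading rather than the authors' implementation. It is a minimal PyTorch example of (a) training spiking networks with a surrogate gradient in place of the non-differentiable spike function, (b) maintaining two identical networks during training, and (c) combining their firing-rate outputs through a simple averaging "mix" at inference. The class and function names (SurrogateSpike, TinySNN, train_dual, mix_predict), the LIF parameters, the independent dual training, and the averaging mix are all assumptions made for illustration, not the paper's specification.

    # Illustrative sketch only; every design choice here is an assumption,
    # not the authors' exact DNM-SNN algorithm.
    import torch
    import torch.nn as nn


    class SurrogateSpike(torch.autograd.Function):
        """Heaviside spike in the forward pass; a smooth surrogate gradient
        (derivative of a fast sigmoid) in the backward pass, so the otherwise
        non-differentiable spiking nonlinearity can be trained with backprop."""

        @staticmethod
        def forward(ctx, v):
            ctx.save_for_backward(v)
            return (v > 0.0).float()

        @staticmethod
        def backward(ctx, grad_output):
            (v,) = ctx.saved_tensors
            return grad_output / (1.0 + 10.0 * v.abs()) ** 2


    class TinySNN(nn.Module):
        """Small fully connected LIF network that accumulates output spikes over
        T time steps and returns firing rates as class scores (assumed topology)."""

        def __init__(self, n_in=784, n_hidden=256, n_out=10, T=20,
                     decay=0.9, threshold=1.0):
            super().__init__()
            self.fc1 = nn.Linear(n_in, n_hidden)
            self.fc2 = nn.Linear(n_hidden, n_out)
            self.T, self.decay, self.threshold = T, decay, threshold

        def forward(self, x):
            x = x.view(x.size(0), -1)                   # flatten input images
            v1 = torch.zeros(x.size(0), self.fc1.out_features, device=x.device)
            v2 = torch.zeros(x.size(0), self.fc2.out_features, device=x.device)
            rate = torch.zeros_like(v2)
            for _ in range(self.T):
                inp = (torch.rand_like(x) < x).float()  # Poisson-like encoding of [0, 1] pixels
                v1 = self.decay * v1 + self.fc1(inp)
                s1 = SurrogateSpike.apply(v1 - self.threshold)
                v1 = v1 - s1 * self.threshold           # soft reset after a spike
                v2 = self.decay * v2 + self.fc2(s1)
                s2 = SurrogateSpike.apply(v2 - self.threshold)
                v2 = v2 - s2 * self.threshold
                rate = rate + s2
            return rate / self.T                        # firing-rate outputs


    def train_dual(net_a, net_b, loader, epochs=1, lr=1e-3, device="cpu"):
        """Dual-model training (assumed form): keep two identical networks so the
        optimisation views the task from two initialisations instead of being
        trapped by a single model's local optimum. Here both are trained jointly
        on the same batches; the paper's exact interaction may differ."""
        net_a.to(device)
        net_b.to(device)
        opt = torch.optim.Adam(list(net_a.parameters()) + list(net_b.parameters()), lr=lr)
        loss_fn = nn.CrossEntropyLoss()
        for _ in range(epochs):
            for x, y in loader:
                x, y = x.to(device), y.to(device)
                loss = loss_fn(net_a(x), y) + loss_fn(net_b(x), y)
                opt.zero_grad()
                loss.backward()
                opt.step()
        return net_a, net_b


    @torch.no_grad()
    def mix_predict(net_a, net_b, x):
        """Mix-module inference (assumed form): combine the firing-rate outputs
        of the two channels, here by simple averaging, and take the argmax."""
        mixed = 0.5 * (net_a(x) + net_b(x))
        return mixed.argmax(dim=1)

For example, on MNIST images scaled to [0, 1] one would construct two TinySNN instances with different random seeds, call train_dual on them, and classify test batches with mix_predict; any richer mix module (e.g., learned weighting across channels) would replace the simple average.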

Acknowledgements

This work was supported in part by the China Postdoctoral Science Foundation under Grant 2019M663637, in part by the Natural Science Basic Research Program of Shaanxi under Program 2021JQ-201, and in part by the National Natural Science Foundation of China under Grant 62104176.

Author information

Corresponding author

Correspondence to Zhen Cao.

Copyright information

© 2022 IFIP International Federation for Information Processing

About this paper

Cite this paper

Cao, Z., Zhang, H., Wang, Q., Ma, C. (2022). DNM-SNN: Spiking Neural Network Based on Dual Network Model. In: Shi, Z., Jin, Y., Zhang, X. (eds) Intelligence Science IV. ICIS 2022. IFIP Advances in Information and Communication Technology, vol 659. Springer, Cham. https://doi.org/10.1007/978-3-031-14903-0_2

  • DOI: https://doi.org/10.1007/978-3-031-14903-0_2

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-14902-3

  • Online ISBN: 978-3-031-14903-0

  • eBook Packages: Computer Science, Computer Science (R0)
