Learning High-Performance Spiking Neural Networks with Multi-Compartment Spiking Neurons

  • Conference paper
  • Image and Graphics (ICIG 2023)
  • Part of the book series: Lecture Notes in Computer Science (LNCS, volume 14356)

Abstract

In recent years, spiking neural networks (SNNs) have gained significant attention due to their bio-inspired working mechanism. VGG-like and ResNet-like architectures are widely used for SNN modeling. However, the spiking features of such architectures are generated by layer-wise Integrate-and-Fire (IF) dynamics, so the firing of deep neurons is independent of the membrane potentials (MPs) of neurons in shallow layers. In this paper, we propose the Multi-Compartment Spiking Neural Network (MC-SNN), which integrates the MPs of shallow neurons into the firing of deep neurons. Specifically, the MC-SNN is built from a Multi-Compartment Leaky Integrate-and-Fire (MC-LIF) neuron and an Adaptive Gating Unit (AGU). The MC-LIF neuron models IF dynamics using MPs from both deep and shallow layers, and the AGU adaptively scales the MPs of the MC-LIF neuron. Together, these components increase the information interaction between spiking neurons and improve the performance of SNNs. In addition, we design a Binarized Synaptic Encoder (BSE) to reduce the computation cost of the SNN input. Experimental results show that the MC-SNN performs well on neuromorphic datasets, reaching 79.52% accuracy on CIFAR10-DVS and 81.24% on N-Caltech101.
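To make the mechanism concrete, the sketch below shows a minimal PyTorch-style multi-compartment LIF update. The leaky integration and hard reset follow the standard discrete-time LIF formulation; everything specific to this paper is approximated: the sigmoid gate standing in for the AGU, the class name MCLIFNeuron, and the way the shallow-layer MP is mixed in are illustrative assumptions rather than the authors' exact equations, and the surrogate gradient needed to train through the spike nonlinearity is omitted.

    import torch
    import torch.nn as nn

    class MCLIFNeuron(nn.Module):
        """Minimal sketch of a multi-compartment LIF neuron (illustrative only).

        Standard discrete-time LIF: v[t] = tau * v[t-1] + I[t]; a spike fires
        when v[t] >= threshold, followed by a hard reset to 0. The gated
        shallow-layer membrane potential below is an assumption standing in
        for the paper's Adaptive Gating Unit (AGU).
        """

        def __init__(self, tau: float = 0.5, v_threshold: float = 1.0):
            super().__init__()
            self.tau = tau                             # membrane decay constant
            self.v_threshold = v_threshold             # firing threshold
            self.gate = nn.Parameter(torch.zeros(1))   # learnable AGU-like gate
            self.v = None                              # membrane potential state

        def forward(self, current: torch.Tensor, v_shallow: torch.Tensor):
            if self.v is None:
                self.v = torch.zeros_like(current)
            # Leaky integration of the input current.
            v = self.tau * self.v + current
            # Mix in the shallow-layer MP, scaled by an adaptive gate in (0, 1).
            v = v + torch.sigmoid(self.gate) * v_shallow
            # Fire where the MP crosses the threshold (no surrogate gradient here).
            spike = (v >= self.v_threshold).float()
            # Hard reset fired neurons; detach state across time steps.
            self.v = (v * (1.0 - spike)).detach()
            return spike

    # Usage over a few simulation time steps (shapes are arbitrary):
    neuron = MCLIFNeuron()
    x = torch.rand(8, 128)          # input currents for a batch of 8
    v_shallow = torch.rand(8, 128)  # MPs forwarded from a shallow layer
    spikes = [neuron(x, v_shallow) for _ in range(4)]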

Acknowledgement

This work was supported in part by the Key-Area Research and Development Program of Guangzhou (202007030004), and in part by the National Natural Science Foundation of China (62076258).

Author information

Correspondence to Jianhuang Lai.

Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Li, X., Tang, J., Lai, J. (2023). Learning High-Performance Spiking Neural Networks with Multi-Compartment Spiking Neurons. In: Lu, H., et al. Image and Graphics. ICIG 2023. Lecture Notes in Computer Science, vol 14356. Springer, Cham. https://doi.org/10.1007/978-3-031-46308-2_8

  • DOI: https://doi.org/10.1007/978-3-031-46308-2_8

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-46307-5

  • Online ISBN: 978-3-031-46308-2

  • eBook Packages: Computer Science, Computer Science (R0)
