EB-SNN: An Ensemble Binary Spiking Neural Network for Visual Recognition

  • Conference paper
  • Pattern Recognition (ICPR 2024)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 15308)


Abstract

In recent years, spiking neural networks (SNNs) have gained significant attention in visual recognition tasks due to their low computational energy consumption. However, most SNNs have a large number of parameters, which limits their use on resource-limited devices. In this paper, we propose an Ensemble Binary Spiking Neural Network (EB-SNN) for accurate and memory-friendly visual recognition. The EB-SNN is built on an Ensemble Binary Weights (EBW) module, which integrates multiple binary weights for lightweight SNN modeling. Meanwhile, we propose a Knowledge Alignment Strategy to ensure that the EB-SNN approximates a well-trained SNN and thus retains good performance. Experimental results show that the EB-SNN achieves an accuracy of 95.39% on CIFAR10 while using only 9.3% of the memory of a full-precision SNN.
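The abstract does not specify how the EBW module combines its binary weights. As a hypothetical illustration of the general idea of ensembling binary weights, the sketch below approximates a real-valued weight tensor as a scaled sum of {-1, +1} tensors fitted greedily to the residual, in the style of multi-bit binarization schemes. The function name, the greedy fitting procedure, and the choice of NumPy are all assumptions for illustration, not the authors' method.

```python
import numpy as np

def ensemble_binary_approx(w, k=3):
    """Approximate a real-valued weight tensor w by an ensemble of k binary
    (+1/-1) tensors with scalar scales: w ~ sum_i a_i * B_i.

    NOTE: hypothetical sketch, not the paper's EBW module. Each base
    binarizes the current residual; the scale a_i = mean(|residual|) is the
    least-squares-optimal scale for a sign base.
    """
    residual = w.astype(np.float64).copy()
    bases, scales = [], []
    for _ in range(k):
        b = np.where(residual >= 0, 1.0, -1.0)  # binary base in {-1, +1}
        a = np.abs(residual).mean()             # optimal L2 scale for sign(residual)
        bases.append(b)
        scales.append(a)
        residual = residual - a * b             # fit the next base to what remains
    approx = sum(a * b for a, b in zip(scales, bases))
    return approx, bases, scales
```

Storing k binary tensors plus k scalars instead of one full-precision tensor is what yields the memory savings: k bits per weight (plus negligible scale overhead) versus 32 bits. The approximation error shrinks as more binary bases are added.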



Acknowledgement

This work was supported in part by the Key-Area Research and Development Program of Guangzhou (202007030004), and in part by the National Natural Science Foundation of China (62076258).

Author information

Correspondence to Jianhuang Lai.


Copyright information

© 2025 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Li, X., Tang, J., Lai, J. (2025). EB-SNN: An Ensemble Binary Spiking Neural Network for Visual Recognition. In: Antonacopoulos, A., Chaudhuri, S., Chellappa, R., Liu, CL., Bhattacharya, S., Pal, U. (eds) Pattern Recognition. ICPR 2024. Lecture Notes in Computer Science, vol 15308. Springer, Cham. https://doi.org/10.1007/978-3-031-78186-5_21

  • DOI: https://doi.org/10.1007/978-3-031-78186-5_21

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-78185-8

  • Online ISBN: 978-3-031-78186-5

  • eBook Packages: Computer Science (R0)
