
Correlation-based regularization for fast and energy-efficient spiking neural networks

Published: 06 May 2022
DOI: 10.1145/3477314.3507085

Abstract

A spiking neural network (SNN) inherently requires multiple time steps to produce a stable output. In addition, the energy consumed by the neuromorphic hardware on which a trained SNN is deployed is proportional to the number of spikes generated during execution. Reducing the number of time steps and the number of generated spikes as much as possible is therefore crucial for fast, energy-efficient execution. This paper proposes a correlation-based regularizer, incorporated into the loss function, to address these issues. It minimizes the redundancy between the features at each layer, producing a structural sparsity that benefits energy efficiency and, in turn, execution speed. We evaluated the proposed regularizer on two kinds of models, a vanilla model and a model trained with a rate-based regularizer widely used for energy-efficient execution, and observed that it generated fewer spikes than existing approaches while limiting the degradation in accuracy.
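
To make the mechanism concrete, below is a minimal sketch of what a correlation-based regularization term could look like, written in PyTorch. It is an illustration under stated assumptions, not the paper's implementation: the abstract does not specify how feature redundancy is measured, so this sketch penalizes the mean squared off-diagonal correlation between per-feature firing rates, and all names (correlation_penalty, layer_spikes, lam) are hypothetical.

import torch

def correlation_penalty(spikes: torch.Tensor) -> torch.Tensor:
    """Hypothetical redundancy penalty for one SNN layer.

    spikes: (batch, time, features) tensor of 0/1 spikes recorded
    over all simulation time steps of the layer.
    """
    # Firing rate of each feature: spike count averaged over time steps.
    rates = spikes.float().mean(dim=1)              # (batch, features)
    # Center each feature across the batch.
    centered = rates - rates.mean(dim=0, keepdim=True)
    # Sample covariance and correlation matrices between features.
    cov = centered.T @ centered / max(rates.shape[0] - 1, 1)
    std = centered.std(dim=0).clamp_min(1e-8)
    corr = cov / (std[:, None] * std[None, :])
    # Penalize squared off-diagonal correlations, i.e. pairs of
    # features that fire redundantly.
    off_diag = corr - torch.diag(torch.diag(corr))
    return off_diag.pow(2).mean()

# Hypothetical training step: the penalty for each layer's recorded
# spikes is added to the task loss with a weighting coefficient lam.
#
#   loss = criterion(outputs, targets) \
#          + lam * sum(correlation_penalty(s) for s in layer_spikes)

Driving these correlations toward zero discourages groups of neurons from firing in lockstep, which is one plausible way redundancy between features translates into superfluous spikes; the regularizer in the paper may differ in both the statistic used and its normalization.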

Cited By

  • (2023) Direct learning-based deep spiking neural networks: a review. Frontiers in Neuroscience 17. DOI: 10.3389/fnins.2023.1209795. Online publication date: 16-Jun-2023.

    Published In

    SAC '22: Proceedings of the 37th ACM/SIGAPP Symposium on Applied Computing
    April 2022, 2099 pages
    ISBN: 9781450387132
    DOI: 10.1145/3477314
    This work is licensed under a Creative Commons Attribution 4.0 International License.

    Publisher

    Association for Computing Machinery, New York, NY, United States

    Author Tags

    1. SNN training
    2. energy consumption
    3. regularization
    4. spiking neural networks

    Qualifiers

    • Research-article

    Funding Sources

    • Institute for Information & Communications Technology Promotion (IITP), funded by the Korea government (MSIT)

    Conference

    SAC '22

    Acceptance Rates

    Overall Acceptance Rate 1,650 of 6,669 submissions, 25%
