
Investigating Current-Based and Gating Approaches for Accurate and Energy-Efficient Spiking Recurrent Neural Networks

  • Conference paper
  • In: Artificial Neural Networks and Machine Learning – ICANN 2022 (ICANN 2022)

Abstract

Spiking Neural Networks (SNNs), with their spike-based computation and communication, may be more energy-efficient than Artificial Neural Networks (ANNs) for embedded applications. However, SNNs have mostly been applied to image processing, although audio applications may better fit their temporal dynamics. We evaluate the accuracy and energy efficiency of Leaky Integrate-and-Fire (LIF) models on spiking audio datasets, comparing them with ANNs. We demonstrate that, for processing temporal sequences, the Current-based LIF (Cuba-LIF) outperforms the LIF. Moreover, since gated recurrent networks have demonstrated higher accuracy than simple recurrent networks on such tasks, we introduce SpikGRU, a gated version of the Cuba-LIF. SpikGRU achieves higher accuracy than other recurrent SNNs on the most difficult task studied in this work. The Cuba-LIF and SpikGRU reach state-of-the-art accuracy, within 1.1% of the best ANNs, while requiring up to 49x fewer operations than ANNs thanks to their high spike sparsity.
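To make the neuron models concrete, below is a minimal NumPy sketch of the forward dynamics the abstract refers to. The LIF and Cuba-LIF updates follow the standard discrete-time formulation from the surrogate-gradient SNN literature; the gated variant only illustrates the general idea of gating the synaptic current with a GRU-style update gate and is not the paper's exact SpikGRU formulation. All sizes, weights, decay factors (alpha, beta) and the threshold theta are toy values chosen for illustration.

import numpy as np

rng = np.random.default_rng(0)

def heaviside(x):
    # Spike (1.0) wherever the membrane potential exceeds threshold.
    # Training would replace this non-differentiable step with a
    # surrogate gradient; this sketch covers the forward pass only.
    return (x > 0.0).astype(np.float32)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy sizes: 4 input channels, 3 recurrent neurons, 10 time steps.
n_in, n_hid, T = 4, 3, 10
x = (rng.random((T, n_in)) < 0.2).astype(np.float32)  # sparse input spike trains

W = rng.normal(0.0, 0.5, (n_hid, n_in)).astype(np.float32)   # feedforward weights
V = rng.normal(0.0, 0.5, (n_hid, n_hid)).astype(np.float32)  # recurrent weights
alpha, beta, theta = 0.9, 0.8, 1.0  # membrane decay, current decay, threshold

# --- LIF: a single state variable, the membrane potential u ---
u = np.zeros(n_hid, np.float32)
s = np.zeros(n_hid, np.float32)
for t in range(T):
    u = alpha * u + W @ x[t] + V @ s  # leaky integration of input/recurrent spikes
    s = heaviside(u - theta)          # fire
    u = u - theta * s                 # soft reset by subtraction

# --- Cuba-LIF: adds a synaptic current i that low-pass filters the inputs,
# giving the neuron a second (synaptic) time constant ---
i = np.zeros(n_hid, np.float32)
u = np.zeros(n_hid, np.float32)
s = np.zeros(n_hid, np.float32)
for t in range(T):
    i = beta * i + W @ x[t] + V @ s   # current integrates spikes with its own decay
    u = alpha * u + i                 # membrane integrates the current
    s = heaviside(u - theta)
    u = u - theta * s

# --- Gated current (SpikGRU-like, hypothetical formulation): a GRU-style
# update gate z replaces the fixed decay beta with a learned,
# input-dependent one ---
Wz = rng.normal(0.0, 0.5, (n_hid, n_in)).astype(np.float32)   # gate weights (illustrative)
Vz = rng.normal(0.0, 0.5, (n_hid, n_hid)).astype(np.float32)
i = np.zeros(n_hid, np.float32)
u = np.zeros(n_hid, np.float32)
s = np.zeros(n_hid, np.float32)
for t in range(T):
    z = sigmoid(Wz @ x[t] + Vz @ s)             # update gate in (0, 1)
    i = z * i + (1.0 - z) * (W @ x[t] + V @ s)  # gated current accumulation
    u = alpha * u + i
    s = heaviside(u - theta)
    u = u - theta * s

Because the hidden state s is a sparse spike vector, the matrix-vector products V @ s reduce, on event-driven hardware, to accumulations over the few active columns only; this spike sparsity is the source of the operation-count reduction over dense ANN multiply-accumulates.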



Acknowledgements

This work has been partially supported by MIAI @ Grenoble Alpes (ANR-19-P3IA-0003).

Author information

Corresponding author

Correspondence to Manon Dampfhoffer.



Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Dampfhoffer, M., Mesquida, T., Valentian, A., Anghel, L. (2022). Investigating Current-Based and Gating Approaches for Accurate and Energy-Efficient Spiking Recurrent Neural Networks. In: Pimenidis, E., Angelov, P., Jayne, C., Papaleonidas, A., Aydin, M. (eds) Artificial Neural Networks and Machine Learning – ICANN 2022. ICANN 2022. Lecture Notes in Computer Science, vol 13531. Springer, Cham. https://doi.org/10.1007/978-3-031-15934-3_30


  • DOI: https://doi.org/10.1007/978-3-031-15934-3_30


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-15933-6

  • Online ISBN: 978-3-031-15934-3

  • eBook Packages: Computer Science, Computer Science (R0)
