A Preliminary Analysis on Software Frameworks for the Development of Spiking Neural Networks

  • Conference paper
  • First Online:
Hybrid Artificial Intelligent Systems (HAIS 2021)

Abstract

Today, the energy consumed by machine learning methods, especially those based on deep neural networks, poses a serious climate problem. To reduce the energy footprint of these systems, research on energy-efficient neural networks has grown considerably. Among the existing proposals, spiking neural networks are a promising alternative for achieving this goal. These networks use activation functions based on sparse binary spikes over time, which allows a significant reduction in energy consumption. However, one of their main drawbacks is that these activation functions are not differentiable, which prevents direct training with traditional neural network software. Because of this limitation, the community has developed several training methods for these networks, together with libraries that implement them. In this paper, different libraries for the development and training of spiking neural networks are analysed. Their main features are highlighted with the aim of helping researchers and practitioners choose the tools best suited to their needs.
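The non-differentiability mentioned in the abstract can be made concrete with a short sketch. Below is a minimal, illustrative leaky integrate-and-fire (LIF) neuron in plain Python; the function name and parameter values are our own choices for illustration, not the API of any library surveyed in the paper. The Heaviside step that emits the binary spikes is the activation whose derivative is zero almost everywhere (and undefined at threshold), which is what blocks direct backpropagation.

```python
def lif_simulate(input_current, tau=10.0, v_th=1.0, v_reset=0.0, dt=1.0):
    """Minimal leaky integrate-and-fire (LIF) neuron.

    The membrane potential v leaks toward zero and integrates the input
    current; whenever v crosses the threshold v_th the neuron emits a
    binary spike (a Heaviside step) and v is reset. It is this step
    function, not the membrane dynamics, that is non-differentiable.
    """
    v = v_reset
    spikes = []
    for i_t in input_current:
        v = v + (dt / tau) * (-v + i_t)   # leaky integration of the input
        spike = 1 if v >= v_th else 0     # Heaviside step -> binary spike
        if spike:
            v = v_reset                   # reset the membrane after a spike
        spikes.append(spike)
    return spikes
```

Surrogate-gradient training, one of the approaches implemented by several of the libraries analysed in the paper, works around this by replacing the derivative of the step with a smooth approximation (for example, the derivative of a sigmoid) during the backward pass only, while the forward pass keeps emitting binary spikes.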



Acknowledgments

This work has been supported by the Regional Government of Andalusia, under the program “Personal Investigador Doctor”, reference DOC_00235.

Author information

Correspondence to Ángel M. García-Vico.


Copyright information

© 2021 Springer Nature Switzerland AG

About this paper

Cite this paper

García-Vico, Á.M., Herrera, F. (2021). A Preliminary Analysis on Software Frameworks for the Development of Spiking Neural Networks. In: Sanjurjo González, H., Pastor López, I., García Bringas, P., Quintián, H., Corchado, E. (eds.) Hybrid Artificial Intelligent Systems. HAIS 2021. Lecture Notes in Computer Science, vol. 12886. Springer, Cham. https://doi.org/10.1007/978-3-030-86271-8_47

  • DOI: https://doi.org/10.1007/978-3-030-86271-8_47

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-86270-1

  • Online ISBN: 978-3-030-86271-8

  • eBook Packages: Computer Science (R0)
