Abstract
Today, the energy consumed by machine learning methods, especially those based on deep neural networks, poses a serious climate problem. To reduce the energy footprint of these systems, research on energy-efficient neural networks has grown enormously. Among the different existing proposals, spiking neural networks are a promising alternative for achieving this goal. These networks use activation functions based on sparse binary spikes over time, which allows a significant reduction in energy consumption. However, one of their main drawbacks is that these activation functions are not differentiable, which prevents training them directly with traditional neural network development software. Due to this limitation, the community has developed different training methods for these networks, together with different libraries that implement them. In this paper, several libraries for the development and training of spiking neural networks are analysed. Their main features are highlighted with the aim of helping researchers and practitioners decide which library best suits their needs when developing spiking neural networks.
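As a brief illustration of the training problem described above, the snippet below sketches a leaky integrate-and-fire neuron trained with a surrogate gradient in PyTorch: the hard Heaviside spike is kept in the forward pass, while a smooth fast-sigmoid derivative (in the spirit of SuperSpike) replaces its ill-defined gradient in the backward pass. This is a minimal sketch under assumed names (`SurrogateSpike`, `lif_step`) and parameters, not the API of any of the libraries analysed in the paper.

```python
import torch

class SurrogateSpike(torch.autograd.Function):
    """Heaviside spike in the forward pass; fast-sigmoid surrogate
    gradient in the backward pass (illustrative, not library code)."""

    @staticmethod
    def forward(ctx, v_minus_threshold):
        ctx.save_for_backward(v_minus_threshold)
        return (v_minus_threshold > 0).float()

    @staticmethod
    def backward(ctx, grad_output):
        (v,) = ctx.saved_tensors
        # Smooth pseudo-derivative used in place of the Dirac delta.
        surrogate = 1.0 / (1.0 + 10.0 * v.abs()) ** 2
        return grad_output * surrogate


def lif_step(x, v, tau=0.9, threshold=1.0):
    """One discrete time step of a leaky integrate-and-fire neuron."""
    v = tau * v + x                      # leaky integration of input current
    spikes = SurrogateSpike.apply(v - threshold)
    v = v - spikes * threshold           # soft reset after a spike
    return spikes, v


# Toy usage: unroll 50 time steps and backpropagate through the spikes.
torch.manual_seed(0)
w = torch.randn(10, 1, requires_grad=True)
x_seq = torch.rand(50, 4, 10)            # (time, batch, features)
v = torch.zeros(4, 1)
total_spikes = 0.0
for t in range(50):
    spikes, v = lif_step(x_seq[t] @ w, v)
    total_spikes = total_spikes + spikes.sum()
total_spikes.backward()                   # possible thanks to the surrogate gradient
print(w.grad.shape)
```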
Acknowledgments
This work has been supported by the Regional Government of Andalusia, under the program “Personal Investigador Doctor”, reference DOC_00235.