
Fine-tuning with local learning rules helps to compress and accelerate spiking neural networks without accuracy loss

  • Original Article
Neural Computing and Applications

Abstract

Spiking neural networks (SNNs) are believed to be highly energy- and computationally efficient machine learning algorithms, especially when implemented on neuromorphic hardware. Several recent studies have shown that lateral (intralayer) inhibitory connectivity is necessary for effective and stable learning in SNNs. For large-scale SNNs, however, lateral inhibitory connections require a large amount of additional computation, which negatively affects both inference time and the size of the required computing resources. In this study, we propose a fine-tuning procedure based on original local learning rules, called FEELING, applied to the weights of an interneuron sublayer that is introduced to organize more efficient competition between excitatory neurons. The interneuron weights are initialized by singular value decomposition (SVD) of the intralayer inhibitory weight matrix of the original SNN architecture before optimization. The proposed procedure makes it possible to compress and accelerate large-scale SNNs with lateral inhibition in their layers while maintaining recognition accuracy, as shown on the MNIST dataset. We demonstrate that this optimization technique is superior to simple pruning of inhibitory connections, even when pruning is followed by fine-tuning. Moreover, this fine-tuned decomposition suggests assigning excitatory and inhibitory functions to two distinct sublayers of neurons, as is naturally observed in biological neural systems. We hope that the findings of this study not only reveal new aspects of efficient computation and biologically plausible architectures for SNNs but also suggest a hypothetical reason for the evolutionary preference of inhibitory neurons over inhibitory connections.
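To make the decomposition step concrete, below is a minimal NumPy sketch of how the SVD-based initialization described above could look: the dense intralayer inhibitory matrix of the original SNN is truncated to rank k and split into excitatory-to-interneuron and interneuron-to-excitatory weight matrices. The function and variable names (init_interneuron_sublayer, n_exc, n_int) and the even split of singular values between the two factors are illustrative assumptions, not details taken from the paper.

# Sketch only, not the authors' implementation: rank-k SVD factorization of the
# dense lateral-inhibition matrix into an explicit interneuron sublayer.
import numpy as np

def init_interneuron_sublayer(w_inh: np.ndarray, n_int: int):
    """Factorize a dense lateral-inhibition matrix into two sublayer matrices.

    w_inh : (n_exc, n_exc) intralayer inhibitory weights of the original SNN
    n_int : number of interneurons (truncation rank k of the SVD)

    Returns (w_exc_to_int, w_int_to_exc) such that
    w_int_to_exc @ w_exc_to_int approximates w_inh.
    """
    u, s, vt = np.linalg.svd(w_inh, full_matrices=False)
    u_k, s_k, vt_k = u[:, :n_int], s[:n_int], vt[:n_int, :]   # keep k largest singular values
    # Split each singular value evenly between the two factors (illustrative choice).
    w_exc_to_int = np.sqrt(s_k)[:, None] * vt_k               # excitatory -> interneuron
    w_int_to_exc = u_k * np.sqrt(s_k)[None, :]                 # interneuron -> excitatory
    return w_exc_to_int, w_int_to_exc

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n_exc, n_int = 400, 40
    # Toy all-to-all inhibitory matrix (zero self-inhibition), for illustration only.
    w_inh = -np.abs(rng.normal(size=(n_exc, n_exc)))
    np.fill_diagonal(w_inh, 0.0)
    w_ei, w_ie = init_interneuron_sublayer(w_inh, n_int)
    err = np.linalg.norm(w_inh - w_ie @ w_ei) / np.linalg.norm(w_inh)
    print(f"relative approximation error: {err:.3f}")
    print(f"operations per step: {n_exc**2} -> {2 * n_exc * n_int}")

In such a factorization the per-timestep cost of lateral inhibition drops roughly from n_exc^2 to 2·n_exc·k multiply-accumulates when k is much smaller than n_exc, which is where the compression and acceleration come from; in the article, the FEELING local learning rules are then used to fine-tune the interneuron sublayer weights.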



Acknowledgements

This work was conducted using the computing resources of the federal collective usage center, Complex for Simulation and Data Processing for Mega-science Facilities at NRC “Kurchatov Institute” (http://ckp.nrcki.ru/), and was supported by RFBR grant no. 18-29-23041\20 in the part concerning local fine-tuning experiments for the decomposed SNN architecture.

Author information


Corresponding author

Correspondence to D. V. Nekhaev.

Ethics declarations

Conflict of interest

The authors declare no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Nekhaev, D.V., Demin, V.A. Fine-tuning with local learning rules helps to compress and accelerate spiking neural networks without accuracy loss. Neural Comput & Applic 34, 20687–20700 (2022). https://doi.org/10.1007/s00521-022-07513-w

