
Learning improvement of spiking neural networks with dynamic adaptive hyperparameter neurons

Published in Applied Intelligence

Abstract

Spiking neural networks (SNNs), which use spiking neurons as their basic components, have shown substantial promise in simulating biological neuron mechanisms and reducing computing power. However, the spiking neurons adopted in SNNs still rely on preset or suboptimal hyperparameters, and the heterogeneity of the neurons is limited, which restricts SNN inference accuracy. Inspired by the neuroscience observation that these hyperparameters are related to the membrane potential of a neuron, this paper proposes a new module that implements dynamically adaptive hyperparameters, giving spiking neurons flexible hyperparameters. In addition, inspired by the neuroscience observation that heterogeneity in the driving current shapes the synaptic integration process, we propose a second module that distributes synaptic driving-force factors across neurons to make synaptic integration as rational as possible. Together, these methods give SNNs fast convergence and more appropriate neuron dynamics during inference. Finally, the effects of the proposed adaptive hyperparameters and the driving-force distribution mechanism are evaluated on several datasets. The results show that SNNs equipped with our methods achieve improved accuracy on all test datasets, are robust to different initial hyperparameters, and exhibit more realistic biological behavior.
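The core idea of hyperparameters that adapt with the membrane potential can be sketched in a few lines. The following is an illustrative toy, not the authors' formulation: the class name, the threshold-adaptation rule, and all constants are assumptions made for exposition.

```python
# Sketch (assumed, not the paper's exact method): a leaky integrate-and-fire
# neuron whose firing threshold is a dynamic hyperparameter coupled to the
# membrane potential, instead of a preset constant.

class AdaptiveLIFNeuron:
    def __init__(self, tau=2.0, base_threshold=0.5, adapt_rate=0.1):
        self.tau = tau                    # membrane decay constant (assumed value)
        self.base_threshold = base_threshold
        self.adapt_rate = adapt_rate      # coupling of threshold to potential (assumed)
        self.v = 0.0                      # membrane potential
        self.threshold = base_threshold   # dynamic hyperparameter

    def step(self, input_current):
        # Leaky integration of the synaptic input current.
        self.v = self.v + (input_current - self.v) / self.tau
        # Hypothetical adaptation rule: the threshold drifts toward a target
        # set by the current membrane potential, mimicking the threshold
        # dynamics observed in biological neurons.
        target = self.base_threshold + self.adapt_rate * self.v
        self.threshold += 0.5 * (target - self.threshold)
        if self.v >= self.threshold:
            self.v = 0.0                  # reset after a spike
            return 1
        return 0

neuron = AdaptiveLIFNeuron()
spikes = [neuron.step(0.8) for _ in range(20)]
```

The paper's second module, heterogeneous synaptic driving-force factors, could be sketched analogously by scaling each synapse's contribution to `input_current` with a per-synapse factor before integration.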




Data Availability

All data generated or analysed during this study are included in this published article.


Funding

This work was supported by the Zhejiang Key Research and Development Project (2022C01048).

Author information


Contributions

Conceptualization: Jiakai Liang; Methodology: Jiakai Liang; Formal analysis and investigation: Chao Wang; Writing, original draft preparation: Jiakai Liang, Chao Wang; Writing, review and editing: De Ma, Ruixue Li, Keqiang Yue; Funding acquisition: Wenjun Li; Resources: Wenjun Li; Supervision: Wenjun Li

Corresponding author

Correspondence to Wenjun Li.

Ethics declarations

Competing Interests

The authors declare that there is no conflict of interest regarding the publication of this paper.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Liang, J., Wang, C., Ma, D. et al. Learning improvement of spiking neural networks with dynamic adaptive hyperparameter neurons. Appl Intell 54, 9158–9176 (2024). https://doi.org/10.1007/s10489-024-05629-1

