
Gradient Descent Learning Algorithm Based on Spike Selection Mechanism for Multilayer Spiking Neural Networks

  • Conference paper
Neural Information Processing (ICONIP 2021)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 13110)


Abstract

Gradient descent is an important approach to supervised learning in spiking neural networks (SNNs). To improve the performance of gradient descent learning in multilayer SNNs, this paper proposes a spike selection mechanism that selects the optimal presynaptic spikes to participate in computing the synaptic weight changes during weight adjustment. The mechanism jointly considers the desired and the actual output spikes of the network: the presynaptic spikes involved in the calculation are restricted to a certain time interval, so that the actual output spikes match the desired output spikes as closely as possible. The mechanism is applied to the gradient descent learning algorithm for multilayer SNNs. Experimental results show that the proposed mechanism gives the gradient descent learning algorithm higher learning accuracy, fewer learning epochs, and shorter running time, indicating that the spike selection mechanism is effective for improving gradient descent learning performance.
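The idea of restricting the weight update to presynaptic spikes near the output spikes can be illustrated with a short sketch. The Python code below is a minimal, single-synapse illustration and not the paper's implementation: the window length, the alpha-shaped spike response kernel srm_kernel, the one-to-one pairing of desired and actual output spikes, and the spike-timing error term are all assumptions introduced here for concreteness; the abstract only states that presynaptic spikes within a certain time interval, determined by the desired and actual output spikes, are selected to compute the weight change.

import numpy as np

def srm_kernel(s, tau=7.0):
    # Alpha-shaped postsynaptic potential kernel (assumed form).
    return (s / tau) * np.exp(1.0 - s / tau) if s > 0 else 0.0

def select_spikes(pre_spikes, ref_spike, window):
    # Keep only presynaptic spikes inside [ref_spike - window, ref_spike].
    return [t for t in pre_spikes if ref_spike - window <= t <= ref_spike]

def weight_update(pre_spikes, desired_spikes, actual_spikes, lr=0.01, window=10.0):
    # Illustrative gradient-descent update restricted to the selected spikes.
    dw = 0.0
    for t_d, t_a in zip(desired_spikes, actual_spikes):
        # Spike selection: only presynaptic spikes near the desired or the
        # actual output spike contribute to this weight change.
        selected = set(select_spikes(pre_spikes, t_d, window))
        selected |= set(select_spikes(pre_spikes, t_a, window))
        err = t_a - t_d  # spike-timing error (assumed error form)
        for t_i in selected:
            dw -= lr * err * srm_kernel(t_a - t_i)
    return dw

# Hypothetical spike times in milliseconds.
pre = [1.0, 3.5, 6.0, 12.0, 18.0]
print(weight_update(pre, desired_spikes=[10.0], actual_spikes=[11.5]))

In this sketch, presynaptic spikes far from both the desired and the actual output spike are simply excluded from the update, which is the effect the selection mechanism is intended to achieve.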



Acknowledgments

This work was supported by the National Natural Science Foundation of China under grant no. 61762080, the Key Research and Development Project of Gansu Province under grant no. 20YF8GA049, the Youth Science and Technology Fund Project of Gansu Province under grant no. 20JR10RA097, and the Lanzhou Municipal Science and Technology Project under grant no. 2019-1-34.

Author information


Corresponding author

Correspondence to Xianghong Lin.



Copyright information

© 2021 Springer Nature Switzerland AG

About this paper


Cite this paper

Lin, X., Hu, T., Wang, X., Lu, H. (2021). Gradient Descent Learning Algorithm Based on Spike Selection Mechanism for Multilayer Spiking Neural Networks. In: Mantoro, T., Lee, M., Ayu, M.A., Wong, K.W., Hidayanto, A.N. (eds) Neural Information Processing. ICONIP 2021. Lecture Notes in Computer Science, vol 13110. Springer, Cham. https://doi.org/10.1007/978-3-030-92238-2_4

Download citation

  • DOI: https://doi.org/10.1007/978-3-030-92238-2_4

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-92237-5

  • Online ISBN: 978-3-030-92238-2

  • eBook Packages: Computer Science, Computer Science (R0)
