
Accelerating Spatiotemporal Supervised Training of Large-Scale Spiking Neural Networks on GPU



Abstract:

Spiking neural networks (SNNs) have great potential to achieve brain-like intelligence; however, they suffer from the low accuracy of conventional synaptic plasticity rules and low training efficiency on GPUs. Recently, emerging learning algorithms inspired by backpropagation through time (BPTT) have brought new opportunities to boost the accuracy of SNNs, but training on GPUs remains inefficient due to the complex spatiotemporal dynamics and huge memory consumption, which restricts model exploration for SNNs and hinders the advance of neuromorphic computing. In this work, we build a framework that addresses the inefficiency of BPTT-based SNN training on modern GPUs. To reduce memory consumption, we optimize the dataflow by saving only the CONV/FC results in the forward pass and recomputing the other intermediate results in the backward pass. We then customize kernel functions to accelerate the neural dynamics in all training stages. Finally, we provide a PyTorch interface to make our framework easy to deploy in real systems. Compared to a vanilla PyTorch implementation, our framework achieves up to 2.13× end-to-end speedup while consuming only 0.41× the peak memory on the CIFAR10 dataset. Moreover, for distributed training on the large ImageNet dataset, it achieves up to 1.81× end-to-end speedup while consuming only 0.38× the peak memory.
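The memory-saving dataflow described above can be sketched in miniature. The toy code below (a hedged illustration only; the constants, function names, and scalar LIF dynamics are illustrative assumptions, not the paper's actual implementation) shows the idea: the forward pass keeps only the layer outputs (spikes), while the membrane-potential trace needed for the backward pass is recomputed on demand from the saved inputs rather than stored.

```python
# Illustrative constants (not from the paper).
THRESH, DECAY = 1.0, 0.5

def lif_forward(currents):
    """Run leaky integrate-and-fire (LIF) dynamics over time, keeping
    only the spike outputs -- analogous to saving CONV/FC results and
    discarding membrane potentials to cut peak memory."""
    v, spikes = 0.0, []
    for x in currents:
        v = DECAY * v + x                 # leaky integration
        s = 1.0 if v >= THRESH else 0.0   # fire on threshold crossing
        v = v * (1.0 - s)                 # hard reset after a spike
        spikes.append(s)
    return spikes                         # potentials are NOT stored

def recompute_potentials(currents):
    """Backward-pass helper: rebuild the membrane-potential trace from
    the saved inputs instead of materializing it in the forward pass."""
    v, trace = 0.0, []
    for x in currents:
        v = DECAY * v + x
        trace.append(v)
        if v >= THRESH:
            v = 0.0                       # replay the same reset rule
    return trace

currents = [0.6, 0.6, 0.6, 0.2]
print(lif_forward(currents))              # spike train
print(recompute_potentials(currents))     # recomputed potential trace
```

The trade-off is classic recomputation (as in gradient checkpointing): the backward pass repeats the cheap elementwise neural dynamics, in exchange for never holding the full spatiotemporal state in memory.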
Date of Conference: 14-23 March 2022
Date Added to IEEE Xplore: 19 May 2022

Conference Location: Antwerp, Belgium

