STDG: Fast and Lightweight SNN Training Technique Using Spike Temporal Locality
Zhengyu Cai, Hamid Rahimian Kalatehbali, Ben Walters, Mostafa Rahimi Azghadi, Amirali Amirsoleimani, Roman Genov
Zhengyu Cai
Department of Electrical and Computer Engineering, University of Toronto
Hamid Rahimian Kalatehbali
Department of Electrical Engineering and Computer Science, York University
Ben Walters
College of Science and Engineering, James Cook University
Mostafa Rahimi Azghadi
College of Science and Engineering, James Cook University
Amirali Amirsoleimani
Department of Electrical Engineering and Computer Science, York University

Corresponding Author: [email protected]

Roman Genov
Department of Electrical and Computer Engineering, University of Toronto

Abstract

Spiking neural networks (SNNs) offer biological plausibility and energy efficiency because they communicate through asynchronous and mostly sparse spikes, making them an attractive choice for efficient neuromorphic computing. However, the discrete, binary spike events transmitted in SNNs are non-differentiable, which prevents gradient-based optimization algorithms from being applied to these networks directly. Training techniques are therefore needed that sidestep this limitation without sacrificing accuracy or energy efficiency. In this work, we propose Spike Timing Dependent Gradient (STDG), a fast and lightweight learning scheme that exploits the temporal locality among spikes to avoid computing derivatives of the non-differentiable spike function. Our experiments show that STDG achieves state-of-the-art accuracies of 99.5% and 98.2% on the Caltech101 face/motorbike and MNIST datasets, respectively.
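
To give a concrete sense of what a spike-timing-based local update looks like (as opposed to backpropagating through the spike function), the minimal sketch below shows a generic STDP-style rule driven by the timing difference between a pre- and a post-synaptic spike. This is an illustration only and is not the STDG rule proposed in this work; the function name, constants, and weight bounds are assumptions chosen for the example.

```python
import numpy as np

# Illustrative sketch only: a generic spike-timing-based local weight update
# (STDP-style), NOT the paper's STDG rule. It shows how temporal locality
# between pre- and post-synaptic spikes can drive learning without
# differentiating the discrete spike function.
# All names and constants below are assumptions for illustration.

def timing_based_update(w, t_pre, t_post, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Update a single weight from the timing difference between one
    pre-synaptic spike (t_pre) and one post-synaptic spike (t_post),
    both given in milliseconds."""
    dt = t_post - t_pre
    if dt >= 0:
        # Pre fires before post: potentiate, scaled by temporal proximity.
        dw = a_plus * np.exp(-dt / tau)
    else:
        # Post fires before pre: depress.
        dw = -a_minus * np.exp(dt / tau)
    # Keep the weight within a bounded range.
    return float(np.clip(w + dw, 0.0, 1.0))


# Example: a pre-spike at 10 ms followed by a post-spike at 15 ms
# strengthens the synapse; the reverse ordering weakens it.
w = 0.5
w = timing_based_update(w, t_pre=10.0, t_post=15.0)
print(f"after causal pairing:      w = {w:.4f}")
w = timing_based_update(w, t_pre=15.0, t_post=10.0)
print(f"after anti-causal pairing: w = {w:.4f}")
```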