NormAD - Normalized Approximate Descent based supervised learning rule for spiking neurons


Abstract:

NormAD is a novel supervised learning algorithm that trains spiking neurons to produce a desired spike train in response to a given input. We show that NormAD converges faster than state-of-the-art supervised learning algorithms for spiking neurons, with the gain in convergence rate often exceeding a factor of 10. The algorithm leverages the fact that a leaky integrate-and-fire neuron can be described as a non-linear spatio-temporal filter, which allows supervised learning to be treated as a mathematically tractable optimization problem with a cost function defined in terms of the membrane potential rather than the spike arrival times. A variant of stochastic gradient descent combined with normalization is used to derive the synaptic weight update rule. NormAD uses leaky integration of the input to determine the synaptic weight change. Since leaky integration is fundamental to all integrate-and-fire models of spiking neurons, we claim that the learning rule applies broadly to other models, such as the adaptive exponential integrate-and-fire neuron, and demonstrate equally good training performance with our algorithm on that model.
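
The update described in the abstract can be illustrated with a short sketch. The following is a minimal Python sketch of a NormAD-style weight update, assuming a discrete-time LIF simulation; the function name normad_update, the error signal taken as the difference of binary desired and observed spike trains, and all parameter names and default values are assumptions for illustration, not the authors' implementation.

import numpy as np

def normad_update(inputs, desired_spikes, observed_spikes,
                  tau_L=10e-3, dt=0.1e-3, r=1e-3):
    """Sketch of a NormAD-style weight update (illustrative, not the paper's code).

    inputs: (n_synapses, n_steps) array of per-synapse input currents
    desired_spikes, observed_spikes: (n_steps,) binary spike trains
    Returns the weight change, shape (n_synapses,).
    """
    n_syn, n_steps = inputs.shape

    # Leaky integration of each synapse's input, i.e. convolution with an
    # exponential kernel exp(-t/tau_L), computed by forward Euler.
    d = np.zeros((n_syn, n_steps))
    for t in range(1, n_steps):
        d[:, t] = d[:, t - 1] * (1.0 - dt / tau_L) + inputs[:, t] * dt

    # Error signal: +1 where a desired spike is missing, -1 at a spurious spike.
    e = desired_spikes.astype(float) - observed_spikes.astype(float)

    # Normalized approximate descent: accumulate the error-weighted,
    # normalized leaky-integrated input over the epoch.
    norms = np.linalg.norm(d, axis=0)   # norm across synapses at each time step
    norms[norms == 0] = 1.0             # avoid division by zero before any input
    dw = r * np.sum(e[None, :] * d / norms[None, :], axis=1)
    return dw

In a full training loop, the neuron would be simulated with the current weights to obtain observed_spikes, the returned dw added to the weights, and the procedure repeated each epoch until the output spike train matches the desired one.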
Date of Conference: 12-17 July 2015
Date Added to IEEE Xplore: 01 October 2015
Conference Location: Killarney, Ireland
