ABSTRACT
Online learning in neural networks has the potential to transform AI research. By enabling existing systems to assimilate new information, platforms can adapt to unseen data and personalise their performance to an individual user. A common approach to providing AI to a user is to send queries to a remote cloud service, which processes the information and sends back a response. Neuromorphic hardware offers an alternative: a dedicated computing platform on which neural networks can be run locally and efficiently. This work explores the potential of the SpiNNaker neuromorphic hardware to run the eligibility propagation (e-prop) algorithm on chip while learning online in real time.
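The e-prop rule mentioned above replaces backpropagation through time with a purely local, online update of the form ΔW_ji = −η Σ_t L_j(t) · e_ji(t), where e_ji(t) is an eligibility trace maintained at each synapse and L_j(t) is a learning signal broadcast to neuron j. A minimal sketch for leaky integrate-and-fire (LIF) neurons is shown below; all variable names, parameter values, and the simple readout-error learning signal are illustrative assumptions, not the paper's actual SpiNNaker implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

n_in, n_rec, n_steps = 4, 3, 20
alpha = 0.9          # membrane leak factor, exp(-dt/tau_m)
v_th = 0.5           # firing threshold
lr = 1e-2            # learning rate (assumed value)

w_in = rng.normal(0, 0.3, size=(n_rec, n_in))
spikes_in = (rng.random((n_steps, n_in)) < 0.3).astype(float)
target = rng.random((n_steps, n_rec))       # dummy target signal

v = np.zeros(n_rec)                         # membrane potentials
trace = np.zeros((n_rec, n_in))             # low-pass filtered presynaptic spikes
dw = np.zeros_like(w_in)

def pseudo_derivative(v, v_th, gamma=0.3):
    """Surrogate gradient of the non-differentiable spike function."""
    return gamma * np.maximum(0.0, 1.0 - np.abs((v - v_th) / v_th))

for t in range(n_steps):
    v = alpha * v + spikes_in[t] @ w_in.T   # LIF membrane update
    z = (v >= v_th).astype(float)           # output spikes
    v -= z * v_th                           # soft reset after spiking
    trace = alpha * trace + spikes_in[t][None, :]
    # eligibility trace: pseudo-derivative gated filtered input
    e = pseudo_derivative(v, v_th)[:, None] * trace
    # learning signal: a simple readout error (assumption for this sketch)
    L = v - target[t]
    dw -= lr * L[:, None] * e               # accumulate the online update

w_in += dw
```

The key property for neuromorphic hardware is that each term (`trace`, `e`, `L`) is computed forward in time with state local to a synapse or neuron, so no activity history needs to be stored and replayed as in backpropagation through time.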