DOI: 10.1145/3546790.3546814
Public Access

Think Fast: Time Control in Varying Paradigms of Spiking Neural Networks

Published: 07 September 2022

Abstract

The state of the art in machine learning has been achieved primarily by deep learning artificial neural networks. These networks are powerful but biologically implausible and energy intensive. In parallel, a new paradigm of neural network is being researched that can alleviate some of these computational and energy issues. These networks, spiking neural networks (SNNs), have transformative potential if the community is able to bridge the gap between deep learning and SNNs. However, SNNs are notoriously difficult to train and lack precision in their communication. In an effort to overcome these limitations while retaining the benefits of deep learning's training process, we investigate novel ways to translate between the two paradigms. We construct several network designs with varying degrees of biological plausibility, test them on an image classification task, and demonstrate that they allow for a customized tradeoff among biological plausibility, power efficiency, inference time, and accuracy.
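
To make the abstract's notion of translating between the paradigms concrete, the sketch below illustrates the standard rate-coding view of ANN-to-SNN conversion: a layer of non-leaky integrate-and-fire neurons with reset-by-subtraction whose firing rate approximates a ReLU activation, and whose approximation tightens as the simulation window grows. This is a generic illustration under assumed parameters (unit threshold, constant input current), not the authors' implementation; the names lif_rate and n_steps are hypothetical.

    import numpy as np

    def relu(x):
        return np.maximum(x, 0.0)

    def lif_rate(x, n_steps, threshold=1.0):
        # Non-leaky integrate-and-fire neurons driven by a constant input
        # current x; returns the firing rate observed over n_steps.
        v = np.zeros_like(x)            # membrane potentials
        spike_counts = np.zeros_like(x)
        for _ in range(n_steps):
            v += x                      # integrate the input current
            fired = v >= threshold      # spike on threshold crossing
            spike_counts += fired
            v[fired] -= threshold       # reset by subtraction keeps the remainder
        return spike_counts / n_steps

    x = np.array([-0.5, 0.1, 0.35, 0.8])   # hypothetical pre-activations
    for T in (10, 100, 1000):
        err = np.abs(lif_rate(x, T) - relu(x)).max()
        print(f"{T:5d} steps: max rate-coding error = {err:.4f}")

Running the loop for more timesteps shrinks the rate-coding error, which is the inference-time versus accuracy tradeoff described above: fewer timesteps mean faster, cheaper inference at the cost of coarser activations.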



Published In

ICONS '22: Proceedings of the International Conference on Neuromorphic Systems 2022
July 2022
213 pages
ISBN: 978-1-4503-9789-6
DOI: 10.1145/3546790
© 2022 Association for Computing Machinery. ACM acknowledges that this contribution was authored or co-authored by an employee, contractor or affiliate of the United States government. As such, the United States Government retains a nonexclusive, royalty-free right to publish or reproduce this article, or to allow others to do so, for Government purposes only.

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. machine learning
  2. neuro-inspired artificial intelligence
  3. neuromorphic computing

Qualifiers

  • Research-article
  • Research
  • Refereed limited


Conference

ICONS

Acceptance Rates

Overall acceptance rate: 13 of 22 submissions (59%)

