OSTI.GOV, U.S. Department of Energy
Office of Scientific and Technical Information

Title: Energy Efficient Stochastic-Based Deep Spiking Neural Networks for Sparse Datasets

Conference

With the large deep neural networks (DNNs) necessary to solve complex and data-intensive problems, energy efficiency is a key bottleneck for effectively deploying deep learning in the real world. Deep spiking neural networks (SNNs) have recently gained much research attention, driven by interest in modeling biological neural networks and by the availability of neuromorphic platforms, which can be orders of magnitude more energy efficient than CPUs and GPUs. Although SNNs have proven to be an efficient technique for solving many machine learning and computer vision problems, to the best of our knowledge this is the first attempt to adapt SNNs to sparse datasets. In this paper, we study the behaviour of SNNs in handling NLP datasets and the sparsity of their data representation. We then propose a novel SNN framework based on the concept of stochastic computing. Specifically, instead of generating a separate spike train for each value in the feature set, with a firing rate proportional to that value's intensity, the whole feature set is treated as a distribution function and a stochastic spike train that follows this distribution is generated. This framework reduces the connectivity between network layers from O(N) to O(log N). It also encodes the input data differently, making it suitable for handling sparse datasets. Finally, the framework achieves high energy efficiency because it uses the same Integrate-and-Fire neurons as conventional SNNs. The results show that our proposed stochastic-based SNN achieves nearly the same accuracy as the original DNN on the MNIST dataset and outperforms state-of-the-art SNNs. The stochastic-based SNN is also energy efficient: the fully connected DNN, the conventional SNN, and the data-normalized SNN consume 38.24, 1.83, and 1.85 times more energy than the stochastic-based SNN, respectively. For sparse datasets, including IMDb and in-house clinical datasets, the stochastic-based SNN achieves performance comparable to that of the conventional DNN, whereas the conventional SNN suffers a significant decline in classification accuracy.
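To make the encoding contrast concrete, here is a minimal Python sketch (not the authors' code; the function names, timestep count, threshold, and normalization choices are all illustrative assumptions). Conventional rate coding assigns each feature its own Bernoulli spike train with firing probability proportional to that feature's intensity, so an N-dimensional input needs N input lines. The stochastic encoding described in the abstract instead normalizes the whole feature vector into a probability distribution and samples one feature index per timestep; each index can be carried on roughly log2(N) wires, which is the intuition behind the O(N) to O(log N) connectivity claim, and zero-valued features never generate events at all.

```python
import numpy as np

def rate_encode(features, timesteps, rng):
    """Conventional rate coding: one Bernoulli spike train per feature,
    with firing probability proportional to the feature's intensity."""
    p = features / features.max()                      # scale intensities into [0, 1]
    return rng.random((timesteps, features.size)) < p  # (T, N) binary spike matrix

def stochastic_encode(features, timesteps, rng):
    """Distribution-based encoding sketched in the abstract: treat the
    normalized feature vector as a probability distribution and draw one
    feature index per timestep."""
    p = features / features.sum()                      # normalize to a distribution
    return rng.choice(features.size, size=timesteps, p=p)  # (T,) sampled indices

def integrate_and_fire(spike_indices, weights, threshold=1.0):
    """Minimal Integrate-and-Fire neuron: accumulate the synaptic weight of
    each incoming event and spike (then reset by subtraction) whenever the
    membrane potential crosses the threshold."""
    v, out_spikes = 0.0, 0
    for idx in spike_indices:
        v += weights[idx]
        if v >= threshold:
            out_spikes += 1
            v -= threshold
    return out_spikes

rng = np.random.default_rng(0)
x = np.array([0.0, 0.0, 3.0, 0.0, 1.0, 0.0, 0.0, 0.0])  # sparse 8-feature input

dense = rate_encode(x, timesteps=100, rng=rng)           # 100 x 8 spike matrix
sparse = stochastic_encode(x, timesteps=100, rng=rng)    # 100 sampled indices

print(dense.shape)                                       # (100, 8)
print(np.bincount(sparse, minlength=x.size))             # zero features never fire
print(integrate_and_fire(sparse, weights=rng.random(x.size)))
```

Running this on the sparse vector above, the rate-coded matrix is mostly silent rows that still occupy all N channels, while the stochastic train touches only the two nonzero features; a downstream Integrate-and-Fire layer therefore sees sparse inputs such as bag-of-words NLP vectors at no extra cost.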

Research Organization:
Oak Ridge National Laboratory (ORNL), Oak Ridge, TN (United States)
Sponsoring Organization:
USDOE
DOE Contract Number:
AC05-00OR22725
OSTI ID:
1422999
Resource Relation:
Conference: IEEE Big Data 2017, Boston, Massachusetts, United States of America, December 11-14, 2017
Country of Publication:
United States
Language:
English

Similar Records

Bayesian sparse learning with preconditioned stochastic gradient MCMC and its applications
Journal Article · February 2021 · Journal of Computational Physics

Neurogenesis Deep Learning: Extending deep networks to accommodate new classes
Technical Report · December 2016

Final Scientific/Technical Report
Technical Report · January 2024
