OSTI.GOV — U.S. Department of Energy
Office of Scientific and Technical Information

Title: Evolving Energy Efficient Convolutional Neural Networks

Resource Type:
Conference

As deep neural networks have been deployed in a growing number of applications over the past half decade and find their way into ever more operational systems, their energy consumption has become a concern, whether they run in the datacenter or on edge devices. Hyperparameter optimization and automated network design for deep learning is a rapidly growing field, but much of its focus has remained on optimizing only for the performance of the machine learning task. In this work, we demonstrate that the best-performing networks created through this automated network design process have radically different computational characteristics (e.g., energy usage, model size, inference time), presenting the opportunity to use the optimization process to make deep learning networks more energy efficient and deployable to smaller devices. Optimizing for these computational characteristics is critical as the number of applications of deep learning continues to expand.
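The abstract's idea — evolving network hyperparameters under a fitness that accounts for computational cost as well as task performance — can be sketched in miniature. The search space, the parameter-count proxy for energy/model size, and the scalarized fitness below are illustrative assumptions, not the paper's actual method; a real system would train each candidate network and measure energy directly.

```python
import random

# Hypothetical search space for a small CNN: each genome fixes the
# number of conv layers, filters per layer, and kernel size.
SPACE = {
    "layers": [2, 3, 4, 5],
    "filters": [16, 32, 64, 128],
    "kernel": [3, 5, 7],
}

def param_count(genome, in_channels=3):
    """Approximate conv parameter count, a stand-in for model size/energy."""
    total, channels = 0, in_channels
    for _ in range(genome["layers"]):
        # weights (k*k*in*out) plus one bias per output filter
        total += genome["kernel"] ** 2 * channels * genome["filters"] + genome["filters"]
        channels = genome["filters"]
    return total

def fitness(genome, accuracy_fn, size_weight=1e-6):
    """Multi-objective scalarization: task score minus a size penalty."""
    return accuracy_fn(genome) - size_weight * param_count(genome)

def mutate(genome):
    """Resample one hyperparameter at random."""
    child = dict(genome)
    key = random.choice(list(SPACE))
    child[key] = random.choice(SPACE[key])
    return child

def evolve(accuracy_fn, pop_size=20, generations=30, seed=0):
    """Truncation-selection evolutionary search over the space above."""
    random.seed(seed)
    pop = [{k: random.choice(v) for k, v in SPACE.items()} for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda g: fitness(g, accuracy_fn), reverse=True)
        survivors = pop[: pop_size // 2]          # keep the top half
        pop = survivors + [mutate(random.choice(survivors)) for _ in survivors]
    return max(pop, key=lambda g: fitness(g, accuracy_fn))

# Toy accuracy proxy (pure assumption): deeper/wider helps, with diminishing returns.
toy_accuracy = lambda g: 1 - 1 / (g["layers"] * g["filters"] ** 0.5)

best = evolve(toy_accuracy)
print(best, param_count(best))
```

Raising `size_weight` tilts the search toward smaller, cheaper networks at some cost in the task score — the accuracy/efficiency trade-off the abstract highlights.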

Research Organization:
Oak Ridge National Laboratory (ORNL), Oak Ridge, TN (United States)
Sponsoring Organization:
USDOE Office of Science (SC), Advanced Scientific Computing Research (ASCR)
DOE Contract Number:
AC05-00OR22725
OSTI ID:
1606807
Resource Relation:
Conference: 2nd Workshop on Energy-Efficient Machine Learning and Big Data Analytics (in conjunction with IEEE Big Data), Los Angeles, California, United States of America, November 9–12, 2019
Country of Publication:
United States
Language:
English

References (17)

Auto-Keras: An Efficient Neural Architecture Search System conference July 2019
TrueNorth: Design and Tool Flow of a 65 mW 1 Million Neuron Programmable Neurosynaptic Chip journal October 2015
Evolving Deep Networks Using HPC conference January 2017
Progressive Neural Architecture Search book January 2018
Data-free Parameter Pruning for Deep Neural Networks conference January 2015
Energy Efficiency Enhancement for CNN-based Deep Mobile Sensing journal June 2019
Deep Residual Learning for Image Recognition conference June 2016
Mastering the game of Go with deep neural networks and tree search journal January 2016
Learning Transferable Architectures for Scalable Image Recognition conference June 2018
Regularized Evolution for Image Classifier Architecture Search journal July 2019
In-Datacenter Performance Analysis of a Tensor Processing Unit conference January 2017
ImageNet Large Scale Visual Recognition Challenge journal April 2015
On Global Electricity Usage of Communication Technology: Trends to 2030 journal April 2015
Learning Separable Filters conference June 2013
Optimizing deep learning hyper-parameters through an evolutionary algorithm conference January 2015
  • Young, Steven R.; Rose, Derek C.; Karnowski, Thomas P.
  • Proceedings of the Workshop on Machine Learning in High-Performance Computing Environments - MLHPC '15. https://doi.org/10.1145/2834892.2834896
Benchmarking Keyword Spotting Efficiency on Neuromorphic Hardware conference January 2019
CP-decomposition with Tensor Power Method for Convolutional Neural Networks compression conference February 2017

Similar Records

Digital Modeling on Large Kernel Metamaterial Neural Network
Journal Article · November 2023 · Journal of Imaging Science and Technology

IL-Net: Using Expert Knowledge to Guide the Design of Furcated Neural Networks
Conference · December 2018

Convolutional Neural Networks for the CHIPS Neutrino Detector R&D Project.
Thesis/Dissertation · June 2021

Related Subjects