
CAP: Communication-Aware Automated Parallelization for Deep Learning Inference on CMP Architectures


Abstract:

Real-time inference of deep learning models on embedded and energy-efficient devices has become increasingly desirable with the rapid growth of artificial intelligence on the edge. In particular, to achieve high energy efficiency and scalability, efficient parallelization of single-pass deep neural network (DNN) inference on chip multiprocessor (CMP) architectures is urgently required by many time-sensitive applications. However, as the number of processing cores scales up and the performance of individual cores grows much faster than on-chip communication, inter-core data movement is prone to become a performance bottleneck for computation. To remedy this problem and further improve the performance of network inference, in this work we introduce CAP, a communication-aware DNN parallelization technique that exploits the elasticity and noise-tolerance of deep learning algorithms on CMPs. Moreover, in the hope that the conducted studies can provide new design insights for real-time neural network inference on embedded chips, we have also evaluated the proposed approach on both multi-core Neural Network Accelerator (NNA) chips and general-purpose chip multiprocessors. Our experimental results show that, compared to baseline approaches, CAP achieves 1.12x-1.65x system speedups and 1.14x-2.70x higher energy efficiency for different neural networks while maintaining inference accuracy.
Published in: IEEE Transactions on Computers (Volume: 71, Issue: 7, July 2022)
Page(s): 1626 - 1639
Date of Publication: 26 July 2021

