An Edge 3D CNN Accelerator for Low-Power Activity Recognition



Abstract:

3D convolutional neural networks (CNNs) are increasingly popular for video-based action and activity analysis. Whereas 2D convolutions share filters across the 2D spatial domain, 3D convolutions further reuse filters along the temporal dimension to capture temporal features in video. How the data locality in the temporal dimension is exploited directly impacts the energy efficiency of specialized architectures for 3D-CNN inference. Prior specialized 3D-CNN accelerators employ additional on-chip memories and multicluster architectures to reuse data among the processing-element (PE) arrays, which is costly for low-power chip implementations. Instead of harvesting in-memory data locality, we propose the systolic-cube architecture to exploit the spatial and temporal localities in 3D CNNs, moving reusable data between PEs connected via a 3D-cube network-on-chip. Furthermore, because visual features reappear in the temporal domain, a considerable portion of pixels and activations is repeated among feature maps captured at adjacent time slots. To eliminate this temporal redundancy, the proposed accelerator architecture is equipped with a redundancy detection and elimination mechanism that skips computations whose activations and parameters are unchanged when convolutional filters are reused along the temporal dimension. Experimental results show that the systolic-cube architecture yields a considerable energy-efficiency improvement on state-of-the-art activity-recognition benchmarks and datasets.
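To make the skip-on-repeat idea concrete, the following is a minimal NumPy sketch, not the paper's hardware design: the function name, the single-channel simplification, and the whole-frame repeat test are illustrative assumptions. It computes a 3D convolution and reuses a cached 2D partial sum whenever the frame a weight slice would be applied to matches the frame it was applied to in the previous temporal window.

    import numpy as np

    def conv3d_temporal_reuse(frames, w, atol=0.0):
        """Single-channel 3D convolution (valid padding, stride 1) that
        reuses partial sums along the temporal dimension.

        When the filter window slides one step in time, weight slice
        w[dt] is applied to frame t + dt; in the previous window the
        same slice was applied to frame t + dt - 1. If those frames
        match within `atol`, the cached 2D partial sum is unchanged
        and its multiply-accumulates can be skipped.
        """
        T, H, W = frames.shape
        Kt, Kh, Kw = w.shape
        Ho, Wo = H - Kh + 1, W - Kw + 1
        out = np.zeros((T - Kt + 1, Ho, Wo))

        partial = [None] * Kt   # cached 2D partial sum per weight slice
        skipped = computed = 0

        for t in range(T - Kt + 1):
            for dt in range(Kt):
                f = t + dt
                if t > 0 and np.allclose(frames[f], frames[f - 1],
                                         rtol=0.0, atol=atol):
                    skipped += 1      # repeated frame: reuse cached sum
                else:
                    computed += 1     # recompute the 2D partial sum
                    p = np.zeros((Ho, Wo))
                    for dy in range(Kh):
                        for dx in range(Kw):
                            p += w[dt, dy, dx] * frames[f, dy:dy + Ho,
                                                        dx:dx + Wo]
                    partial[dt] = p
                out[t] += partial[dt]
        return out, skipped, computed

Feeding a clip in which consecutive frames repeat (e.g., a static background) yields skipped > 0 while the output matches a naive 3D convolution exactly when atol = 0. Note two simplifications relative to the mechanism the abstract describes: the accelerator detects redundancy at the granularity of individual activations inside the PEs, whereas this sketch reuses whole per-frame partial sums; and with atol > 0 the reuse is approximate, so error can accumulate across consecutive skips.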
Page(s): 918 - 930
Date of Publication: 21 July 2020


