
Enabling High-Efficient ReRAM-Based CNN Training Via Exploiting Crossbar-Level Insignificant Writing Elimination


Abstract:

Convolutional neural networks (CNNs) have been widely adopted in many deep learning applications. However, training a deep CNN requires intensive data transfer, which is both time and energy consuming. Using resistive random-access memory (ReRAM) to process data locally in memory is an emerging solution to eliminate this massive data movement. However, current ReRAM-based processing-in-memory (PIM) accelerators cannot efficiently support training because of frequent, high-cost ReRAM write operations, which degrade delay, energy, and ReRAM lifetime. In this paper, we observe that activation-induced and weight-update-induced write operations dominate the training energy on ReRAM-based accelerators. We then exploit a new angle on intermediate-data (e.g., activation and error) sparsity that fits the unique computation pattern of ReRAM crossbars to effectively eliminate insignificant ReRAM writes, thus enabling highly efficient CNN training without hurting training accuracy. The experimental results show that our proposed scheme achieves on average 4.97× (19.23×) energy savings and 1.38× (30.08×) speedup compared to the state-of-the-art ReRAM-based accelerator (GPU). Our scheme also achieves a 4.55× lifetime enhancement compared to the state-of-the-art ReRAM accelerator.
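
The abstract does not spell out the exact elimination criterion, so the sketch below is only a hypothetical illustration of the general idea: intermediate data (e.g., ReLU activations or errors) destined for ReRAM is partitioned into crossbar-sized tiles, and a write is issued only for tiles that contain non-zero (significant) values. The tile sizes, function names, and the write_crossbar callback are assumptions for illustration, not the paper's actual interface.

```python
# Hypothetical sketch of crossbar-level insignificant-write elimination:
# skip programming a crossbar when the data tile mapped to it is all zero.
import numpy as np

CROSSBAR_ROWS = 128  # assumed crossbar height
CROSSBAR_COLS = 128  # assumed crossbar width

def write_with_elimination(data, write_crossbar):
    """Partition `data` into crossbar-sized tiles and issue a write only
    for tiles that contain at least one non-zero element."""
    writes_issued = 0
    writes_skipped = 0
    rows, cols = data.shape
    for r in range(0, rows, CROSSBAR_ROWS):
        for c in range(0, cols, CROSSBAR_COLS):
            tile = data[r:r + CROSSBAR_ROWS, c:c + CROSSBAR_COLS]
            if np.any(tile):  # tile carries significant data: program the crossbar
                write_crossbar(r // CROSSBAR_ROWS, c // CROSSBAR_COLS, tile)
                writes_issued += 1
            else:             # all-zero tile: eliminate the write
                writes_skipped += 1
    return writes_issued, writes_skipped

if __name__ == "__main__":
    # ReLU activations are typically sparse, so many tiles end up all zero.
    activations = np.maximum(np.random.randn(512, 512), 0)
    activations[activations < 1.0] = 0  # exaggerate sparsity for the demo
    issued, skipped = write_with_elimination(
        activations, lambda i, j, t: None)  # stand-in for a real crossbar write
    print(f"issued {issued} crossbar writes, eliminated {skipped}")
```

In a real accelerator the skipped tiles translate directly into saved write energy and latency and into fewer ReRAM cell program cycles, which is the lifetime benefit the abstract reports.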
Published in: IEEE Transactions on Computers ( Volume: 72, Issue: 11, November 2023)
Page(s): 3218 - 3230
Date of Publication: 29 June 2023

