
A 28-nm 36 Kb SRAM CIM Engine With 0.173 μm² 4T1T Cell and Self-Load-0 Weight Update for AI Inference and Training Applications



Abstract:

Computing-in-memory (CIM) promises high energy efficiency (EE) and performance in accelerating the feed-forward (FF) and back-propagation (BP) processes of deep neural networks (DNNs) with less data movement and high parallelism. However, efficient CIM implementation is still challenged by large memory cells, network mapping, and IR-drop variation. In this work, a 28-nm 36 Kb static random-access memory (SRAM) CIM engine with a nondestructive-read (NDR) cell and weight update energy saving is presented for multiply-accumulate (MAC) acceleration in artificial intelligence (AI) inference and training applications. A 4T1T SRAM bit-cell with NDR is proposed, achieving the smallest reported cell size of 0.173 μm². The power-ON self-load-0 feature of the 4T1T cell saves weight update energy and latency when writing 0. The shared-path dual-mode read (SPDMR) supports both FF and BP paths with little additional circuit overhead. The bit-interleaving weight mapping (BIWM) speeds up the BP path without slowing down the FF path. IR-drop-aware adaptive clampers (IRDAA-Cs) with hierarchical read word-lines (RWLs) and read bit-lines (RBLs) apply voltages to near and far cells as accurately as possible. The engine achieves an EE of 263.1/412.1 TOPS/W and an area efficiency (AE) of 2.5/4.9 TOPS/mm² for the FF/BP process at 1-bit weight/activation, with a 74.4%–78.3% reduction in weight update energy.
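
For context, a minimal sketch of the two MAC workloads the abstract refers to (using generic DNN notation rather than the paper's circuits, and illustrative shapes and values): the FF pass accumulates along one array dimension (W·x), while the BP pass accumulates along the transposed dimension (Wᵀ·δ), which is why a dual-mode read path and bit-interleaved weight mapping target the BP direction.

    # Illustrative sketch, not from the paper: the FF/BP MAC operations a CIM macro accelerates.
    # 1-bit weights/activations are assumed, matching the abstract's reported operating point.
    import numpy as np

    rng = np.random.default_rng(0)
    W = rng.integers(0, 2, size=(64, 64))    # 1-bit weights stored in the SRAM array
    x = rng.integers(0, 2, size=64)          # 1-bit input activations
    delta = rng.integers(0, 2, size=64)      # 1-bit error terms from the next layer

    ff_macs = W @ x        # FF: accumulate along one array direction
    bp_macs = W.T @ delta  # BP: accumulate along the transposed direction

    print(ff_macs[:4], bp_macs[:4])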
Published in: IEEE Journal of Solid-State Circuits ( Volume: 59, Issue: 10, October 2024)
Page(s): 3277 - 3289
Date of Publication: 21 May 2024




