MERSIT: A Hardware-Efficient 8-bit Data Format with Enhanced Post-Training Quantization DNN Accuracy
Abstract
Publisher
Association for Computing Machinery, New York, NY, United States
Qualifiers
- Research-article
Funding Sources
- National Research Foundation of Korea (NRF)
- Institute of Information and Communications Technology Planning and Evaluation (IITP)