
DACO: Pursuing Ultra-low Power Consumption via DNN-Adaptive CPU-GPU CO-optimization on Mobile Devices


Abstract:

As Deep Neural Networks (DNNs) become popular in mobile systems, their high computational and memory demands make them major power consumers, especially on power-constrained devices. In this paper, we propose DACO, a DNN-Adaptive CPU-GPU CO-optimization technique, to reduce the power consumption of DNN inference. First, a resource-oriented classifier is proposed to quantify the computation/memory intensity of DNN models and classify them accordingly. Second, a set of rule-based policies is derived to select the best-suited CPU-GPU system configuration in a coarse-grained manner. Building on these rules, a coarse-to-fine CPU-GPU auto-tuning approach is proposed to reach Pareto-optimal speed and power consumption in DNN inference. Experimental results demonstrate that, compared with the existing approach, DACO reduces power consumption by up to 71.9% while maintaining excellent DNN inference speed.
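The pipeline the abstract describes (classify a model by its resource intensity, apply coarse rule-based policies, then fine-tune toward a Pareto-optimal point) can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the intensity threshold, frequency values, and the `measure` callback are all hypothetical stand-ins (on a real device, `measure` would profile latency and power at a given CPU-GPU frequency pair).

```python
def classify(flops, mem_bytes, threshold=10.0):
    """Resource-oriented classifier: label a DNN by arithmetic intensity
    (FLOPs per byte moved), a common compute- vs. memory-bound proxy.
    The threshold of 10.0 is an illustrative assumption."""
    intensity = flops / mem_bytes
    return "compute-intensive" if intensity >= threshold else "memory-intensive"

# Coarse rule-based policies: map each class to candidate regions of the
# CPU-GPU configuration space. Frequencies (CPU MHz, GPU MHz) are
# illustrative, not taken from the paper.
COARSE_POLICIES = {
    "compute-intensive": [(1800, 585), (2000, 676)],  # favor higher GPU freq
    "memory-intensive":  [(1200, 420), (1400, 500)],  # lower freqs suffice
}

def fine_tune(candidates, measure):
    """Fine-grained stage: among the coarse candidates, keep the points
    not dominated in (latency, power) and return the lowest-power one.
    `measure(cfg)` must return (latency_ms, power_mw)."""
    results = [(cfg, *measure(cfg)) for cfg in candidates]
    pareto = [
        (cfg, lat, pw) for cfg, lat, pw in results
        if not any(l2 <= lat and p2 <= pw and (l2, p2) != (lat, pw)
                   for _, l2, p2 in results)
    ]
    return min(pareto, key=lambda r: r[2])  # lowest-power Pareto point

# Toy usage: a fake measurement model where higher total frequency means
# lower latency but higher power (a real system would profile on-device).
def fake_measure(cfg):
    cpu_mhz, gpu_mhz = cfg
    total = cpu_mhz + gpu_mhz
    return (3e6 / total, total * 0.5)

label = classify(flops=1e9, mem_bytes=1e7)          # intensity 100
best_cfg, latency, power = fine_tune(COARSE_POLICIES[label], fake_measure)
```

The coarse stage prunes the configuration space cheaply with rules, so the fine stage only has to measure a handful of candidates; the Pareto filter then guarantees no selected point is strictly worse in both speed and power.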
Date of Conference: 25-27 March 2024
Date Added to IEEE Xplore: 10 June 2024
Conference Location: Valencia, Spain

