
Accelerating DNN Inference With Reliability Guarantee in Vehicular Edge Computing


Abstract:

This paper explores accelerating Deep Neural Network (DNN) inference with a reliability guarantee in Vehicular Edge Computing (VEC) by considering the synergistic impacts of vehicle mobility and Vehicle-to-Vehicle/Infrastructure (V2V/V2I) communications. First, we show the necessity of striking a balance between DNN inference acceleration and reliability in VEC, and give insights into the design rationale by analyzing the features of overlapped DNN partitioning and mobility-aware task offloading. Second, we formulate the Cooperative Partitioning and Offloading (CPO) problem by presenting a cooperative DNN partitioning and offloading scenario, followed by deriving an offloading reliability model and a DNN inference delay model. The CPO problem is proved to be NP-hard. Third, we propose two approximation algorithms, i.e., the Submodular Approximation Allocation Algorithm (SA3) and the Feed Me the Rest algorithm (FMtR). In particular, SA3 determines the edge allocation in a centralized way and achieves a 1/3-optimal approximation for maximizing inference reliability. On this basis, FMtR partitions the DNN models and offloads the tasks to the allocated edge nodes in a distributed way, achieving a 1/2-optimal approximation for maximizing inference reliability. Finally, we build the simulation model and present a comprehensive performance evaluation, which demonstrates the superiority of the proposed solutions.
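As a rough illustration of the kind of allocation step the abstract describes, the sketch below shows a generic greedy heuristic that assigns inference tasks to edge nodes by repeatedly selecting the (task, edge) pair with the largest marginal gain in a monotone submodular reliability objective, subject to per-node capacity limits. The function names, the capacity model, and the reliability oracle are assumptions introduced for illustration only; this is not the paper's SA3 or FMtR, just a minimal sketch of the greedy submodular-maximization style the abstract alludes to.

```python
# Illustrative greedy sketch for submodular task-to-edge allocation.
# Assumptions: a caller-supplied monotone submodular reliability(...) oracle
# over sets of (task, edge) pairs, integer per-edge capacities, and at most
# one edge per task. This is NOT the paper's SA3 algorithm.

from typing import Callable, Dict, FrozenSet, List, Optional, Set, Tuple

def greedy_allocate(
    tasks: List[str],
    edges: List[str],
    capacity: Dict[str, int],
    reliability: Callable[[FrozenSet[Tuple[str, str]]], float],
) -> Set[Tuple[str, str]]:
    """Greedily add (task, edge) pairs with the largest marginal reliability
    gain, respecting per-edge capacities and assigning each task at most once."""
    chosen: Set[Tuple[str, str]] = set()
    assigned: Set[str] = set()
    load: Dict[str, int] = {e: 0 for e in edges}

    while True:
        base = reliability(frozenset(chosen))
        best: Optional[Tuple[str, str]] = None
        best_gain = 0.0
        # Scan all feasible (task, edge) pairs and keep the best marginal gain.
        for t in tasks:
            if t in assigned:
                continue
            for e in edges:
                if load[e] >= capacity[e]:
                    continue
                gain = reliability(frozenset(chosen | {(t, e)})) - base
                if gain > best_gain:
                    best, best_gain = (t, e), gain
        if best is None:
            return chosen  # no feasible pair improves the objective
        chosen.add(best)
        assigned.add(best[0])
        load[best[1]] += 1
```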
Published in: IEEE/ACM Transactions on Networking ( Volume: 31, Issue: 6, December 2023)
Page(s): 3238 - 3253
Date of Publication: 01 June 2023
