
Joint Task Offloading and Resource Allocation for Quality-Aware Edge-Assisted Machine Learning Task Inference


Abstract:

Edge computing is essential for enhancing delay-sensitive and computation-intensive machine learning (ML) task inference services. The quality of inference results, which mainly depends on the task data and the ML models, is an important indicator of system performance. In this paper, we consider a quality-aware edge-assisted ML task inference scenario and propose a resource management scheme to minimize the total task processing delay while guaranteeing the stability of all the task queues and the inference accuracy requirements of all the tasks. In our scheme, the task offloading, task data adjustment, computing resource allocation, and wireless channel allocation are jointly optimized. The Lyapunov optimization technique is adopted to transform the original optimization problem into a deterministic problem for each time slot. Considering the high complexity of the optimization problem, we design an algorithm that decomposes the problem into a task offloading and channel allocation (TOCA) sub-problem, a task data adjustment sub-problem, and a computing resource allocation sub-problem, and then solves them iteratively. A low-complexity heuristic algorithm is also designed to solve the TOCA sub-problem efficiently. Extensive simulations are conducted by varying different crucial parameters. The results demonstrate the superiority of our scheme in comparison with four other schemes.
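The abstract's use of Lyapunov optimization to convert a long-term stochastic problem into per-slot deterministic decisions follows the standard drift-plus-penalty pattern. The sketch below illustrates that pattern only; the queue update, the penalty term (delay), the candidate decisions, and the trade-off weight V are illustrative assumptions, not taken from the paper.

```python
# Minimal drift-plus-penalty sketch (illustrative; not the paper's algorithm).

def drift_plus_penalty(queue, delay, arrival, service, V=10.0):
    """Per-slot objective: V * penalty + queue-weighted backlog change.

    V trades off the delay penalty against queue stability;
    larger V favors lower delay at the cost of longer queues.
    """
    return V * delay + queue * (arrival - service)

def update_queue(queue, arrival, service):
    """Standard task-queue dynamics: Q(t+1) = max(Q(t) - service, 0) + arrival-style update."""
    return max(queue - service + arrival, 0.0)

def choose_decision(queue, arrival, candidates, V=10.0):
    """Greedily pick, each slot, the feasible decision minimizing drift-plus-penalty.

    candidates: list of (delay, service_rate) pairs, one per feasible
    offloading/allocation choice (hypothetical encoding).
    """
    return min(candidates,
               key=lambda c: drift_plus_penalty(queue, c[0], arrival, c[1], V))
```

Under this pattern, a large backlog `queue` makes high-service candidates attractive even if their delay is higher, which is how queue stability is enforced without solving the long-term problem directly.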
Published in: IEEE Transactions on Vehicular Technology ( Volume: 72, Issue: 5, May 2023)
Page(s): 6739 - 6752
Date of Publication: 10 January 2023


