
MoEI: Mobility-Aware Edge Inference Based on Model Partition and Service Migration

Abstract:

Deep neural networks are the cornerstone of many mobile intelligent systems, and their inference gives rise to computation-intensive tasks. Device-edge cooperative inference in mobile edge computing provides a fine-grained way to offload the burden of inference computation. However, the geographical dispersion of resources and the mobility patterns of devices raise scheduling issues that must be addressed. In this paper, we propose a task scheduling framework for such device-edge systems to reduce the pipeline time of model inference. First, we design a resource provisioning strategy with a pre-fetching service migration setting for environments with multiple mobile devices and edge nodes. Then, we leverage game theory to analyze the properties of the decision-making process and propose an offline algorithm under complete information. Next, we propose an algorithm based on proximal policy optimization that enables mobile devices to make decisions in a distributed online manner. Further, we incorporate a memory mechanism into the online algorithm to improve the decision makers' understanding of the system environment. Experiments demonstrate the effectiveness of both algorithms: the average pipeline time of the proposed online algorithm is only 61.44% of that of local processing, and 1.196 times that of the proposed offline algorithm.
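Taken together, the two reported figures imply that the offline algorithm's average pipeline time is roughly 0.6144 / 1.196 ≈ 51.4% of local processing.

To make the model-partition idea behind device-edge cooperative inference concrete, the following is a minimal, illustrative sketch, not the paper's actual MoEI formulation: it exhaustively searches for the layer at which to split a DNN so that the estimated pipeline time (device compute + activation upload + edge compute) is minimized. The function name best_split and all latency numbers are hypothetical placeholders; mobility, service migration, multi-device contention, and the game-theoretic/PPO decision making described in the abstract are deliberately omitted.

# Hedged sketch: choosing a DNN partition point for device-edge
# cooperative inference. Layers [0, k) run on the mobile device,
# layers [k, n) on the edge node; the activation at the split is
# uploaded in between. k = 0 is full offloading, k = n is fully
# local inference. All numbers below are made up for illustration.

def best_split(device_ms, edge_ms, upload_ms):
    """Return (k, t): the split index minimizing estimated pipeline time."""
    n = len(device_ms)
    assert len(edge_ms) == n and len(upload_ms) == n + 1
    best_k, best_t = 0, float("inf")
    for k in range(n + 1):
        # Device computes the first k layers, ships the activation,
        # and the edge node finishes the remaining n - k layers.
        t = sum(device_ms[:k]) + upload_ms[k] + sum(edge_ms[k:])
        if t < best_t:
            best_k, best_t = k, t
    return best_k, best_t

# Hypothetical per-layer latencies (ms) for a 5-layer model:
device = [12.0, 10.0, 8.0, 15.0, 9.0]   # slower mobile processor
edge   = [3.0, 2.5, 2.0, 4.0, 2.2]      # faster edge server
# upload[k] = cost of shipping the activation at split point k;
# the raw input is largest, deeper activations shrink, k = n sends nothing.
upload = [20.0, 6.0, 5.0, 4.0, 3.0, 0.0]

k, t = best_split(device, edge, upload)
print(f"best split k={k}: estimated pipeline time {t:.1f} ms")

On this toy instance the search selects k = 1 (only the first layer runs on the device), giving 28.7 ms versus 54.0 ms fully local, which illustrates why a well-chosen partition point can cut pipeline time substantially even before service migration and online adaptation are considered.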
Published in: IEEE Transactions on Mobile Computing (Volume: 23, Issue: 10, October 2024)
Page(s): 9437-9450
Date of Publication: 15 February 2024

Publisher: IEEE
