Abstract:
In mobile edge computing (MEC) systems, unmanned aerial vehicles (UAVs) enable edge service providers (ESPs) to offer flexible resource provisioning with broader communication coverage, thereby improving the Quality of Service (QoS). However, dynamic system states and varied traffic patterns seriously hinder efficient cooperation among UAVs. Existing solutions commonly rely on prior system knowledge or complex neural network models, lacking adaptability and incurring excessive overheads. To address these critical challenges, we propose DisOff, a novel profit-aware cooperative offloading framework for UAV-enabled MEC based on lightweight deep reinforcement learning (DRL). First, we design an improved DRL algorithm with twin critic networks and a delayed update mechanism, which mitigates Q-value overestimation and high variance and thus approximates the optimal UAV cooperative offloading and resource allocation. Next, we develop a new multi-teacher distillation mechanism for the proposed DRL model, where the policies of multiple UAVs are integrated into one DRL agent, compressing the model size while maintaining superior performance. Using real-world user-traffic datasets, we conduct extensive experiments to validate the effectiveness of DisOff. Compared to benchmark methods, DisOff enhances ESP profits while reducing the DRL model size and training costs.
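The two core mechanisms named in the abstract can be sketched in miniature. The snippet below is an illustrative toy, not the paper's actual model: it shows (a) a clipped double-Q target that takes the minimum over twin critics with a delayed policy-update schedule, the standard remedy for Q-value overestimation and high variance, and (b) a multi-teacher distillation target formed by averaging teacher policy distributions and measuring the student's KL divergence from it. All names, dimensions, and the linear critics are assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- (a) Twin critics with delayed updates (TD3-style, illustrative) ---
dim_s, dim_a = 4, 2

def q_value(w, s, a):
    # Toy linear critic over concatenated (state, action) features.
    return float(w @ np.concatenate([s, a]))

w1 = rng.normal(size=dim_s + dim_a)   # critic 1
w2 = rng.normal(size=dim_s + dim_a)   # critic 2 (the "twin")

gamma, reward = 0.99, 1.0
s_next = rng.normal(size=dim_s)
a_next = rng.normal(size=dim_a)       # target-policy action (noise clipping omitted)

# Clipped double-Q target: the min over twin critics counters the
# overestimation bias that a single critic accumulates.
q1 = q_value(w1, s_next, a_next)
q2 = q_value(w2, s_next, a_next)
target = reward + gamma * min(q1, q2)

# Delayed updates: the actor is refreshed only every `policy_delay`
# critic steps, which lowers variance from a fast-moving critic.
policy_delay = 2
actor_updates = sum(1 for step in range(10) if step % policy_delay == 0)

# --- (b) Multi-teacher distillation target (illustrative) ---
def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

n_actions, n_teachers = 3, 4                 # e.g., one teacher per UAV
teacher_logits = [rng.normal(size=n_actions) for _ in range(n_teachers)]

# Soft target = average of the teachers' action distributions; the
# student agent is trained to minimize its KL divergence from it.
soft_target = np.mean([softmax(l) for l in teacher_logits], axis=0)
student_probs = softmax(rng.normal(size=n_actions))
kl = float(np.sum(soft_target * (np.log(soft_target) - np.log(student_probs))))
```

In a full training loop, part (a) would drive the critic regression loss toward `target`, and part (b) would replace the per-UAV policies with one compact student trained against `soft_target`, which is how a distilled agent can keep performance while shrinking the model.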
Published in: IEEE Internet of Things Journal ( Volume: 11, Issue: 12, 15 June 2024)