Abstract:
With the rapid development of vehicle-to-everything communication technologies, a wide range of compute-intensive in-vehicle applications has emerged. Vehicle edge computing (VEC) leverages the computational resources available at edge nodes to alleviate the strain on public network transmission and reduce task processing latency. However, the dynamic nature of the vehicular environment, the challenge of incentivizing vehicles to share idle resources, and the uncertainty about the amount of resources vehicles are willing to share present significant obstacles to designing task offloading and resource allocation methods for VEC systems. In this paper, we propose a hybrid offloading model in which task vehicles can offload tasks to roadside units (RSUs) or to other vehicles that share their resources. To maximize the benefits of the task vehicles, the RSUs, and the shared resource vehicles, we first introduce an adaptive type selection algorithm (ALTS) for shared resource vehicles based on multi-armed bandit (MAB) theory. We then model the three-party interaction as a multi-stage Stackelberg game built around a computational resource lease contract. Experimental results demonstrate the superiority of the proposed ALTS algorithm over existing learning algorithms and confirm the effectiveness of the lease contract and the three-party transaction mechanism. Comparative experiments further show that combining RSUs with idle vehicle resources provides better service than mechanisms relying solely on edge servers or on shared resource vehicles.
Published in: IEEE Transactions on Intelligent Transportation Systems (Volume: 25, Issue: 8, August 2024)
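
The abstract does not specify how the MAB-based type selection in ALTS works internally. As a rough illustration only, the sketch below uses a standard UCB1 bandit in which a shared resource vehicle repeatedly chooses a contract "type" to declare and observes a utility as its reward; the type set, reward model, and all names here are hypothetical and are not the authors' ALTS algorithm.

import math
import random

# Illustrative UCB1 bandit for a shared resource vehicle choosing which
# contract "type" to declare each round. Hypothetical sketch: the reward
# stands in for the utility earned under an assumed lease contract.

class UCB1TypeSelector:
    def __init__(self, num_types: int):
        self.num_types = num_types
        self.counts = [0] * num_types      # times each type was selected
        self.rewards = [0.0] * num_types   # cumulative observed reward per type
        self.t = 0                         # total rounds played

    def select_type(self) -> int:
        self.t += 1
        # Play each type once before applying the UCB rule.
        for k in range(self.num_types):
            if self.counts[k] == 0:
                return k
        # UCB1: empirical mean plus an exploration bonus.
        def ucb(k: int) -> float:
            mean = self.rewards[k] / self.counts[k]
            bonus = math.sqrt(2.0 * math.log(self.t) / self.counts[k])
            return mean + bonus
        return max(range(self.num_types), key=ucb)

    def update(self, chosen_type: int, reward: float) -> None:
        self.counts[chosen_type] += 1
        self.rewards[chosen_type] += reward

# Toy usage with made-up per-type utilities.
if __name__ == "__main__":
    true_means = [0.3, 0.55, 0.7]          # hypothetical mean utility per type
    selector = UCB1TypeSelector(num_types=len(true_means))
    for _ in range(2000):
        k = selector.select_type()
        reward = random.gauss(true_means[k], 0.1)
        selector.update(k, reward)
    print("selection counts per type:", selector.counts)

Over many rounds such a selector concentrates its choices on the type with the highest observed utility while still exploring occasionally; the actual ALTS algorithm and the Stackelberg game pricing stage are described in the full paper.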