Resource Allocation for Metaverse Experience Optimization: A Multi-Objective Multi-Agent Evolutionary Reinforcement Learning Approach


Abstract:

In the Metaverse, real-time, concurrent services such as virtual classrooms and immersive gaming require local graphic rendering to maintain low latency. However, the limited processing power and battery capacity of user devices make it challenging to balance Quality of Experience (QoE) and terminal energy consumption. In this paper, we investigate the joint power control and rendering capacity allocation problem by formulating it as a multi-objective optimization problem (MOP). This problem aims to minimize energy consumption while maximizing Meta-Immersion (MI), a metric that integrates objective network performance with subjective user perception. To solve this problem, we propose a Multi-Objective Multi-Agent Evolutionary Reinforcement Learning with User-Object-Attention (M2ERL-UOA) algorithm. The algorithm employs a prediction-driven evolutionary learning mechanism for multiple agents, coupled with optimized rendering capacity decisions for virtual objects, and can yield a superior Pareto front that attains a Nash equilibrium. Simulation results demonstrate that the proposed algorithm can generate Pareto fronts, effectively adapt to dynamic user preferences, and significantly reduce decision-making time compared to several benchmarks.
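To make the two competing objectives concrete, the sketch below shows how a Pareto front can be extracted from candidate power/rendering allocations evaluated on the two objectives named in the abstract: terminal energy consumption (minimized) and Meta-Immersion (maximized). It is a minimal, assumption-based illustration; the candidate values and the function name pareto_front are hypothetical, and it does not reproduce the paper's M2ERL-UOA algorithm.

from typing import List, Tuple

def pareto_front(points: List[Tuple[float, float]]) -> List[Tuple[float, float]]:
    """Return the non-dominated (energy, MI) pairs, where energy is
    minimized and MI is maximized."""
    front = []
    for i, (e_i, mi_i) in enumerate(points):
        # A point is dominated if another point is at least as good in both
        # objectives and strictly better in at least one.
        dominated = any(
            (e_j <= e_i and mi_j >= mi_i) and (e_j < e_i or mi_j > mi_i)
            for j, (e_j, mi_j) in enumerate(points) if j != i
        )
        if not dominated:
            front.append((e_i, mi_i))
    return front

if __name__ == "__main__":
    # Hypothetical (energy, MI) outcomes of candidate allocation policies.
    candidates = [(3.0, 0.70), (2.5, 0.65), (4.0, 0.90), (3.5, 0.60), (2.8, 0.72)]
    print(pareto_front(candidates))  # -> [(2.5, 0.65), (4.0, 0.90), (2.8, 0.72)]

In the paper's setting, each candidate would correspond to a joint power-control and rendering-capacity decision produced by the learned agents, and the resulting front exposes the achievable trade-offs between energy consumption and MI.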
Published in: IEEE Transactions on Mobile Computing ( Volume: 24, Issue: 4, April 2025)
Page(s): 3473 - 3488
Date of Publication: 02 December 2024
