Smart Battery Swapping Control for an Electric Motorcycle Fleet With Peak Time Based on Deep Reinforcement Learning


Abstract:

This study proposes a deep Q-network (DQN) model for electric motorcycles (EMs) and a multi-agent reinforcement learning (MARL)-based central control system to support battery swapping decision-making in the delivery business. We aim to minimize expected delivery losses, especially in scenarios where delivery requests are randomly and independently generated for each EM, with fluctuating time distributions and limited battery swapping station (BSS) capacity. Our MARL benefits from a reservation mechanism and a profit-aggregated central system, which greatly reduces the complexity of MARL. Furthermore, to address the inherent non-stationarity of MARL, we propose a decentralized agent-based MARL framework, named Decentralized Agents, Centralized Learning Deep Q Network. This framework, leveraging a tailored learning algorithm, achieves peak-averse behavior, reducing delivery losses. Additionally, we introduce a hybrid approach that combines the resulting DQN algorithm, which determines when to visit a BSS, with a greedy algorithm that decides which BSS to visit. Computational experiments using real-world delivery data are conducted to evaluate the performance of our algorithm. The results demonstrate that the hybrid approach maximizes the overall profit of the entire EM fleet in a challenging environment with limited BSS capacity.
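The two-stage hybrid policy described in the abstract can be sketched in outline: a learned Q-function decides *when* an EM should swap, and a greedy rule decides *which* station to visit, with a reservation count preventing two EMs from claiming the same charged battery. The sketch below is illustrative only; the state features, scoring heuristic, and function names (`q_values`, `choose_station`, `hybrid_policy`) are assumptions standing in for the paper's trained DQN and its actual state/action design, not the authors' implementation.

```python
def q_values(state):
    # Hypothetical stand-in for the trained DQN: maps an EM's state
    # (battery level in [0, 1], pending deliveries, peak-time flag)
    # to scores for the two actions: index 0 = keep delivering,
    # index 1 = go swap. A real DQN would be a trained neural network.
    battery, pending, is_peak = state
    swap_score = (1.0 - battery) + (0.5 if is_peak else 0.0)
    deliver_score = battery + 0.1 * pending
    return [deliver_score, swap_score]

def choose_station(stations, reserved):
    # Greedy stage: among stations that still have an unreserved
    # charged battery, pick the one with the lowest travel cost.
    candidates = [s for s in stations
                  if s["charged"] - reserved.get(s["id"], 0) > 0]
    if not candidates:
        return None
    return min(candidates, key=lambda s: s["travel_cost"])

def hybrid_policy(state, stations, reserved):
    # Stage 1 (DQN): decide WHEN to swap. Stage 2 (greedy): decide WHERE.
    q = q_values(state)
    if q[1] <= q[0]:
        return ("deliver", None)
    station = choose_station(stations, reserved)
    if station is None:
        return ("deliver", None)  # no free capacity: keep delivering
    # Reservation mechanism: claim one charged battery at this station.
    reserved[station["id"]] = reserved.get(station["id"], 0) + 1
    return ("swap", station["id"])
```

In this sketch the reservation dictionary plays the role of the paper's reservation mechanism: once an EM reserves the cheapest station, later EMs see its effective capacity reduced and are routed to the next-best station, which is one plausible way limited BSS capacity could be enforced.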
Published in: IEEE Transactions on Intelligent Transportation Systems ( Volume: 25, Issue: 12, December 2024)
Page(s): 20175 - 20189
Date of Publication: 07 October 2024

