
CASA: cost-effective EV charging scheduling based on deep reinforcement learning

  • Original Article
  • Published: 2024
Neural Computing and Applications

Abstract

With the widespread adoption of electric vehicles (EVs), the demand for public charging services is steadily increasing. Consequently, developing effective charging scheduling strategies that optimize the utilization of limited charging infrastructure has become a key problem. Considering the diversity of user demands, we propose a Cost-Aware Charging Scheduling Architecture (CASA). This architecture serves both urgent and nonurgent charging customers by providing two charging modes with different power levels and associated costs. However, optimizing multiple objectives simultaneously while protecting the interests of all parties involved in the charging demand response is challenging. Moreover, the uncertainty in customer charging demands and Time-of-Use (TOU) tariffs further complicates model construction. To address these challenges, this study formulates EV charging scheduling as a Markov Decision Process (MDP) and solves it with the Deep Q-Network (DQN) algorithm, a deep reinforcement learning (DRL) method. The objective is to minimize the operational costs of charging stations while meeting customers' quality of service (QoS) requirements. Simulation results demonstrate that CASA outperforms commonly used charging scheduling baselines on both average response time and service success rate, and achieves a significant reduction in the operating costs of the EV charging station.
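The MDP-plus-DQN formulation described in the abstract can be illustrated with a toy single-charger example. Everything below is an illustrative assumption rather than the paper's actual CASA model: the TOU tariff, the slow/fast power levels, the reward shaping, and the episode structure are invented for the sketch, and a minimal numpy Q-network with experience replay and epsilon-greedy exploration stands in for the full DQN.

```python
# Minimal DQN-style sketch for one charger (numpy only). All environment
# details (tariff, powers, penalties) are illustrative assumptions, not
# the CASA paper's actual formulation.
import random
from collections import deque
import numpy as np

rng = np.random.default_rng(0)

# State = [remaining_demand_kwh, hours_left, current_tou_price]
ACTIONS = [0.0, 7.0, 22.0]                  # idle / slow / fast charging power (kW)

def tou_price(hour):
    # Illustrative time-of-use tariff: evening peak costs more
    return 0.30 if 17 <= hour % 24 <= 21 else 0.12

def norm(s):
    return s / np.array([50.0, 8.0, 0.3])   # rough feature scaling

def step(state, action_idx, hour):
    demand, hours_left, price = state
    delivered = min(ACTIONS[action_idx], demand)   # cannot overfill the battery
    demand -= delivered
    hours_left -= 1
    reward = -delivered * price                    # negative energy cost
    done = hours_left <= 0 or demand <= 0
    if done and demand > 0:                        # QoS penalty for unmet demand
        reward -= 5.0 * demand
    return np.array([demand, hours_left, tou_price(hour + 1)]), reward, done

class QNet:
    """One-hidden-layer Q-network trained with plain SGD."""
    def __init__(self, n_in=3, n_hidden=32, n_out=3, lr=1e-3):
        self.w1 = rng.normal(0, 0.3, (n_in, n_hidden)); self.b1 = np.zeros(n_hidden)
        self.w2 = rng.normal(0, 0.3, (n_hidden, n_out)); self.b2 = np.zeros(n_out)
        self.lr = lr
    def forward(self, s):
        self.h = np.maximum(0, s @ self.w1 + self.b1)   # ReLU
        return self.h @ self.w2 + self.b2
    def update(self, s, a, target):
        q = self.forward(s)
        err = np.clip(q[a] - target, -10.0, 10.0)       # clipped TD error for stability
        grad_q = np.zeros_like(q); grad_q[a] = err
        gh = (grad_q @ self.w2.T) * (self.h > 0)
        self.w2 -= self.lr * np.outer(self.h, grad_q); self.b2 -= self.lr * grad_q
        self.w1 -= self.lr * np.outer(s, gh);          self.b1 -= self.lr * gh

def train(episodes=200, gamma=0.99, eps=0.2):
    net, buf = QNet(), deque(maxlen=5000)
    for _ in range(episodes):
        hour = int(rng.integers(0, 24))
        state = np.array([rng.uniform(10, 50), 8.0, tou_price(hour)])
        done = False
        while not done:
            a = (int(rng.integers(3)) if rng.random() < eps
                 else int(np.argmax(net.forward(norm(state)))))
            nxt, r, done = step(state, a, hour)
            buf.append((norm(state), a, r, norm(nxt), done))
            state, hour = nxt, hour + 1
            s, a_, r_, ns, d = random.choice(buf)      # one replayed SGD step
            target = r_ if d else r_ + gamma * np.max(net.forward(ns))
            net.update(s, a_, target)
    return net

net = train()
q = net.forward(norm(np.array([30.0, 4.0, 0.12])))     # one Q-value per action
```

The reward couples energy cost with a deadline penalty, so the learner must trade the cheap slow mode against the risk of leaving demand unmet; the paper's multi-objective, multi-charger setting generalizes this trade-off.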



Data availability

Data will be made available on reasonable request.

Code availability

Code is available at: https://github.com/zaNCEPU/CASA.


Acknowledgements

This work was supported by the Fundamental Research Funds for the Central Universities (2023YQ002).

Author information

Authors and Affiliations

Authors

Corresponding author

Correspondence to Long Cheng.

Ethics declarations

Conflict of interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Zhang, A., Liu, Q., Liu, J. et al. CASA: cost-effective EV charging scheduling based on deep reinforcement learning. Neural Comput & Applic 36, 8355–8370 (2024). https://doi.org/10.1007/s00521-024-09530-3
