Dynamic Pricing and Energy Consumption Scheduling With Reinforcement Learning


Abstract:

In this paper, we study a dynamic pricing and energy consumption scheduling problem in a microgrid where the service provider acts as a broker between the utility company and the customers, purchasing electric energy from the utility company and selling it to the customers. For the service provider, even though dynamic pricing is an efficient tool to manage the microgrid, implementing dynamic pricing is highly challenging due to the lack of customer-side information and the various types of uncertainty in the microgrid. Similarly, the customers face challenges in scheduling their energy consumption due to the uncertainty of the retail electricity price. To overcome these challenges, we develop reinforcement learning algorithms that allow both the service provider and the customers to learn their strategies without a priori information about the microgrid. Through numerical results, we show that the proposed reinforcement learning-based dynamic pricing algorithm works effectively without a priori information about the system dynamics, and that the proposed energy consumption scheduling algorithm further reduces the system cost thanks to the learning capability of each customer.
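
To give a concrete feel for the kind of learning the abstract describes, the sketch below shows a minimal tabular Q-learning loop for a service provider that picks one of a few discrete retail price levels each hour. It is an illustration only, not the authors' algorithm: the price levels, the wholesale cost, the demand-response model, and the hour-of-day state are all invented for this example, whereas the paper's formulation of states, actions, and rewards is considerably richer.

import random
from collections import defaultdict

# Hypothetical setup (not from the paper): four retail price levels,
# a fixed wholesale price, and a toy price-responsive demand model.
PRICE_LEVELS = [0.05, 0.10, 0.15, 0.20]   # retail prices ($/kWh), illustrative
WHOLESALE_COST = 0.08                      # wholesale price ($/kWh), illustrative
ALPHA, GAMMA, EPSILON = 0.1, 0.95, 0.1     # learning rate, discount, exploration

def simulate_demand(hour, price):
    """Toy price-responsive demand (kWh) with an evening peak."""
    base = 30 + (20 if 17 <= hour <= 21 else 0)
    return max(0.0, base * (1.0 - 2.0 * price))   # higher price -> lower demand

Q = defaultdict(float)   # Q[(hour, action_index)] -> estimated long-run profit

def choose_action(hour):
    """Epsilon-greedy selection over the discrete price levels."""
    if random.random() < EPSILON:
        return random.randrange(len(PRICE_LEVELS))
    return max(range(len(PRICE_LEVELS)), key=lambda a: Q[(hour, a)])

for episode in range(2000):              # each episode is one simulated day
    for hour in range(24):
        a = choose_action(hour)
        price = PRICE_LEVELS[a]
        demand = simulate_demand(hour, price)
        reward = (price - WHOLESALE_COST) * demand     # provider's hourly profit
        next_hour = (hour + 1) % 24
        best_next = max(Q[(next_hour, b)] for b in range(len(PRICE_LEVELS)))
        # Standard Q-learning update toward the bootstrapped target.
        Q[(hour, a)] += ALPHA * (reward + GAMMA * best_next - Q[(hour, a)])

# After training, the greedy policy gives one learned price per hour.
learned_prices = [PRICE_LEVELS[max(range(len(PRICE_LEVELS)),
                                   key=lambda a: Q[(h, a)])] for h in range(24)]
print(learned_prices)

The point of the sketch is the structure of the learning loop: the provider observes a state, selects a price, receives a profit-based reward, and updates its value estimates without any prior model of how customers respond, which mirrors the "no a priori information" property emphasized in the abstract.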
Published in: IEEE Transactions on Smart Grid ( Volume: 7, Issue: 5, September 2016)
Page(s): 2187 - 2198
Date of Publication: 06 November 2015

