Abstract:
We study the joint scheduling of deferrable demands (e.g., the charging of electric vehicles) and storage systems in the presence of random supply, demand arrivals, and processing costs, subject to processing rate limit constraints. We formulate the scheduling problem as a dynamic program that minimizes the expected total cost: the sum of the processing costs and the noncompletion penalty (incurred when a task is not fully processed by its deadline). Under mild assumptions, we characterize an optimal index-based priority rule: tasks with less laxity should be processed first, and of two tasks with the same laxity, the task with the later deadline has priority. Based on the established optimal control policy characterizations (on resource allocation among multiple tasks and on storage operation), we propose applying data-driven reinforcement learning (RL) methods to make energy procurement decisions. Numerical results show that the proposed approach significantly outperforms existing RL methods combined with the earliest-deadline-first priority rule, reducing system cost by 26%–32%.
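The priority rule stated above can be sketched as a sort key: order pending tasks by increasing laxity (time to deadline minus remaining processing time), breaking ties in favor of the later deadline. This is a minimal illustration, not the paper's implementation; the `Task` fields and function names are assumptions for the sketch.

```python
from dataclasses import dataclass


@dataclass
class Task:
    deadline: float   # time by which the task must be fully processed
    remaining: float  # remaining processing time still needed


def laxity(task: Task, now: float) -> float:
    # Slack left before the task can no longer finish by its deadline.
    return task.deadline - now - task.remaining


def priority_order(tasks: list[Task], now: float) -> list[Task]:
    # Less laxity first; among equal laxity, later deadline first
    # (hence the negated deadline in the sort key).
    return sorted(tasks, key=lambda t: (laxity(t, now), -t.deadline))


# Example: three pending tasks at time 0.
tasks = [
    Task(deadline=5, remaining=2),  # laxity 3
    Task(deadline=4, remaining=1),  # laxity 3, but earlier deadline
    Task(deadline=3, remaining=2),  # laxity 1 -> highest priority
]
order = priority_order(tasks, now=0.0)
```

Note that this rule differs from plain earliest-deadline-first: the laxity-1 task is served first even though its deadline is not the tie-breaking criterion, and among the two laxity-3 tasks the one with deadline 5 precedes the one with deadline 4.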
Published in: IEEE Transactions on Automatic Control ( Volume: 66, Issue: 11, November 2021)