Abstract:
Load forecasting plays a critical role in decision-making for power systems, including aspects such as unit commitment and economic dispatch. Over the past few decades, numerous forecasting methods have been extensively researched. Various metrics have been proposed to assess the performance of different load forecasting techniques, including Mean Absolute Percentage Error (MAPE) and Root Mean Squared Error (RMSE), to aid in selecting the most suitable and accurate forecasting models. However, these metrics can only compare forecasts within the same load dataset, rather than across multiple datasets. To effectively compare and rank load forecasting performance across multiple datasets, we propose normalizing traditional metrics into skill scores. To facilitate this normalization, we first define and calculate the so-called reference performance and perfect performance. On this basis, skill scores of forecasts across multiple datasets can be computed and ranked accordingly. We carry out case studies using the GEFCom dataset and the Guangdong Power Company dataset to showcase the efficacy of this method in delivering a more rational assessment and ranking of load forecasting predictions.
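The abstract does not give the exact definitions of reference and perfect performance, so the following is only a minimal sketch of the general skill-score idea, assuming MAPE as the underlying metric, a naive persistence forecast as the reference, and zero error as the perfect performance; the paper's own formulation may differ.

```python
import numpy as np

def mape(actual, forecast):
    """Mean Absolute Percentage Error, in percent."""
    actual, forecast = np.asarray(actual, float), np.asarray(forecast, float)
    return 100.0 * np.mean(np.abs((actual - forecast) / actual))

def skill_score(actual, forecast, reference):
    """Normalize MAPE into a skill score relative to a reference forecast.

    Assumed form: SS = (MAPE_ref - MAPE_fcst) / (MAPE_ref - MAPE_perfect),
    with MAPE_perfect = 0, so SS = 1 means a perfect forecast and SS = 0
    means no improvement over the reference. Because SS is dimensionless,
    it can be compared and ranked across different load datasets.
    """
    m_ref = mape(actual, reference)
    m_fcst = mape(actual, forecast)
    m_perfect = 0.0  # assumption: a perfect forecast has zero error
    return (m_ref - m_fcst) / (m_ref - m_perfect)

# Toy usage with a persistence (previous-step load) reference forecast.
load = np.array([100.0, 110.0, 120.0, 115.0, 105.0])
model_forecast = np.array([102.0, 108.0, 118.0, 117.0, 104.0])
persistence = np.roll(load, 1)  # naive "same as last step" forecast
print(skill_score(load[1:], model_forecast[1:], persistence[1:]))
```

Under these assumptions, forecasts from different datasets are first mapped onto the same 0-to-1 scale and can then be ranked directly, which is the comparison the raw MAPE or RMSE values do not support.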
Date of Conference: 28-30 November 2023
Date Added to IEEE Xplore: 25 January 2024