Abstract:
Time-series forecasting has a wide range of applications in areas such as financial market forecasting, weather forecasting, and sales forecasting. Recent studies have shown that Transformers are well suited to such problems, especially Long Sequence Time-Series Forecasting (LSTF). When handling long sequence inputs in time-series data, existing Transformers focus primarily on improving the attention mechanism, but they do not address the underlying issue of sequence length during attention computation. Moreover, many Transformer models are only loosely coupled with convolutional neural networks (CNNs) and therefore fail to fully exploit their local feature extraction capabilities. We therefore propose WDFormer for Long Sequence Time-Series Forecasting (LSTF), which introduces a local-global windowing mechanism to handle long time series and replaces canonical convolutional layers with dilated causal convolutional layers, obtaining exponential receptive field growth at a negligible computational cost and improving long sequence forecasting accuracy. Our extensive experiments on four large datasets show that WDFormer outperforms most baseline models on the LSTF problem.
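For context, a dilated causal convolution enlarges the receptive field exponentially with depth by doubling the dilation at each layer while left-padding the sequence so that the output at step t only sees inputs up to step t. The following is a minimal PyTorch-style sketch of that idea; the class name, channel count, kernel size, and depth are illustrative assumptions, not the authors' implementation.

    import torch
    import torch.nn as nn

    class DilatedCausalConv1d(nn.Module):
        """Causal 1-D convolution: left-pad so output at t sees only inputs <= t."""
        def __init__(self, channels, kernel_size=3, dilation=1):
            super().__init__()
            self.pad = (kernel_size - 1) * dilation  # left padding for causality
            self.conv = nn.Conv1d(channels, channels, kernel_size, dilation=dilation)

        def forward(self, x):  # x: (batch, channels, time)
            x = nn.functional.pad(x, (self.pad, 0))  # pad the past side only
            return self.conv(x)

    # Stacking layers with dilations 1, 2, 4, 8 grows the receptive field
    # exponentially with depth: RF = 1 + sum((kernel_size - 1) * dilation_i).
    layers = nn.Sequential(*[
        DilatedCausalConv1d(channels=16, kernel_size=3, dilation=2 ** i)
        for i in range(4)
    ])
    x = torch.randn(8, 16, 96)  # a batch of 96-step series
    y = layers(x)               # output keeps the sequence length: (8, 16, 96)

With kernel size 3 and dilations 1, 2, 4, 8, the receptive field is 1 + 2 * (1 + 2 + 4 + 8) = 31 steps from only four layers, which is the exponential growth at near-constant per-layer cost that the abstract refers to.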
Published in: 2024 27th International Conference on Computer Supported Cooperative Work in Design (CSCWD)
Date of Conference: 08-10 May 2024
Date Added to IEEE Xplore: 10 July 2024