Abstract
Long-term Time Series Forecasting (LTSF) is widely used in many fields, such as power planning. LTSF requires models to effectively capture subtle long-term dependencies in time series. However, several challenges limit the predictive performance of existing models: the inability to fully exploit correlation dependencies in time series, the difficulty of decoupling the complex cycles of a time series in the time domain, and the error accumulation of iterative multi-step prediction. To address these issues, we design a Frequency Enhanced Decomposition and Expansion Learning (FEDEL) model for LTSF. The model has linear complexity and three distinguishing features: (i) a deep, high-capacity architecture that effectively captures complex dependencies in long-term time series; (ii) decoupling of complex cycles via sparse representations of the time series in the frequency domain; and (iii) a direct multi-step prediction strategy that generates the entire forecast sequence at once, improving prediction speed and avoiding error accumulation. We conducted extensive experiments on eight real-world large-scale datasets. The results demonstrate that FEDEL performs significantly better than traditional methods and outperforms current state-of-the-art (SOTA) LTSF models in most cases.
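To make features (ii) and (iii) concrete, the sketch below illustrates the two ideas in plain NumPy: extracting dominant cycles by keeping only the top-k Fourier modes (one way to form a sparse frequency-domain representation), and producing all horizon steps with a single direct mapping rather than feeding one-step forecasts back in. This is a minimal illustration under stated assumptions, not the paper's actual architecture; the function names and the random weight matrix standing in for learned parameters are hypothetical.

```python
import numpy as np

def top_k_frequency_decompose(x: np.ndarray, k: int = 8):
    """Split a series into a cyclic part (top-k Fourier modes) and a remainder.

    Keeping only the k largest-amplitude modes yields a sparse frequency-domain
    representation that decouples the dominant cycles from the rest of the signal.
    """
    spec = np.fft.rfft(x)
    top = np.argsort(np.abs(spec))[-k:]   # indices of the k strongest modes
    sparse = np.zeros_like(spec)
    sparse[top] = spec[top]
    cyclic = np.fft.irfft(sparse, n=len(x))
    return cyclic, x - cyclic

def direct_multistep_forecast(history: np.ndarray, horizon: int, weights: np.ndarray):
    """Direct multi-step prediction: one map from the input window to all
    `horizon` future values at once, so per-step errors are never fed back
    into the model (no iterative error accumulation)."""
    assert weights.shape == (horizon, len(history))
    return weights @ history

# Toy usage: two superimposed cycles plus noise.
t = np.arange(512)
x = (np.sin(2 * np.pi * t / 24) + 0.5 * np.sin(2 * np.pi * t / 168)
     + 0.1 * np.random.randn(512))
cyclic, remainder = top_k_frequency_decompose(x, k=4)
W = np.random.randn(96, 512) * 0.01           # stands in for learned parameters
y_hat = direct_multistep_forecast(x, 96, W)   # all 96 steps produced in one pass
```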
This work was supported by the National Key R&D Program of China under Grant No. 2020YFB1710200 and the Heilongjiang Key R&D Program of China under Grant No. GA23A915.
Copyright information
© 2024 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
About this paper
Cite this paper
Chen, R., Cui, W., Zhang, H., Han, Q. (2024). FEDEL: Frequency Enhanced Decomposition and Expansion Learning for Long-Term Time Series Forecasting. In: Luo, B., Cheng, L., Wu, Z.-G., Li, H., Li, C. (eds.) Neural Information Processing. ICONIP 2023. Communications in Computer and Information Science, vol. 1962. Springer, Singapore. https://doi.org/10.1007/978-981-99-8132-8_20
Publisher Name: Springer, Singapore
Print ISBN: 978-981-99-8131-1
Online ISBN: 978-981-99-8132-8