Abstract
Current Long-term Time Series Forecasting (LTSF) approaches struggle to capture long-range correlations in prolonged time series, and they lack efficient solutions for distribution shift, excessive stationarization, and overfitting caused by training noise. GC-DAWMAR, a long-term time series forecasting approach, addresses these issues with global convolution and de-stationary autocorrelation. Its global-local architecture preserves translational invariance while capturing relationships between subsequences. The de-stationary autocorrelation mechanism prevents excessive stationarization, and exponential moving average regularization of the optimization reduces overfitting during training. On three real-world datasets, the proposed LTSF method outperforms baseline algorithms in prediction accuracy.
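The abstract names exponential moving average (EMA) regularization as the remedy for training overfitting but does not spell out its formulation. The sketch below shows the generic EMA-of-weights technique in PyTorch under that assumption; the decay value, the stand-in model, and the input/output lengths are illustrative choices, not the paper's settings.

```python
import copy
import torch
import torch.nn as nn

class EMAWeights:
    """Maintains an exponential moving average of a model's parameters.

    Call update() after each optimizer step; evaluate with the shadow
    model to use the smoothed weights. decay=0.999 is a common default,
    not necessarily the paper's value.
    """

    def __init__(self, model: nn.Module, decay: float = 0.999):
        self.decay = decay
        self.shadow = copy.deepcopy(model).eval()
        for p in self.shadow.parameters():
            p.requires_grad_(False)

    @torch.no_grad()
    def update(self, model: nn.Module):
        # shadow <- decay * shadow + (1 - decay) * current weights
        for ema_p, p in zip(self.shadow.parameters(), model.parameters()):
            ema_p.mul_(self.decay).add_(p, alpha=1.0 - self.decay)

# Usage sketch: one hypothetical training step on random data.
model = nn.Linear(96, 24)        # stand-in for the forecasting network
ema = EMAWeights(model)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

x, y = torch.randn(32, 96), torch.randn(32, 24)
loss = nn.functional.mse_loss(model(x), y)
loss.backward()
opt.step()
opt.zero_grad()
ema.update(model)                # smooth the weights after each step
```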
Copyright information
© 2024 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
About this paper
Cite this paper
Ding, P., Tang, Y., Ding, X., Guo, C. (2024). GC-DAWMAR: A Global-Local Framework for Long-Term Time Series Forecasting. In: Cao, C., Chen, H., Zhao, L., Arshad, J., Asyhari, T., Wang, Y. (eds) Knowledge Science, Engineering and Management. KSEM 2024. Lecture Notes in Computer Science, vol 14886. Springer, Singapore. https://doi.org/10.1007/978-981-97-5498-4_8
Publisher Name: Springer, Singapore
Print ISBN: 978-981-97-5497-7
Online ISBN: 978-981-97-5498-4
eBook Packages: Computer Science, Computer Science (R0)