GC-DAWMAR: A Global-Local Framework for Long-Term Time Series Forecasting

  • Conference paper
  • In: Knowledge Science, Engineering and Management (KSEM 2024)

Abstract

Current Long-term Time Series Forecasting (LTSF) approaches struggle to capture long-range correlations in prolonged time series, and they lack efficient solutions for distribution shift, excessive stationarization, and overfitting caused by training noise. GC-DAWMAR, a long-term time series forecasting approach, addresses these issues with global convolution and de-stationary autocorrelation. Its global-local architecture maintains translational invariance while capturing inter-subsequence relationships; the de-stationary autocorrelation mechanism prevents excessive stationarization, and exponential moving average regularization of the optimization reduces overfitting to training noise. On three real-world datasets, the proposed LTSF method outperforms baseline algorithms in prediction accuracy.
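
The full text is paywalled here, so as a rough orientation only, the sketch below illustrates two generic building blocks matching the abstract's terminology: a "global" convolution whose learnable kernel spans the entire input sequence (computed efficiently via the FFT), and an exponential moving average (EMA) of model weights used to damp training noise. This is not the authors' implementation; GlobalConv1d, EmaWeights, and the decay value are hypothetical stand-ins.

```python
# Illustrative sketch only, NOT the paper's code (the full text is paywalled
# here). Shows two standard building blocks named in the abstract: a global
# convolution with a full-sequence kernel, and an EMA of model weights.
# GlobalConv1d, EmaWeights, and decay=0.999 are hypothetical choices.
import copy

import torch
import torch.nn as nn


class GlobalConv1d(nn.Module):
    """Depthwise circular convolution whose kernel spans the whole
    sequence, computed in O(L log L) via the FFT."""

    def __init__(self, channels: int, seq_len: int):
        super().__init__()
        # One learnable full-length kernel per channel.
        self.kernel = nn.Parameter(0.02 * torch.randn(channels, seq_len))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, seq_len)
        L = x.size(-1)
        x_f = torch.fft.rfft(x, n=L)
        k_f = torch.fft.rfft(self.kernel, n=L)
        # Pointwise product in the frequency domain equals circular
        # convolution in time, so every output step sees every input step.
        return torch.fft.irfft(x_f * k_f, n=L)


class EmaWeights:
    """Maintains an EMA copy of a model's parameters; evaluating with the
    averaged copy smooths out step-to-step training noise."""

    def __init__(self, model: nn.Module, decay: float = 0.999):
        self.decay = decay
        self.shadow = copy.deepcopy(model).eval()
        for p in self.shadow.parameters():
            p.requires_grad_(False)

    @torch.no_grad()
    def update(self, model: nn.Module) -> None:
        # shadow <- decay * shadow + (1 - decay) * current weights
        for s, p in zip(self.shadow.parameters(), model.parameters()):
            s.mul_(self.decay).add_(p, alpha=1.0 - self.decay)


# Smoke test on random data.
model = GlobalConv1d(channels=7, seq_len=96)
ema = EmaWeights(model)
y = model(torch.randn(8, 7, 96))   # -> shape (8, 7, 96)
ema.update(model)                  # call once per optimizer step
print(y.shape)
```

In a global-local architecture of the kind the abstract describes, a block like this would supply the long-range global path, with local modules and the de-stationary autocorrelation mechanism (omitted here) handling fine-grained, non-stationary structure.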



Author information

Correspondence to Yan Tang.


Copyright information

© 2024 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.

About this paper


Cite this paper

Ding, P., Tang, Y., Ding, X., Guo, C. (2024). GC-DAWMAR: A Global-Local Framework for Long-Term Time Series Forecasting. In: Cao, C., Chen, H., Zhao, L., Arshad, J., Asyhari, T., Wang, Y. (eds) Knowledge Science, Engineering and Management. KSEM 2024. Lecture Notes in Computer Science, vol 14886. Springer, Singapore. https://doi.org/10.1007/978-981-97-5498-4_8


  • DOI: https://doi.org/10.1007/978-981-97-5498-4_8

  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-97-5497-7

  • Online ISBN: 978-981-97-5498-4

  • eBook Packages: Computer Science, Computer Science (R0)
