
Learning Hierarchical Representations in Temporal and Frequency Domains for Time Series Forecasting

  • Conference paper
  • Pattern Recognition and Computer Vision (PRCV 2023)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 14433)

Abstract

Long-term time series forecasting is a critical task in many domains, including finance, healthcare, and weather forecasting. While Transformer-based models have made significant progress in time series forecasting, their high computational complexity often forces compromises in model design that limit the full utilization of temporal information. To address this issue, we propose a novel hierarchical decomposition framework that disentangles latent temporal variation patterns. Specifically, we decompose a time series into trend and seasonal modes, and further decompose the seasonal temporal changes into coarse- and fine-grained states to capture features of the sequence at different granularities. We use linear layers to embed local information and capture fine-grained temporal changes, and Fourier-domain attention to capture multi-periodic seasonal patterns and extract coarse-grained temporal dependencies. This yields a forecasting model that operates from fine to coarse granularity, and from local to global context. Extensive experimental evaluation demonstrates that the proposed approach outperforms state-of-the-art methods on real-world benchmark datasets.
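The decomposition pipeline the abstract describes can be illustrated with a minimal NumPy sketch. This is not the paper's implementation: it assumes a moving-average trend/seasonal split (as used by related decomposition models such as Autoformer) and substitutes a simple top-k Fourier filter for the paper's Fourier-domain attention; the function names (`decompose`, `top_k_frequencies`) and the kernel size are illustrative choices.

```python
import numpy as np

def decompose(series, kernel=25):
    """Split a 1-D series into trend (moving average) and seasonal (residual) parts."""
    pad = kernel // 2
    # Edge-pad so the moving average has the same length as the input.
    padded = np.pad(series, (pad, kernel - 1 - pad), mode="edge")
    trend = np.convolve(padded, np.ones(kernel) / kernel, mode="valid")
    seasonal = series - trend
    return trend, seasonal

def top_k_frequencies(seasonal, k=3):
    """Keep the k strongest Fourier modes of the seasonal part: a simple
    stand-in for coarse-grained, multi-periodic frequency-domain modeling."""
    spec = np.fft.rfft(seasonal)
    keep = np.argsort(np.abs(spec))[-k:]
    filtered = np.zeros_like(spec)
    filtered[keep] = spec[keep]
    return np.fft.irfft(filtered, n=len(seasonal))

# Toy series: slow trend plus two seasonal periods (24 and 6 steps).
t = np.arange(256)
x = 0.01 * t + np.sin(2 * np.pi * t / 24) + 0.1 * np.sin(2 * np.pi * t / 6)

trend, seasonal = decompose(x)
coarse = top_k_frequencies(seasonal, k=3)   # coarse-grained periodic component
fine = seasonal - coarse                    # fine-grained residual for local (linear) modeling
```

In the full model, the `fine` residual would feed the linear local-embedding branch and the frequency-domain branch would be learned (attention over Fourier modes) rather than a fixed top-k filter; the sketch only shows how the fine-to-coarse hierarchy is obtained.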



Acknowledgements

This work was supported in part by the National Natural Science Foundation of China (NSFC) under grants 62102097, 61976058, and 92267107, the Science and Technology Planning Project of Guangdong Province under grants: 2023A1515012855 and 2022A1515011592, 2021B0101220006, 2019A050510041, and 2021A1515012300, the Key Science and Technology Planning Project of Yunnan Province under grant 202102AA100012, and the Science and Technology Program of Guangzhou under grants 202201010548 and 202103000034.

Author information

Corresponding author: Yiqun Zhang


Copyright information

© 2024 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.

About this paper


Cite this paper

Zhang, Z., Zhang, Y., Zeng, A., Pan, D., Zhang, X. (2024). Learning Hierarchical Representations in Temporal and Frequency Domains for Time Series Forecasting. In: Liu, Q., et al. Pattern Recognition and Computer Vision. PRCV 2023. Lecture Notes in Computer Science, vol 14433. Springer, Singapore. https://doi.org/10.1007/978-981-99-8546-3_8


  • DOI: https://doi.org/10.1007/978-981-99-8546-3_8

  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-99-8545-6

  • Online ISBN: 978-981-99-8546-3

  • eBook Packages: Computer Science (R0)
