Abstract
Long-term time series forecasting is a critical task in many domains, including finance, healthcare, and weather forecasting. While Transformer-based models have made significant progress in time series forecasting, their high computational complexity often forces compromises in model design, limiting the full utilization of temporal information. To address this issue, we propose a novel hierarchical decomposition framework that disentangles latent temporal variation patterns. Specifically, we decompose a time series into trend and seasonal modes, and further decompose the seasonal component into coarse- and fine-grained states to capture features of the sequence at different granularities. We use linear layers to embed local information and capture fine-grained temporal changes, and Fourier-domain attention to capture multi-periodic seasonal patterns and extract coarse-grained temporal dependencies. This yields a forecasting model that proceeds from fine to coarse granularity and from local to global context. Extensive experimental evaluation demonstrates that the proposed approach outperforms state-of-the-art methods on real-world benchmark datasets.
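The decompose-then-model pipeline described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the moving-average trend extractor, the kernel size of 25, and the top-k frequency selection (a crude stand-in for Fourier-domain attention) are all assumptions made here for clarity.

```python
import numpy as np

def decompose(x, kernel=25):
    """Split a 1-D series into trend and seasonal parts via a moving average.

    A moving-average trend extractor is a standard choice in decomposition
    forecasters; the kernel size here is an illustrative assumption.
    """
    pad = kernel // 2
    # Replicate-pad the edges so the trend has the same length as x.
    padded = np.concatenate([np.full(pad, x[0]), x, np.full(pad, x[-1])])
    trend = np.convolve(padded, np.ones(kernel) / kernel, mode="valid")
    seasonal = x - trend
    return trend, seasonal

def top_k_fourier(seasonal, k=3):
    """Keep the k strongest frequency components of the seasonal part.

    A simple proxy for attending over multi-periodic patterns in the
    Fourier domain: everything but the dominant periodic modes is zeroed.
    """
    spec = np.fft.rfft(seasonal)
    keep = np.argsort(np.abs(spec))[-k:]
    mask = np.zeros_like(spec)
    mask[keep] = spec[keep]
    return np.fft.irfft(mask, n=len(seasonal))
```

In this sketch the seasonal residual would feed both a local linear embedding (fine-grained) and the frequency-domain filter above (coarse-grained), mirroring the fine-to-coarse, local-to-global design the abstract describes.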
Acknowledgements
This work was supported in part by the National Natural Science Foundation of China (NSFC) under grants 62102097, 61976058, and 92267107; the Science and Technology Planning Project of Guangdong Province under grants 2023A1515012855, 2022A1515011592, 2021B0101220006, 2019A050510041, and 2021A1515012300; the Key Science and Technology Planning Project of Yunnan Province under grant 202102AA100012; and the Science and Technology Program of Guangzhou under grants 202201010548 and 202103000034.
Copyright information
© 2024 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
Cite this paper
Zhang, Z., Zhang, Y., Zeng, A., Pan, D., Zhang, X. (2024). Learning Hierarchical Representations in Temporal and Frequency Domains for Time Series Forecasting. In: Liu, Q., et al. Pattern Recognition and Computer Vision. PRCV 2023. Lecture Notes in Computer Science, vol 14433. Springer, Singapore. https://doi.org/10.1007/978-981-99-8546-3_8
Print ISBN: 978-981-99-8545-6
Online ISBN: 978-981-99-8546-3