GDRNet: a channel grouping based time-slice dilated residual network for long-term time-series forecasting

Published in The Journal of Supercomputing.

Abstract

Accurately capturing inter-series and intra-series variations is crucial for multivariate long-term time-series forecasting. Existing channel-independent and channel-mixing approaches struggle with complex inter-series relationships, while RNN-based models face challenges in capturing long-term intra-series dependencies. Additionally, current decomposition methods struggle with complex trends in the series, further hindering intra-series modeling. To address these issues, we propose GDRNet, which consists of four components: the channel grouping block (CGB), the channel group multi-mixer block (CGMB), the time-slice dilated residual GRU (SDRGRU), and the multi-trend decomposition block (MTDB). CGB groups channels with similar distributions for inter-series learning, while CGMB captures complex dependencies between series across various granularities and perspectives. SDRGRU expands the receptive field and incorporates residual learning to capture long-term intra-series dependencies, while MTDB enhances trend-seasonal decomposition, further facilitating precise intra-series modeling. GDRNet achieves 9.12% and 22.30% improvements in multivariate and univariate forecasting tasks, respectively, demonstrating its effectiveness in time-series forecasting.
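The abstract does not spell out how CGB assigns channels to groups. As a minimal illustrative sketch, the idea of "grouping channels with similar distributions" can be approximated by summarising each channel with a simple dispersion statistic and splitting the sorted channels into equal-sized groups; the statistic used here (per-channel standard deviation) and the equal-size split are assumptions for illustration, not the paper's actual criterion.

```python
import numpy as np

def group_channels(x, num_groups):
    """Assign each channel of a multivariate series to a group.

    Illustrative sketch of distribution-based channel grouping: each
    channel is summarised by its standard deviation, and channels with
    similar dispersion are placed in the same group.
    `x` has shape (timesteps, channels).
    """
    stats = x.std(axis=0)                 # one distribution summary per channel
    order = np.argsort(stats)             # channels sorted by dispersion
    labels = np.empty(x.shape[1], dtype=int)
    for g, idx in enumerate(np.array_split(order, num_groups)):
        labels[idx] = g                   # neighbouring channels share a group
    return labels

# Example: two low-amplitude channels and two high-amplitude channels
t = np.linspace(0, 10, 200)
x = np.stack([np.sin(t), np.sin(t) + 0.1,
              5 * np.sin(t), 5 * np.sin(t) - 0.2], axis=1)
labels = group_channels(x, num_groups=2)
# channels 0 and 1 land in one group, channels 2 and 3 in the other
```

Inter-series mixing (as in CGMB) would then operate within each group, so channels with very different value distributions do not interfere with one another.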


Data availability

The datasets supporting this study are openly available. ETT: https://github.com/zhouhaoyi/ETDataset. Traffic: http://pems.dot.ca.gov. Weather: https://www.bgc-jena.mpg.de/wetter. Electricity: https://archive.ics.uci.edu/ml/datasets/ElectricityLoadDiagrams20112014.


Funding

This work is supported by the National Key Research and Development Program of China (Grant No. 2024YFC3014300), the Yunnan Province Major Science and Technology Project (Grant No. 202302AD080006), the Tsinghua University Initiative Scientific Research Program (Grant No. 2024Z04W01001), and the Research Foundation of the Department of Natural Resources of Hunan Province (Grant No. HBZ20240123).

Author information

Authors and Affiliations

Authors

Contributions

Qingda Bao, Shengfa Miao, and Xin Jin contributed to conceptualization; Qingda Bao and Ruoshu Wang contributed to methodology and investigation; Qingda Bao and Yulin Tian contributed to software; Qingda Bao contributed to writing—original draft preparation; Qingda Bao, Shengfa Miao, Puming Wang, and Qian Jiang contributed to writing—review and editing; Shengfa Miao contributed to funding acquisition; Shengfa Miao, Shaowen Yao, and Da Hu contributed to resources; and Shengfa Miao supervised the work.

Corresponding author

Correspondence to Shengfa Miao.

Ethics declarations

Conflict of interest

The authors declare that there is no conflict of interest regarding the publication of this paper.

Consent for publication

The authors agree to the publication of this work and confirm that all the data used in the study are publicly accessible through the provided links.

Ethics approval

This research does not involve human participants or animals.

Materials availability

Not applicable.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

About this article

Cite this article

Bao, Q., Miao, S., Tian, Y. et al. GDRNet: a channel grouping based time-slice dilated residual network for long-term time-series forecasting. J Supercomput 81, 499 (2025). https://doi.org/10.1007/s11227-025-07011-5
