Univariate Time Series Forecasting via Interactive Learning

  • Conference paper
Knowledge Science, Engineering and Management (KSEM 2023)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 14120)

Abstract

Time series forecasting requires capturing the temporal dependencies among observed values. Although many deep learning models achieve good performance, they still lack effective modeling of such dependencies. Moreover, the statistical properties of a time series often change over time, causing distribution shift, which is another major challenge in forecasting. In this paper, we propose a module called Interactive Temporal-spatial Attention (ITSA), which combines interactive convolution with an attention mechanism to model temporal dependencies effectively and to suppress distribution shift. First, the time series is normalized and decomposed into trend and seasonal components. An interactive learning strategy then extracts temporal dependencies from the observations at different data resolutions. Next, a normalized temporal-spatial attention mechanism captures the temporal-spatial features of the series to prevent information loss. Finally, the normalized output is inverted back to the original distribution, which suppresses distribution shift. We stack ITSA modules hierarchically, yielding HITSA, to perform the forecasting task. Experiments on electricity and MOOC datasets show that the model achieves strong predictive performance and significantly outperforms the baseline methods, indicating that the proposed ITSA extracts representative features from time series.
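The abstract describes a five-step pipeline: normalize, decompose into trend and seasonal components, exchange features across resolutions via interactive convolution, apply temporal-spatial attention, and invert the normalization. This page gives no implementation details, so the following is a minimal PyTorch sketch under explicit assumptions: a moving-average decomposition, an even/odd split for the interactive step (in the spirit of interactive-convolution architectures such as SCINet), per-series instance normalization (as in RevIN), and illustrative layer sizes. All module names and hyperparameters below are hypothetical, not taken from the paper.

```python
import torch
import torch.nn as nn

class SeriesDecomposition(nn.Module):
    """Moving-average split into trend and seasonal parts (an assumed,
    common choice; the paper does not specify its decomposition here)."""
    def __init__(self, kernel_size=25):
        super().__init__()
        self.avg = nn.AvgPool1d(kernel_size, stride=1,
                                padding=kernel_size // 2,
                                count_include_pad=False)

    def forward(self, x):                      # x: (batch, channels, length)
        trend = self.avg(x)                    # smooth trend component
        return trend, x - trend                # seasonal = residual

class InteractiveBlock(nn.Module):
    """Interactive learning across two resolutions: even/odd subsequences
    exchange features through convolutions (assumed mechanism)."""
    def __init__(self, channels, kernel_size=3):
        super().__init__()
        pad = kernel_size // 2
        self.conv_even = nn.Conv1d(channels, channels, kernel_size, padding=pad)
        self.conv_odd = nn.Conv1d(channels, channels, kernel_size, padding=pad)

    def forward(self, x):                      # input length assumed even
        even, odd = x[..., ::2], x[..., 1::2]
        even = even + torch.tanh(self.conv_odd(odd))   # odd informs even
        odd = odd + torch.tanh(self.conv_even(even))   # updated even informs odd
        out = torch.empty_like(x)
        out[..., ::2] = even                   # re-interleave the subsequences
        out[..., 1::2] = odd
        return out

class TemporalSpatialAttention(nn.Module):
    """Simple stand-in for the normalized temporal-spatial attention:
    channel (spatial) gating plus softmax weights over time steps."""
    def __init__(self, channels):
        super().__init__()
        self.channel_fc = nn.Linear(channels, channels)
        self.time_conv = nn.Conv1d(channels, 1, kernel_size=1)

    def forward(self, x):                      # x: (batch, channels, length)
        gate = torch.sigmoid(self.channel_fc(x.mean(dim=-1)))  # (B, C)
        x = x * gate.unsqueeze(-1)
        weights = torch.softmax(self.time_conv(x), dim=-1)     # (B, 1, L)
        return x + x * weights                 # residual keeps information

class ITSA(nn.Module):
    """One ITSA module sketch: normalize -> decompose -> interact ->
    attend -> project -> denormalize. Sizes are illustrative."""
    def __init__(self, channels=1, lookback=96, horizon=24):
        super().__init__()
        self.decomp = SeriesDecomposition()
        self.interact = InteractiveBlock(channels)
        self.attn = TemporalSpatialAttention(channels)
        self.head = nn.Linear(lookback, horizon)

    def forward(self, x):                      # x: (batch, channels, lookback)
        mean = x.mean(dim=-1, keepdim=True)
        std = x.std(dim=-1, keepdim=True) + 1e-5
        z = (x - mean) / std                   # per-series normalization
        trend, seasonal = self.decomp(z)
        z = self.interact(seasonal) + trend    # recombine components
        z = self.attn(z)
        y = self.head(z)                       # (batch, channels, horizon)
        return y * std + mean                  # invert the normalization

x = torch.randn(32, 1, 96)                     # 32 univariate histories
print(ITSA()(x).shape)                         # torch.Size([32, 1, 24])
```

Stacking several such modules, for example feeding each module's output to the next at a coarser resolution, would correspond to the hierarchical HITSA arrangement the abstract describes.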

Acknowledgment

This work is supported by the National Natural Science Foundation of China [grant number 62162062] and the Science and Technology Project of Jilin Provincial Education Department [grant numbers JJKH20220538KJ and JJKH20230622KJ].

Author information

Corresponding author

Correspondence to Peng Wang.

Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Li, Y., Li, H., Wang, P., Cui, X., Zhang, Z. (2023). Univariate Time Series Forecasting via Interactive Learning. In: Jin, Z., Jiang, Y., Buchmann, R.A., Bi, Y., Ghiran, A.M., Ma, W. (eds.) Knowledge Science, Engineering and Management. KSEM 2023. Lecture Notes in Computer Science, vol. 14120. Springer, Cham. https://doi.org/10.1007/978-3-031-40292-0_28

  • DOI: https://doi.org/10.1007/978-3-031-40292-0_28

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-40291-3

  • Online ISBN: 978-3-031-40292-0

  • eBook Packages: Computer Science, Computer Science (R0)
