TSEGformer: Time-Space Dimension Dependency Transformer for Use in Multivariate Time Series Prediction

Conference paper in Web Information Systems Engineering – WISE 2023 (WISE 2023)

Abstract

Multivariate time series (MTS) prediction has long been an important part of sequence prediction, and many deep learning models have recently been proposed for it. Transformer-based models show particular promise because they capture long-term dependencies between sequence elements, a clear advantage in prediction tasks. However, many existing models focus on encoding and embedding temporal positions and ignore the dependencies between different dimensions at different times. Since MTS exhibit dependencies not only along the temporal dimension but also along the spatial (cross-variable) dimension, we propose TSEGformer, a Transformer-based model whose sequence-embedding stage incorporates not only temporal and positional information but also information shared across dimensions. We propose Dimension Segment Mean Fusion (DSMF), a module that embeds the input MTS into a new 2D vector matrix carrying both temporal and spatial information, and a Two Part Attention (TPA) layer that effectively captures relationships between sequences across the time and space dimensions. We build the model on an encoder-decoder framework and conduct experiments on five real-world datasets, obtaining strong results.



Author information

Correspondence to Qing Yu.


Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.

About this paper


Cite this paper

Feng, Y., Yu, Q. (2023). TSEGformer: Time-Space Dimension Dependency Transformer for Use in Multivariate Time Series Prediction. In: Zhang, F., Wang, H., Barhamgi, M., Chen, L., Zhou, R. (eds) Web Information Systems Engineering – WISE 2023. WISE 2023. Lecture Notes in Computer Science, vol 14306. Springer, Singapore. https://doi.org/10.1007/978-981-99-7254-8_38

  • DOI: https://doi.org/10.1007/978-981-99-7254-8_38

  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-99-7253-1

  • Online ISBN: 978-981-99-7254-8

  • eBook Packages: Computer Science; Computer Science (R0)
