A Multi-view Feature Construction and Multi-Encoder-Decoder Transformer Architecture for Time Series Classification

  • Conference paper
  • In: Advances in Knowledge Discovery and Data Mining (PAKDD 2024)

Abstract

Time series data plays a significant role in many research fields, since it records and discloses the dynamic trends of a phenomenon as a sequence of ordered data points. Time series data is dynamic, of variable length, and often contains complex patterns, which makes its analysis challenging, especially when the amount of data is limited. In this paper, we propose a multi-view feature construction approach that generates multiple feature sets of different resolutions from a single dataset and produces a fixed-length representation of variable-length time series data. Furthermore, we propose a multi-encoder-decoder Transformer (MEDT) architecture to effectively analyze these multi-view representations. Through extensive experiments on multiple benchmarks and a real-world dataset, our method shows significant improvement over state-of-the-art methods.
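The full paper is not reproduced here, but the abstract is concrete enough to sketch the general shape of the two components it names. The snippet below is a minimal, hypothetical illustration, not the authors' implementation: it assumes that a "view" is a fixed-length statistical summary of a series computed at one window resolution, and that the MEDT pairs one Transformer encoder-decoder with each view before a shared classification head. All names (build_views, MEDT), the window sizes, and the choice of summary statistics are placeholders introduced for this sketch.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def build_views(series, window_sizes=(8, 32, 128), n_out=16):
    """Turn one variable-length 1-D series into several fixed-length 'views',
    one per window resolution. Each window is summarized by simple statistics,
    then resampled to n_out rows so every series yields the same shape
    regardless of its original length. Purely illustrative, not the paper's
    actual feature construction."""
    views = []
    for w in window_sizes:
        x = series
        if x.numel() < 2 * w:                        # pad very short series
            x = F.pad(x, (0, 2 * w - x.numel()))
        windows = x.unfold(0, w, w)                  # non-overlapping windows of length w
        stats = torch.stack([windows.mean(1), windows.std(1),
                             windows.amin(1), windows.amax(1)], dim=1)  # (n_windows, 4)
        # resample the window axis to a fixed length so all views align
        stats = F.interpolate(stats.t().unsqueeze(0), size=n_out, mode="linear",
                              align_corners=False).squeeze(0).t()       # (n_out, 4)
        views.append(stats)
    return views

class MEDT(nn.Module):
    """Hypothetical multi-encoder-decoder Transformer: one small encoder-decoder
    stack per view, with the per-view outputs concatenated into a shared
    classification head."""
    def __init__(self, n_views=3, feat_dim=4, d_model=64, n_classes=5):
        super().__init__()
        self.proj = nn.ModuleList(nn.Linear(feat_dim, d_model) for _ in range(n_views))
        self.towers = nn.ModuleList(
            nn.Transformer(d_model=d_model, nhead=4, num_encoder_layers=2,
                           num_decoder_layers=2, dim_feedforward=128, batch_first=True)
            for _ in range(n_views))
        self.query = nn.Parameter(torch.randn(1, 1, d_model))  # learned decoder query
        self.head = nn.Linear(n_views * d_model, n_classes)

    def forward(self, views):             # views: list of (batch, n_out, feat_dim) tensors
        outs = []
        for proj, tower, v in zip(self.proj, self.towers, views):
            src = proj(v)                                  # project features to d_model
            tgt = self.query.expand(v.size(0), -1, -1)     # one query token per sample
            outs.append(tower(src, tgt).squeeze(1))        # (batch, d_model) per view
        return self.head(torch.cat(outs, dim=-1))          # class logits
```

Under these assumptions, a single labeled series would be classified with logits = MEDT()([v.unsqueeze(0) for v in build_views(series)]) and trained with cross-entropy; the actual MEDT design and feature construction should be taken from the paper itself, not from this sketch.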

Acknowledgement

This material is based upon work partially supported by the National Institutes of Health under grant NIH 1R01DK129428-01A1 and the National Science Foundation under grants 2008202 and 2334665. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the funding agencies.

Author information

Corresponding author

Correspondence to Zihan Li.

Copyright information

© 2024 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.

About this paper

Cite this paper

Li, Z., Ding, W., Mashukov, I., Crouter, S., Chen, P. (2024). A Multi-view Feature Construction and Multi-Encoder-Decoder Transformer Architecture for Time Series Classification. In: Yang, D.-N., Xie, X., Tseng, V.S., Pei, J., Huang, J.-W., Lin, J.C.-W. (eds) Advances in Knowledge Discovery and Data Mining. PAKDD 2024. Lecture Notes in Computer Science, vol. 14650. Springer, Singapore. https://doi.org/10.1007/978-981-97-2266-2_19

  • DOI: https://doi.org/10.1007/978-981-97-2266-2_19

  • Published:

  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-97-2265-5

  • Online ISBN: 978-981-97-2266-2

  • eBook Packages: Computer Science, Computer Science (R0)
