
AutoTransformer: Automatic Transformer Architecture Design for Time Series Classification

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 13280)

Abstract

Time series classification (TSC) aims to assign labels to time series. Deep learning methods such as InceptionTime and the Transformer achieve promising performance on TSC. Although deep learning methods do not require manually crafted features, they do require careful manual design of the network architecture, which relies heavily on researchers' prior knowledge and experience. Because that knowledge is limited, a hand-designed architecture may not be optimal for the dataset of interest. To automate and optimize architecture design, we propose AutoTransformer, a data-driven method that automatically designs a network architecture suited to the target TSC dataset. Inspired by the overall architecture of the Transformer, we first propose a novel search space tailored for TSC; it contains a variety of substructures capable of extracting both global and local features from time series. Then, using neural architecture search (NAS), a suitable network architecture for the target dataset is found within this search space. Experimental results show that AutoTransformer finds appropriate architectures for different TSC datasets and outperforms state-of-the-art methods on the UCR archive. Ablation studies verify the effectiveness of the proposed search space.
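
For readers who want a concrete picture of what searching such a space can look like, below is a minimal, hypothetical PyTorch sketch of a DARTS-style mixed block with one global candidate (self-attention) and one local candidate (1-D convolution). All class names and the toy dimensions are illustrative assumptions; the paper's actual search space, candidate substructures, and NAS algorithm are more elaborate than this.

```python
# Minimal sketch of a NAS-style mixed block for time series, loosely following
# the abstract's description (candidate substructures for global vs. local
# features, selected by a learned weighting). Names here are hypothetical
# illustrations, not the paper's implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class GlobalAttention(nn.Module):
    """Self-attention: a candidate op that captures global dependencies."""
    def __init__(self, dim):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, num_heads=4, batch_first=True)

    def forward(self, x):                # x: (batch, length, dim)
        out, _ = self.attn(x, x, x)
        return out

class LocalConv(nn.Module):
    """1-D convolution: a candidate op that captures local patterns."""
    def __init__(self, dim, kernel_size=5):
        super().__init__()
        self.conv = nn.Conv1d(dim, dim, kernel_size, padding=kernel_size // 2)

    def forward(self, x):                # x: (batch, length, dim)
        return self.conv(x.transpose(1, 2)).transpose(1, 2)

class MixedBlock(nn.Module):
    """Differentiable mixture of candidate ops: the output is a
    softmax-weighted sum, and the weights (architecture parameters)
    are trained alongside the network weights."""
    def __init__(self, dim):
        super().__init__()
        self.ops = nn.ModuleList([GlobalAttention(dim), LocalConv(dim)])
        self.alpha = nn.Parameter(torch.zeros(len(self.ops)))  # arch params

    def forward(self, x):
        w = F.softmax(self.alpha, dim=0)
        return sum(wi * op(x) for wi, op in zip(w, self.ops))

# Toy usage: a batch of 8 series of length 128 with a 32-dim embedding.
x = torch.randn(8, 128, 32)
block = MixedBlock(dim=32)
print(block(x).shape)                    # torch.Size([8, 128, 32])
```

After search, the standard discretization step in differentiable NAS is to keep, in each block, the candidate op with the largest architecture weight, yielding a single concrete architecture for the target dataset.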


Notes

  1. http://www.timeseriesclassification.com/.


Author information

Correspondence to Jun Zhou.


Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Ren, Y., Li, L., Yang, X., Zhou, J. (2022). AutoTransformer: Automatic Transformer Architecture Design for Time Series Classification. In: Gama, J., Li, T., Yu, Y., Chen, E., Zheng, Y., Teng, F. (eds) Advances in Knowledge Discovery and Data Mining. PAKDD 2022. Lecture Notes in Computer Science, vol. 13280. Springer, Cham. https://doi.org/10.1007/978-3-031-05933-9_12


  • DOI: https://doi.org/10.1007/978-3-031-05933-9_12

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-05932-2

  • Online ISBN: 978-3-031-05933-9

  • eBook Packages: Computer Science, Computer Science (R0)
