
Bidirectional LSTM-RNN-based hybrid deep learning frameworks for univariate time series classification

Published in The Journal of Supercomputing

Abstract

Time series classification (TSC) has been a significant research problem for both industry practitioners and academic researchers over recent decades. Owing to the rapid growth of temporal data across a wide range of disciplines, a large number of algorithms have been proposed. This paper proposes robust approaches based on state-of-the-art techniques: the bidirectional long short-term memory (BiLSTM) network, the fully convolutional network (FCN), and the attention mechanism. A BiLSTM captures both forward and backward dependencies, while the FCN has proven to be a strong feature extractor and a standard TSC baseline. We therefore combine BiLSTM and FCN in a hybrid deep learning architecture, BiLSTM-FCN. We further explore whether the attention mechanism improves this architecture and propose a second model, ABiLSTM-FCN. We validate performance on 85 datasets from the University of California, Riverside (UCR) univariate time series archive. The proposed models are evaluated in terms of classification test error and F1-score, and we compare them against various existing state-of-the-art techniques. The experimental results show that our proposed models consistently outperform existing state-of-the-art methods and baselines.
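To make the hybrid design concrete, the following is a minimal NumPy sketch of a BiLSTM-FCN-style forward pass: one LSTM run forward and one run backward over the series, their final hidden states concatenated with globally average-pooled convolutional features, then a softmax classifier. This is an illustrative simplification with untrained random weights (one conv block instead of the usual three, no batch normalization, no attention), not the authors' implementation; all layer sizes and parameter names are assumptions for the sketch.

```python
import numpy as np

rng = np.random.default_rng(42)

def lstm_last_state(x, W, U, b):
    """Final hidden state of an LSTM over x of shape (T, d).

    Weights are packed in gate order: input, forget, output, cell candidate.
    """
    H = U.shape[1]
    h = np.zeros(H)
    c = np.zeros(H)
    sig = lambda z: 1.0 / (1.0 + np.exp(-z))
    for x_t in x:
        z = W @ x_t + U @ h + b          # all four gates at once, shape (4H,)
        i, f, o = sig(z[:H]), sig(z[H:2 * H]), sig(z[2 * H:3 * H])
        g = np.tanh(z[3 * H:])
        c = f * c + i * g
        h = o * np.tanh(c)
    return h

def fcn_features(x, filters):
    """One 'same'-padded conv1d + ReLU + global average pooling over time."""
    C, k, d = filters.shape
    pad = k // 2
    xp = np.pad(x, ((pad, pad), (0, 0)))
    T = x.shape[0]
    out = np.empty((T, C))
    for t in range(T):
        window = xp[t:t + k]             # (k, d) slice of the padded series
        out[t] = np.maximum(0.0, np.einsum("ckd,kd->c", filters, window))
    return out.mean(axis=0)              # global average pooling -> (C,)

def bilstm_fcn_predict(x, p):
    """Concatenate forward/backward LSTM states with FCN features, softmax."""
    h_fwd = lstm_last_state(x, p["W"], p["U"], p["b"])
    h_bwd = lstm_last_state(x[::-1], p["Wb"], p["Ub"], p["bb"])
    feats = np.concatenate([h_fwd, h_bwd, fcn_features(x, p["filters"])])
    logits = p["Wd"] @ feats + p["bd"]
    e = np.exp(logits - logits.max())    # numerically stable softmax
    return e / e.sum()

# Tiny illustrative sizes: univariate series (d=1), 3 classes.
T, d, H, C, k, n_cls = 32, 1, 8, 16, 3, 3
params = {
    "W": 0.1 * rng.normal(size=(4 * H, d)),
    "U": 0.1 * rng.normal(size=(4 * H, H)),
    "b": np.zeros(4 * H),
    "Wb": 0.1 * rng.normal(size=(4 * H, d)),
    "Ub": 0.1 * rng.normal(size=(4 * H, H)),
    "bb": np.zeros(4 * H),
    "filters": 0.1 * rng.normal(size=(C, k, d)),
    "Wd": 0.1 * rng.normal(size=(n_cls, 2 * H + C)),
    "bd": np.zeros(n_cls),
}
series = rng.normal(size=(T, d))
probs = bilstm_fcn_predict(series, params)
print(probs.shape)                       # probability vector over 3 classes
```

The key structural point the sketch shows is that the recurrent branch and the convolutional branch see the same input and meet only at the concatenation before the classifier, which is what lets each branch specialize (temporal dependencies vs. local shape features).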





Acknowledgments

This work was partially supported by NSFC grants U1509216, U1866602, and 61602129, and by Microsoft Research Asia.

Author information

Corresponding author: Mehak Khan.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


Cite this article

Khan, M., Wang, H., Riaz, A. et al. Bidirectional LSTM-RNN-based hybrid deep learning frameworks for univariate time series classification. J Supercomput 77, 7021–7045 (2021). https://doi.org/10.1007/s11227-020-03560-z
