
Selecting Features from Time Series Using Attention-Based Recurrent Neural Networks

  • Conference paper
  • In: Structural, Syntactic, and Statistical Pattern Recognition (S+SSPR 2021)

Abstract

Capturing, storing, and analyzing high-dimensional time series data are important challenges that must be tackled effectively nowadays, as extremely large amounts of such data are generated every second. In this paper, we introduce recurrent neural networks equipped with attention modules that quantify the importance of individual features and can therefore be employed to select only an informative subset of all available features. Additionally, our models are trained in an end-to-end fashion, hence they are directly applicable to inference over unseen data. Our experiments, which included datasets from various domains, showed that the proposed technique is data-driven, easily applicable to new use cases, and competitive with other dimensionality reduction algorithms.
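The abstract outlines the core idea: an attention module produces per-feature importance weights that re-weight the multivariate input before it enters a recurrent network, and the aggregated weights serve as feature-selection scores while the whole model is trained end to end. The snippet below is a minimal, hypothetical PyTorch sketch of this kind of architecture; the class name, layer sizes, and the exact attention formulation are illustrative assumptions, not the authors' implementation.

import torch
import torch.nn as nn

class FeatureAttentionRNN(nn.Module):
    """GRU classifier preceded by an attention module that scores the input features.
    This is an illustrative sketch, not the paper's actual architecture."""

    def __init__(self, n_features: int, hidden_size: int, n_classes: int):
        super().__init__()
        # Attention module: maps each time step's feature vector to per-feature
        # scores; the softmax turns them into importance weights summing to one.
        self.attention = nn.Sequential(
            nn.Linear(n_features, n_features),
            nn.Softmax(dim=-1),
        )
        self.rnn = nn.GRU(n_features, hidden_size, batch_first=True)
        self.classifier = nn.Linear(hidden_size, n_classes)

    def forward(self, x: torch.Tensor):
        # x: (batch, time, n_features)
        weights = self.attention(x)        # per-feature importance weights
        weighted = x * weights             # re-weight the features before the RNN
        _, h_last = self.rnn(weighted)     # final hidden state summarizes the series
        logits = self.classifier(h_last[-1])
        # Averaging the weights over batch and time yields a global importance
        # ranking that can be thresholded to keep only the top-scoring features.
        return logits, weights.mean(dim=(0, 1))

# Example usage: train with an ordinary classification loss (end to end), then
# rank the aggregated attention weights to select the most informative features.
model = FeatureAttentionRNN(n_features=16, hidden_size=32, n_classes=3)
x = torch.randn(8, 100, 16)               # 8 series, 100 time steps, 16 features
logits, importance = model(x)
selected = importance.topk(4).indices     # indices of the 4 highest-scoring features

Under these assumptions, no separate feature-selection stage is needed: the same trained network both ranks the input features and performs inference on unseen data.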

This work was co-financed by the Silesian University of Technology grant for maintaining and developing research potential, and by the Polish National Centre for Research and Development under Grant POIR.01.01.01-00-0853/19. JN was supported by the Silesian University of Technology funds (02/080/BKM20/0012).



Author information

Correspondence to Jakub Nalepa.


Copyright information

© 2021 Springer Nature Switzerland AG

About this paper


Cite this paper

Myller, M., Kawulok, M., Nalepa, J. (2021). Selecting Features from Time Series Using Attention-Based Recurrent Neural Networks. In: Torsello, A., Rossi, L., Pelillo, M., Biggio, B., Robles-Kelly, A. (eds) Structural, Syntactic, and Statistical Pattern Recognition. S+SSPR 2021. Lecture Notes in Computer Science, vol. 12644. Springer, Cham. https://doi.org/10.1007/978-3-030-73973-7_9

  • DOI: https://doi.org/10.1007/978-3-030-73973-7_9

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-73972-0

  • Online ISBN: 978-3-030-73973-7

  • eBook Packages: Computer Science, Computer Science (R0)
