Structured Nonlinear Discriminant Analysis

  • Conference paper
  • In: Machine Learning and Knowledge Discovery in Databases (ECML PKDD 2022)

Abstract

Many traditional machine learning and pattern recognition algorithms, such as linear discriminant analysis (LDA) or principal component analysis (PCA), optimize the data representation with respect to an information-theoretic criterion. For time series analysis these traditional techniques are typically insufficient. In this work we propose an extension of linear discriminant analysis that learns a data representation based on an algebraic structure tailored to time series. Specifically, we propose a generalization of LDA towards shift-invariance that is based on cyclic structures. We then expand this framework to more general structures, which allow prior knowledge about the data at hand to be incorporated into the representation learning step. The effectiveness of the proposed approach is demonstrated on synthetic and real-world data sets. Finally, we show how our approach relates to common machine learning and signal processing techniques.
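To make the idea of a cyclically structured discriminant concrete, the following minimal sketch (Python with NumPy/SciPy) contrasts ordinary LDA with a shift-invariant statistic built from a circulant matrix. It illustrates the underlying principle only, not the authors' algorithm: the toy data and all variable names are invented for this example, and the generating pattern is assumed known here, whereas the paper learns the structured representation from data.

```python
import numpy as np
from scipy.linalg import eigh, circulant

rng = np.random.default_rng(0)
n, N = 32, 300                                   # signal length, samples per class
t = np.arange(n)

# Zero-mean "wavelet-like" pattern that marks class 1, but at a random phase.
g = np.exp(-0.5 * ((t - n / 2) / 3.0) ** 2) * np.cos(2 * np.pi * 4 * t / n)
g -= g.mean()

X0 = rng.normal(size=(N, n))                     # class 0: noise only
shifts = rng.integers(n, size=N)
X1 = rng.normal(size=(N, n)) + 2 * np.stack([np.roll(g, s) for s in shifts])
X, y = np.vstack([X0, X1]), np.r_[np.zeros(N), np.ones(N)]

# (a) Ordinary LDA: leading eigenvector of the generalized eigenproblem
#     S_B v = lambda S_W v on the raw signals.  Because the pattern phase is
#     random, the class means are nearly identical and separation is poor.
mu = X.mean(axis=0)
Sw = sum(np.cov(X[y == c].T, bias=True) for c in (0, 1))
Sb = sum(np.outer(X[y == c].mean(0) - mu, X[y == c].mean(0) - mu) for c in (0, 1))
v = eigh(Sb, Sw + 1e-6 * np.eye(n))[1][:, -1]
p = X @ v
print("LDA projection, per-class (mean, std):",
      [(round(float(p[y == c].mean()), 2), round(float(p[y == c].std()), 2))
       for c in (0, 1)])

# (b) Cyclic structure: correlate each signal with *all* cyclic shifts of g
#     (a circulant matrix).  A circular shift of the input only permutes the
#     responses, so their energy is a shift-invariant feature.
G = circulant(g)                                 # columns are cyclic shifts of g
z = ((X @ G) ** 2).sum(axis=1)
print("structured (shift-invariant) energy, per-class mean:",
      [round(float(z[y == c].mean()), 1) for c in (0, 1)])
```

In this toy setting the ordinary LDA projection barely separates the two classes, because the discriminative pattern occurs at a random phase in every sample, while the energy of the responses to all cyclic shifts of the pattern is shift-invariant and clearly differs between the classes.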

Notes

  1. The total variation is the trace of the covariance matrix (cf. [18]); see also the numerical sketch following these notes.

  2. \([\textbf{Z}]_{i,j}\) is constant for constant \(i-j\) and \(\textbf{x}^\textsf{T}{\textbf {P}}^{j-i}\textbf{x}= \textbf{x}^\textsf{T}{\textbf {P}}^{i-j}\textbf{x}\) (cf. [1]).

  3. The distribution of a stationary signal is invariant with respect to time (\(\mathbb {E}\left\{ x_t\right\} \) is constant for all t and the covariance \(C(x_t, x_s)\) depends only on the index/time difference \(|t-s|\)) [16].

  4. \(\kappa \)-circulant structures can also be used to model wavelet-like structures [24].

  5. Hence we can fully simplify analogously to the step from Eq. (17) to Eq. (18).
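The following numerical sketch (NumPy only) illustrates Notes 1, 2 and 4. It is an illustration under stated assumptions rather than the paper's construction: in particular, the kappa_circulant helper is an invented, simplified reading of a \(\kappa \)-circulant structure (each row advances the generating vector by \(\kappa \) positions), not the definition used in [1] or [24].

```python
import numpy as np

rng = np.random.default_rng(1)
n = 8
X = rng.normal(size=(500, n))                 # 500 samples of dimension n

# Note 1: the total variation (sum of per-coordinate variances) equals the
# trace of the covariance matrix.
C = np.cov(X.T, bias=True)
assert np.isclose(np.trace(C), X.var(axis=0).sum())

# Note 2: for the cyclic shift matrix P (orthogonal, P^T = P^{-1}) the scalar
# x^T P^k x equals its own transpose x^T P^{-k} x, and P^{-k} = P^{n-k}.
P = np.roll(np.eye(n), 1, axis=0)             # cyclic shift by one position
x = rng.normal(size=n)
k = 3
lhs = x @ np.linalg.matrix_power(P, k) @ x
rhs = x @ np.linalg.matrix_power(P, n - k) @ x
assert np.isclose(lhs, rhs)

# Note 4 (assumed construction, for illustration only): a kappa-circulant
# matrix advances the generating vector by kappa positions per row, which
# resembles the strided structure of wavelet filter banks.
def kappa_circulant(w, kappa):
    m = len(w)
    return np.stack([np.roll(w, kappa * i) for i in range(m // kappa)])

print(kappa_circulant(np.arange(n), kappa=2))
```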

References

  1. Bonenberger, C., Ertel, W., Schneider, M.: \(\kappa \)-circulant maximum variance bases. In: Edelkamp, S., Möller, R., Rueckert, E. (eds.) KI 2021. LNCS (LNAI), vol. 12873, pp. 17–29. Springer, Cham (2021). https://doi.org/10.1007/978-3-030-87626-5_2

  2. Bonenberger, C., Ertel, W., Schwenker, F., Schneider, M.: Singular spectrum analysis and circulant maximum variance frames. In: Advances in Data Science and Adaptive Analysis (2022)

  3. Bouvrie, J.: Notes on convolutional neural networks (2006)

  4. Casale, P., Pujol, O., Radeva, P.: Personalization and user verification in wearable systems using biometric walking patterns. Pers. Ubiquit. Comput. 16(5), 563–580 (2012)

  5. Christensen, O.: An introduction to frames and Riesz bases. ANHA, Springer, Cham (2016). https://doi.org/10.1007/978-3-319-25613-9

  6. Dau, H.A., et al.: The UCR time series archive. IEEE/CAA J. Autom. Sinica 6(6), 1293–1305 (2019)

  7. Dua, D., Graff, C.: UCI machine learning repository (2017). http://archive.ics.uci.edu/ml

  8. Garcia-Cardona, C., Wohlberg, B.: Convolutional dictionary learning: a comparative review and new algorithms. IEEE Trans. Comput. Imaging 4(3), 366–381 (2018)

  9. Golyandina, N., Zhigljavsky, A.: Singular spectrum analysis for time series. Springer Science & Business Media (2013)

  10. Hastie, T., Tibshirani, R., Friedman, J.: The Elements of Statistical Learning. SSS, Springer, New York (2009). https://doi.org/10.1007/978-0-387-84858-7

  11. Hoffmann, R., Wolff, M.: Intelligente Signalverarbeitung 1. Springer, Heidelberg (2014). https://doi.org/10.1007/978-3-662-45323-0

  12. Jolliffe, I.T.: Principal component analysis, vol. 2. Springer (2002). https://doi.org/10.1007/b98835

  13. Ku, W., Storer, R.H., Georgakis, C.: Disturbance detection and isolation by dynamic principal component analysis. Chemom. Intell. Lab. Syst. 30(1), 179–196 (1995)

  14. Lv, X.G., Huang, T.Z.: A note on inversion of Toeplitz matrices. Appl. Math. Lett. 20(12), 1189–1193 (2007)

  15. Morgenshtern, V.I., Bölcskei, H.: A short course on frame theory. arXiv preprint arXiv:1104.4300 (2011)

  16. Pollock, D.S.G., Green, R.C., Nguyen, T.: Handbook of time series analysis, signal processing, and dynamics. Elsevier (1999)

  17. Rusu, C., Dumitrescu, B., Tsaftaris, S.A.: Explicit shift-invariant dictionary learning. IEEE Signal Process. Lett. 21(1), 6–9 (2013)

  18. Seber, G.A.: Multivariate observations. John Wiley & Sons (2009)

  19. Serpedin, E., Chen, T., Rajan, D.: Mathematical foundations for signal processing, communications, and networking. CRC Press (2011)

  20. Shumway, R.: Discriminant analysis for time series. Handbook Statist. 2, 1–46 (1982)

  21. Sulam, J., Papyan, V., Romano, Y., Elad, M.: Multilayer convolutional sparse modeling: pursuit and dictionary learning. IEEE Trans. Signal Process. 66(15), 4090–4104 (2018)

  22. Theodoridis, S., Koutroumbas, K.: Pattern recognition. Elsevier (2006)

  23. Tosic, I., Frossard, P.: Dictionary learning: what is the right representation for my signal? IEEE Signal Process. Mag. 28, 27–38 (2011)

  24. Vetterli, M., Kovačević, J., Goyal, V.K.: Foundations of signal processing. Cambridge University Press (2014)

Acknowledgements

We are grateful to the reviewers for their careful review of the manuscript and for their corrections and helpful comments. Additionally, we would like to thank the maintainers of the UCI Machine Learning Repository and the UCR Time Series Archive for providing benchmark data sets.

Author information

Corresponding author

Correspondence to Christopher Bonenberger.

Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Bonenberger, C., Ertel, W., Schneider, M., Schwenker, F. (2023). Structured Nonlinear Discriminant Analysis. In: Amini, M.-R., Canu, S., Fischer, A., Guns, T., Kralj Novak, P., Tsoumakas, G. (eds.) Machine Learning and Knowledge Discovery in Databases. ECML PKDD 2022. Lecture Notes in Computer Science, vol. 13713. Springer, Cham. https://doi.org/10.1007/978-3-031-26387-3_3

  • DOI: https://doi.org/10.1007/978-3-031-26387-3_3

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-26386-6

  • Online ISBN: 978-3-031-26387-3

  • eBook Packages: Computer Science, Computer Science (R0)
