Abstract
Many traditional machine learning and pattern recognition algorithms, such as linear discriminant analysis (LDA) or principal component analysis (PCA), optimize the data representation with respect to an information-theoretic criterion. For time series analysis these traditional techniques are typically insufficient. In this work we propose an extension of linear discriminant analysis that allows learning a data representation based on an algebraic structure tailored to time series. Specifically, we propose a generalization of LDA towards shift-invariance based on cyclic structures. We extend this framework towards more general structures that allow incorporating prior knowledge about the data at hand into the representation learning step. The effectiveness of the proposed approach is demonstrated on synthetic and real-world data sets. Finally, we show how our approach relates to common machine learning and signal processing techniques.
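The key algebraic property behind the cyclic structures mentioned in the abstract can be illustrated with a short sketch (this is an assumption-laden illustration, not the authors' implementation): a circulant matrix commutes with the cyclic shift operator, so a circulant analysis operator produces a shift-equivariant representation. The sizes, the generating vector, and the test signal below are arbitrary choices for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 8
c = rng.standard_normal(n)  # generating vector (e.g. a learned atom); arbitrary here
x = rng.standard_normal(n)  # a test signal; arbitrary here

# Circulant matrix C with first column c: C[i, j] = c[(i - j) mod n].
C = np.stack([np.roll(c, j) for j in range(n)], axis=1)

# Cyclic shift operator P: (P x)[i] = x[(i - 1) mod n], i.e. P @ x = np.roll(x, 1).
P = np.roll(np.eye(n), 1, axis=0)

# Every circulant matrix is a polynomial in P and hence commutes with it:
# analyzing a cyclically shifted signal equals cyclically shifting the analysis.
assert np.allclose(C @ P, P @ C)
assert np.allclose(C @ np.roll(x, 1), np.roll(C @ x, 1))
```

This shift-equivariance is what makes circulant (and, more generally, \(\kappa\)-circulant) structures a natural parametrization when a learned representation should not depend on where in the window a pattern occurs.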
Notes
- 1. The total variation is the trace of the covariance matrix (cf. [18]).
- 2. \([\textbf{Z}]_{i,j}\) is constant for constant \(i-j\), and \(\textbf{x}^\textsf{T}{\textbf {P}}^{j-i}\textbf{x} = \textbf{x}^\textsf{T}{\textbf {P}}^{i-j}\textbf{x}\) (cf. [1]).
- 3. The distribution of a stationary signal is invariant with respect to time: \(\mathbb {E}\left\{ x_t\right\} \) is constant for all \(t\), and the covariance \(C(x_t, x_s)\) depends solely on the index/time difference \(|t-s|\) [16].
- 4. \(\kappa \)-circulant structures can also be used to model Wavelet-like structures [24].
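The claims in notes 1 and 2 can be checked numerically. The following sketch (using arbitrary random data, not data from the paper) verifies that the total variation equals the trace of the covariance matrix, and that the quadratic form \(\textbf{x}^\textsf{T}{\textbf {P}}^{k}\textbf{x}\) with the cyclic shift \({\textbf {P}}\) is symmetric in the sign of \(k\).

```python
import numpy as np

rng = np.random.default_rng(1)

# Note 1: total variation (sum of per-feature variances) = trace of covariance.
X = rng.standard_normal((100, 5))  # 100 samples, 5 features; arbitrary test data
cov = np.cov(X, rowvar=False)      # unbiased sample covariance (ddof=1)
total_variation = np.var(X, axis=0, ddof=1).sum()
assert np.isclose(total_variation, np.trace(cov))

# Note 2: with the cyclic shift P (P @ x = np.roll(x, 1)), we have
# x^T P^k x = x^T P^{-k} x, since both equal the circular
# autocorrelation of x at lag |k|.
x = rng.standard_normal(8)
for k in range(8):
    assert np.isclose(x @ np.roll(x, k), x @ np.roll(x, -k))
```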
References
Bonenberger, C., Ertel, W., Schneider, M.: \(\kappa \)-circulant maximum variance bases. In: Edelkamp, S., Möller, R., Rueckert, E. (eds.) KI 2021. LNCS (LNAI), vol. 12873, pp. 17–29. Springer, Cham (2021). https://doi.org/10.1007/978-3-030-87626-5_2
Bonenberger, C., Ertel, W., Schwenker, F., Schneider, M.: Singular spectrum analysis and circulant maximum variance frames. In: Advances in Data Science and Adaptive Analysis (2022)
Bouvrie, J.: Notes on convolutional neural networks (2006)
Casale, P., Pujol, O., Radeva, P.: Personalization and user verification in wearable systems using biometric walking patterns. Pers. Ubiquit. Comput. 16(5), 563–580 (2012)
Christensen, O.: An introduction to frames and Riesz bases. ANHA, Springer, Cham (2016). https://doi.org/10.1007/978-3-319-25613-9
Dau, H.A., et al.: The UCR time series archive. IEEE/CAA J. Autom. Sinica 6(6), 1293–1305 (2019)
Dua, D., Graff, C.: UCI machine learning repository (2017). http://archive.ics.uci.edu/ml
Garcia-Cardona, C., Wohlberg, B.: Convolutional dictionary learning: a comparative review and new algorithms. IEEE Trans. Comput. Imaging 4(3), 366–381 (2018)
Golyandina, N., Zhigljavsky, A.: Singular spectrum analysis for time series. Springer Science & Business Media (2013)
Hastie, T., Tibshirani, R., Friedman, J.: The Elements of Statistical Learning. SSS, Springer, New York (2009). https://doi.org/10.1007/978-0-387-84858-7
Hoffmann, R., Wolff, M.: Intelligente Signalverarbeitung 1. Springer, Heidelberg (2014). https://doi.org/10.1007/978-3-662-45323-0
Jolliffe, I.T.: Principal component analysis, vol. 2. Springer (2002). https://doi.org/10.1007/b98835
Ku, W., Storer, R.H., Georgakis, C.: Disturbance detection and isolation by dynamic principal component analysis. Chemom. Intell. Lab. Syst. 30(1), 179–196 (1995)
Lv, X.G., Huang, T.Z.: A note on inversion of Toeplitz matrices. Appl. Math. Lett. 20(12), 1189–1193 (2007)
Morgenshtern, V.I., Bölcskei, H.: A short course on frame theory. arXiv preprint arXiv:1104.4300 (2011)
Pollock, D.S.G., Green, R.C., Nguyen, T.: Handbook of time series analysis, signal processing, and dynamics. Elsevier (1999)
Rusu, C., Dumitrescu, B., Tsaftaris, S.A.: Explicit shift-invariant dictionary learning. IEEE Signal Process. Lett. 21(1), 6–9 (2013)
Seber, G.A.: Multivariate observations. John Wiley & Sons (2009)
Serpedin, E., Chen, T., Rajan, D.: Mathematical foundations for signal processing, communications, and networking. CRC Press (2011)
Shumway, R.: Discriminant analysis for time series. Handbook Statist. 2, 1–46 (1982)
Sulam, J., Papyan, V., Romano, Y., Elad, M.: Multilayer convolutional sparse modeling: pursuit and dictionary learning. IEEE Trans. Signal Process. 66(15), 4090–4104 (2018)
Theodoridis, S., Koutroumbas, K.: Pattern recognition. Elsevier (2006)
Tosic, I., Frossard, P.: Dictionary learning: what is the right representation for my signal? IEEE Sig. Process. Mag. 28, 27–38 (2011)
Vetterli, M., Kovačević, J., Goyal, V.K.: Foundations of signal processing. Cambridge University Press (2014)
Acknowledgements
We are grateful to the reviewers for their careful reading of the manuscript and for their corrections and helpful comments. Additionally, we would like to thank the maintainers of the UCI Machine Learning Repository and the UCR Time Series Archive for providing benchmark data sets.
Copyright information
© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this paper
Bonenberger, C., Ertel, W., Schneider, M., Schwenker, F. (2023). Structured Nonlinear Discriminant Analysis. In: Amini, MR., Canu, S., Fischer, A., Guns, T., Kralj Novak, P., Tsoumakas, G. (eds) Machine Learning and Knowledge Discovery in Databases. ECML PKDD 2022. Lecture Notes in Computer Science(), vol 13713. Springer, Cham. https://doi.org/10.1007/978-3-031-26387-3_3
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-26386-6
Online ISBN: 978-3-031-26387-3