Abstract
Principal component analysis (PCA), a well-known technique in machine learning and statistics, is typically applied to time-independent data, as it is based on point-wise correlations. Dynamic PCA (DPCA) addresses this limitation by augmenting the data set with lagged copies of itself. In this paper, we show that both PCA and DPCA are special cases of \(\kappa \)-circulant maximum variance bases. We formulate the constrained linear optimization problem of finding such \(\kappa \)-circulant bases and present a closed-form solution that allows further interpretation and a significant speed-up for DPCA. Furthermore, we point out the relation of the proposed bases to the discrete Fourier transform, finite impulse response filters, and spectral density estimation.
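For intuition, the lag augmentation underlying DPCA can be sketched as follows (a minimal NumPy illustration, not the paper's closed-form solution; the toy sinusoid and the window length `L` are arbitrary choices for the example):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy time series: a noisy sinusoid. Plain PCA on the raw samples
# is based on point-wise correlations and ignores temporal structure.
t = np.arange(400)
x = np.sin(2 * np.pi * t / 20) + 0.1 * rng.standard_normal(t.size)

# DPCA-style augmentation: stack lagged copies of the series so that
# each row of X is a window of L consecutive samples.
L = 10
X = np.stack([x[i:i + L] for i in range(x.size - L + 1)])
X = X - X.mean(axis=0)  # center, as in ordinary PCA

# PCA via SVD of the centered data matrix: the right singular vectors
# are the principal directions, which here act like FIR filter bases.
_, s, Vt = np.linalg.svd(X, full_matrices=False)
explained = s**2 / np.sum(s**2)

# The leading pair (sine and cosine phase) captures most variance.
print(explained[:2])
```

The closed-form solution derived in the paper avoids recomputing such a decomposition on the explicitly augmented matrix, which is where the reported speed-up for DPCA comes from.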
This work is supported by a grant from the German Ministry of Education and Research (BMBF; KMU-innovativ: Medizintechnik, 13GW0173E). We would like to express our gratitude to the reviewers for their efforts towards improving this work.
Notes
- 1.
Matched filters are learned in a supervised setting, while here we restrict ourselves to the unsupervised case. Hence, the filter coefficients are “matched” according to a variance criterion (similar to PCA).
- 3.
The dot product \(\mathbf {u}^T\mathbf {x}\) serves as a measure of similarity.
- 4.
The discrete circular convolution of two sequences \(\mathbf {x},\mathbf {y}\in \mathbb {R}^{D}\) is written as \(\mathbf {x}\circledast \mathbf {y}\), while the linear convolution is written as \(\mathbf {x}*\mathbf {y}\).
- 5.
Due to the constraint \(\left\Vert \mathbf {g}\right\Vert _2^2=1\), this is not trivial.
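The distinction between circular and linear convolution drawn in the notes can be illustrated numerically (a sketch assuming NumPy; the circular convolution is computed via the DFT convolution theorem, and equals the linear convolution folded back modulo \(D\)):

```python
import numpy as np

# Circular convolution of two length-D sequences via the DFT
# convolution theorem: DFT(x ⊛ y) = DFT(x) · DFT(y).
def circular_convolve(x, y):
    return np.real(np.fft.ifft(np.fft.fft(x) * np.fft.fft(y)))

x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([1.0, 0.0, 0.0, 1.0])

c = circular_convolve(x, y)  # length D = 4
l = np.convolve(x, y)        # linear convolution, length 2D - 1 = 7

# Circular convolution is the linear one wrapped around modulo D:
# the tail l[D:] is added back onto the head l[:D-1].
wrapped = l[:4].copy()
wrapped[:3] += l[4:]

print(c)        # [3. 5. 7. 5.]
print(wrapped)  # [3. 5. 7. 5.]
```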
Copyright information
© 2021 Springer Nature Switzerland AG
Cite this paper
Bonenberger, C., Ertel, W., Schneider, M. (2021). \(\kappa \)-Circulant Maximum Variance Bases. In: Edelkamp, S., Möller, R., Rueckert, E. (eds) KI 2021: Advances in Artificial Intelligence. KI 2021. Lecture Notes in Computer Science(), vol 12873. Springer, Cham. https://doi.org/10.1007/978-3-030-87626-5_2
DOI: https://doi.org/10.1007/978-3-030-87626-5_2
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-87625-8
Online ISBN: 978-3-030-87626-5