Abstract:
We analyze the Dynamic Mode Decomposition (DMD) algorithm as applied to multivariate time-series data. Our analysis reveals the critical role played by the lag-one cross-correlation, or cross-covariance, terms. We show that when the rows of the multivariate time-series matrix can be modeled as linear combinations of lag-one uncorrelated latent time series with non-zero lag-one autocorrelation, then, in the large-sample limit, DMD perfectly recovers the mixing matrix up to a column-wise scaling, and thus the latent time series. We validate our findings with numerical simulations and demonstrate how DMD can be used to unmix mixed audio signals.
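As a rough illustration of the unmixing setup the abstract describes, the following is a minimal numpy sketch, not the paper's implementation. The mixing matrix M, the AR(1) latent sources S, and all variable names are illustrative assumptions; the DMD step itself is the standard exact-DMD fit on lag-one snapshot pairs.

```python
# Sketch of DMD-based unmixing under the abstract's assumptions:
# observed rows are linear combinations (via a hypothetical mixing
# matrix M) of latent series that are mutually lag-one uncorrelated
# and have non-zero lag-one autocorrelation.
import numpy as np

rng = np.random.default_rng(0)

# Latent sources: two independent AR(1) series, so each has non-zero
# lag-one autocorrelation and they are lag-one uncorrelated with each other.
n = 20000
S = np.zeros((2, n))
for t in range(1, n):
    S[:, t] = np.array([0.9, -0.5]) * S[:, t - 1] + rng.standard_normal(2)

M = np.array([[1.0, 0.6],
              [0.4, 1.0]])          # hypothetical mixing matrix
X = M @ S                            # observed mixtures, one row per channel

# Exact DMD on lag-one snapshot pairs: fit A so that X_{t+1} ~ A X_t.
X1, X2 = X[:, :-1], X[:, 1:]
A = X2 @ np.linalg.pinv(X1)

# In the large-sample limit A ~ M diag(phi) M^{-1}, so the eigenvectors
# (DMD modes) recover the columns of M up to scaling and ordering.
eigvals, modes = np.linalg.eig(A)

# Unmix: project the observations onto the inverse of the mode matrix;
# rows of S_hat match S up to scaling and permutation.
S_hat = np.linalg.inv(modes) @ X
```

Here the eigenvalues of A approximate the lag-one autocorrelations (0.9 and -0.5) of the latent series, which is why distinct, non-zero autocorrelations are needed for the modes to separate the sources.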
Date of Conference: 26-29 November 2018
Date Added to IEEE Xplore: 21 February 2019