Abstract
Non-stationarities are a ubiquitous phenomenon in time-series data, yet they pose a challenge to standard methodology: classification models and ICA components, for example, cannot be estimated reliably under distribution changes because the classic assumption of a stationary data-generating process is violated. Conversely, understanding the nature of observed non-stationary behaviour often lies at the heart of a scientific question. To this end, we propose a novel unsupervised technique: Stationary Subspace Analysis (SSA). SSA decomposes a multivariate time series into a stationary and a non-stationary subspace. This factorization is a general tool for furthering the understanding of non-stationary data. Moreover, we can robustify other methods by restricting them to the stationary subspace. We demonstrate the performance of SSA in simulations and present a real-world application from Brain-Computer Interfacing.
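The decomposition described above can be pictured with a small numerical sketch. The following Python example is illustrative only and is not the authors' algorithm: it assumes the mixing model x(t) = A [s_s(t); s_n(t)] with one stationary and one non-stationary source, and recovers the stationary projection by whitening the observations and searching over rotation angles for the direction whose epoch-wise variance is most stable. All variable names, the epoch layout, and the brute-force search are assumptions made for this toy example.

```python
# Minimal sketch of the SSA mixing model and a toy estimator (illustrative only,
# not the published algorithm). Assumption: x(t) = A [s_s(t); s_n(t)] with one
# stationary and one non-stationary source.
import numpy as np

rng = np.random.default_rng(0)
n_epochs, epoch_len = 20, 500

# Stationary source: unit variance throughout.
# Non-stationary source: variance changes from epoch to epoch.
s_stat = rng.standard_normal(n_epochs * epoch_len)
s_nonstat = np.concatenate(
    [rng.standard_normal(epoch_len) * rng.uniform(0.2, 3.0) for _ in range(n_epochs)]
)
A = rng.standard_normal((2, 2))            # unknown mixing matrix
X = A @ np.vstack([s_stat, s_nonstat])     # observed time series, shape (2, T)

# Whiten the observations so that candidate projections can be parameterized
# as rotations of the whitened data.
d, E = np.linalg.eigh(np.cov(X))
W = E @ np.diag(d ** -0.5) @ E.T
Z = W @ X

def nonstationarity(angle):
    """Variance of the epoch-wise variances of the projected signal."""
    p = np.array([np.cos(angle), np.sin(angle)])
    epoch_vars = (p @ Z).reshape(n_epochs, epoch_len).var(axis=1)
    return epoch_vars.var()

# Brute-force search over directions: the most stationary one approximates
# the projection onto the stationary subspace.
angles = np.linspace(0, np.pi, 360)
best = angles[np.argmin([nonstationarity(a) for a in angles])]
p_stat = np.array([np.cos(best), np.sin(best)]) @ W
print("epoch variances of the recovered stationary signal:",
      np.round((p_stat @ X).reshape(n_epochs, epoch_len).var(axis=1)[:5], 2))
```

In this two-dimensional toy setting the printed epoch variances are nearly constant, whereas any direction with a contribution from the non-stationary source would show variances fluctuating across epochs.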
Copyright information
© 2009 Springer-Verlag Berlin Heidelberg
Cite this paper
von Bünau, P., Meinecke, F.C., Müller, K.-R. (2009). Stationary Subspace Analysis. In: Adali, T., Jutten, C., Romano, J.M.T., Barros, A.K. (eds.) Independent Component Analysis and Signal Separation. ICA 2009. Lecture Notes in Computer Science, vol. 5441. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-00599-2_1
DOI: https://doi.org/10.1007/978-3-642-00599-2_1
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-00598-5
Online ISBN: 978-3-642-00599-2