An Information Geometrical View of Stationary Subspace Analysis

  • Conference paper
Artificial Neural Networks and Machine Learning – ICANN 2011 (ICANN 2011)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 6792)

Abstract

Stationary Subspace Analysis (SSA) [3] is an unsupervised learning method that finds subspaces in which data distributions stay invariant over time. It has proven very useful for studying non-stationarities in various applications [5,10,4,9]. In this paper, we present the first SSA algorithm based on a full generative model of the data. This new derivation relates SSA to previous work on finding interesting subspaces in high-dimensional data, in a manner analogous to the three easy routes to independent component analysis [6], and it provides an information-geometric view.
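
The basic idea can be made concrete with a small numerical sketch. The code below is not the generative-model algorithm derived in the paper; it is a minimal illustration of the SSA objective under simple assumptions of our own: the time series is split into consecutive epochs, each epoch is summarised by a Gaussian (mean and covariance), and a projection is sought whose projected epoch distributions deviate as little as possible, in Kullback-Leibler divergence, from their pooled Gaussian. The function names, the epoch count, and the use of a generic Nelder-Mead optimiser are illustrative choices only.

import numpy as np
from scipy.optimize import minimize


def epoch_moments(X, n_epochs):
    """Mean and covariance of each consecutive epoch of X (shape: samples x dimensions)."""
    epochs = np.array_split(X, n_epochs)
    means = [e.mean(axis=0) for e in epochs]
    covs = [np.cov(e, rowvar=False) for e in epochs]
    return means, covs


def nonstationarity(P, means, covs):
    """Sum of KL divergences between projected epoch Gaussians and their pooled Gaussian."""
    p_means = [P @ m for m in means]
    p_covs = [P @ C @ P.T for C in covs]
    m0 = np.mean(p_means, axis=0)          # pooled mean
    C0 = np.mean(p_covs, axis=0)           # pooled (within-epoch) covariance
    C0_inv = np.linalg.inv(C0)
    d = P.shape[0]
    total = 0.0
    for m, C in zip(p_means, p_covs):
        diff = m0 - m
        total += 0.5 * (np.trace(C0_inv @ C) + diff @ C0_inv @ diff
                        - d + np.log(np.linalg.det(C0) / np.linalg.det(C)))
    return total


def find_stationary_projection(X, d, n_epochs=10, seed=0):
    """Derivative-free search for a d x D projection with minimal non-stationarity."""
    D = X.shape[1]
    means, covs = epoch_moments(X, n_epochs)

    def objective(w):
        # Orthonormalise the raw parameter vector so the projection has orthonormal rows.
        Q, _ = np.linalg.qr(w.reshape(D, d))
        return nonstationarity(Q.T, means, covs)

    rng = np.random.default_rng(seed)
    result = minimize(objective, rng.standard_normal(D * d), method="Nelder-Mead",
                      options={"maxiter": 5000})
    Q, _ = np.linalg.qr(result.x.reshape(D, d))
    return Q.T  # rows span the estimated stationary subspace


if __name__ == "__main__":
    # Toy example: two stationary directions and one direction whose mean and
    # variance drift over time, mixed by a random linear map.
    rng = np.random.default_rng(1)
    T = 3000
    stationary = rng.standard_normal((T, 2))
    drift = (1.0 + np.linspace(0.0, 2.0, T))[:, None] * rng.standard_normal((T, 1)) \
            + np.linspace(0.0, 3.0, T)[:, None]
    A = rng.standard_normal((3, 3))
    X = np.hstack([stationary, drift]) @ A.T
    print(find_stationary_projection(X, d=2))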

References

  1. Amari, S.: Differential-geometrical Methods in Statistics. Lecture Notes in Statistics. Springer, Berlin (1985)

  2. Blanchard, G., Sugiyama, M., Kawanabe, M., Spokoiny, V., Müller, K.-R.: In search of non-Gaussian components of a high-dimensional distribution. Journal of Machine Learning Research 7, 247–282 (2006)

  3. von Bünau, P., Meinecke, F.C., Király, F., Müller, K.-R.: Finding stationary subspaces in multivariate time series. Physical Review Letters 103, 214101 (2009)

  4. von Bünau, P., Meinecke, F.C., Müller, J.S., Lemm, S., Müller, K.-R.: Boosting high-dimensional change point detection with stationary subspace analysis. In: Workshop on Temporal Segmentation at NIPS (2009)

  5. von Bünau, P., Meinecke, F.C., Scholler, S., Müller, K.-R.: Finding stationary brain sources in EEG data. In: Proceedings of the 32nd Annual Conference of the IEEE EMBS, pp. 2810–2813 (2010)

  6. Cardoso, J.F.: The three easy routes to independent component analysis; contrasts and geometry. In: Proc. ICA 2001, pp. 1–6 (2001)

  7. Diederichs, E., Juditsky, A., Spokoiny, V., Schütte, C.: Sparse non-Gaussian component analysis. IEEE Trans. Inform. Theory 56, 3033–3047 (2010)

  8. Friedman, J.H., Tukey, J.W.: A projection pursuit algorithm for exploratory data analysis. IEEE Trans. Computers 23, 881–890 (1974)

  9. Hara, S., Kawahara, Y., Washio, T., von Bünau, P.: Stationary subspace analysis as a generalized eigenvalue problem. In: Wong, K.W., Mendis, B.S.U., Bouzerdoum, A. (eds.) ICONIP 2010, Part I. LNCS, vol. 6443, pp. 422–429. Springer, Heidelberg (2010)

  10. Meinecke, F.C., von Bünau, P., Kawanabe, M., Müller, K.-R.: Learning invariances with stationary subspace analysis. In: IEEE 12th International Conference on Computer Vision Workshops (ICCV Workshops), pp. 87–92 (2009)

  11. Pham, D.T., Cardoso, J.F.: Blind separation of instantaneous mixtures of non stationary sources. In: Proc. ICA 2000, Helsinki, Finland, pp. 187–192 (2000)

  12. Plumbley, M.D.: Geometrical methods for non-negative ICA: Manifolds, Lie groups and toral subalgebras. Neurocomputing 67, 161–197 (2005)

  13. Theis, F.: Colored subspace analysis: Dimension reduction based on a signal’s autocorrelation structure. IEEE Trans. Circuits & Systems I 57(7), 1463–1474 (2010)

Copyright information

© 2011 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Kawanabe, M., Samek, W., von Bünau, P., Meinecke, F.C. (2011). An Information Geometrical View of Stationary Subspace Analysis. In: Honkela, T., Duch, W., Girolami, M., Kaski, S. (eds) Artificial Neural Networks and Machine Learning – ICANN 2011. ICANN 2011. Lecture Notes in Computer Science, vol 6792. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-21738-8_51

  • DOI: https://doi.org/10.1007/978-3-642-21738-8_51

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-21737-1

  • Online ISBN: 978-3-642-21738-8

  • eBook Packages: Computer Science, Computer Science (R0)
