
Csiszár’s Divergences for Non-negative Matrix Factorization: Family of New Algorithms

Conference paper
Independent Component Analysis and Blind Signal Separation (ICA 2006)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 3889)

Abstract

In this paper we discuss a wide class of loss (cost) functions for non-negative matrix factorization (NMF) and derive several novel algorithms with improved efficiency and robustness to noise and outliers. We review several approaches that allow us to obtain generalized forms of multiplicative NMF algorithms and to unify some existing algorithms. We also give flexible and relaxed forms of the NMF algorithms that increase convergence speed and impose desired constraints such as sparsity and smoothness of components. Moreover, the effects of various regularization terms and constraints are clearly shown. The scope of these results is vast, since the proposed generalized divergence functions include a large number of useful loss functions, such as the squared Euclidean distance and the Kullback-Leibler, Itakura-Saito, Hellinger, Pearson's chi-square, and Neyman's chi-square divergences. We have successfully applied the developed algorithms to blind (or semi-blind) source separation (BSS), where the sources may in general be statistically dependent, provided they satisfy additional constraints such as non-negativity, sparsity, and/or smoothness.
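The abstract refers to multiplicative NMF update rules specialized to particular divergences. As an illustration only, the sketch below implements the classical Lee-Seung multiplicative updates for two of the losses named above (squared Euclidean distance and generalized Kullback-Leibler divergence). It is not the paper's generalized Csiszár-divergence family of algorithms; the function name and its parameters are our own, and the small `eps` term is an assumed safeguard against division by zero.

```python
import numpy as np

def nmf_multiplicative(V, rank, loss="kl", n_iter=200, eps=1e-9, seed=0):
    """Factorize a non-negative matrix V ≈ W @ H by multiplicative updates.

    loss="euclidean": squared Euclidean distance ||V - WH||^2
    loss="kl":        generalized Kullback-Leibler divergence

    Illustrative sketch of the classical Lee-Seung rules, not the
    paper's generalized Csiszár-divergence algorithms.
    """
    rng = np.random.default_rng(seed)
    n, m = V.shape
    # Random strictly positive initialization keeps the iterates non-negative.
    W = rng.random((n, rank)) + eps
    H = rng.random((rank, m)) + eps
    for _ in range(n_iter):
        if loss == "euclidean":
            H *= (W.T @ V) / (W.T @ W @ H + eps)
            W *= (V @ H.T) / (W @ H @ H.T + eps)
        elif loss == "kl":
            WH = W @ H + eps
            H *= (W.T @ (V / WH)) / (W.sum(axis=0)[:, None] + eps)
            WH = W @ H + eps
            W *= ((V / WH) @ H.T) / (H.sum(axis=1)[None, :] + eps)
        else:
            raise ValueError(f"unknown loss: {loss!r}")
    return W, H
```

Because every factor in the updates is non-negative, non-negativity of W and H is preserved automatically at each step, which is the key appeal of multiplicative algorithms.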




Copyright information

© 2006 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Cichocki, A., Zdunek, R., Amari, Si. (2006). Csiszár’s Divergences for Non-negative Matrix Factorization: Family of New Algorithms. In: Rosca, J., Erdogmus, D., Príncipe, J.C., Haykin, S. (eds) Independent Component Analysis and Blind Signal Separation. ICA 2006. Lecture Notes in Computer Science, vol 3889. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11679363_5


  • DOI: https://doi.org/10.1007/11679363_5

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-32630-4

  • Online ISBN: 978-3-540-32631-1

