Convergence Analysis of a Discrete-Time Single-Unit Gradient ICA Algorithm

Conference paper
Advances in Neural Networks - ISNN 2006 (ISNN 2006)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 3971)

Abstract

We revisit the one-unit gradient ICA algorithm derived from the kurtosis function. By carefully studying the properties of the stationary points of the discrete-time one-unit gradient ICA algorithm, we prove convergence under a suitable condition on the learning rate. This condition helps reduce the guesswork involved in choosing an appropriate learning rate in practical computation. These results may be useful for extracting independent source signals on-line.
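
The abstract describes a kurtosis-driven, single-unit gradient update whose convergence is analyzed under a condition on the learning rate. As a rough illustration only, the Python sketch below implements the standard kurtosis-based one-unit gradient rule for pre-whitened observations; the function name one_unit_kurtosis_ica, the fixed learning rate eta, the iteration count n_iter, and the renormalization step are illustrative assumptions and are not taken from the paper, whose exact discrete-time formulation and learning-rate condition are not reproduced here.

import numpy as np

def one_unit_kurtosis_ica(z, eta=0.1, n_iter=200, seed=0):
    """Estimate one demixing vector w from whitened data z (shape: dim x samples).
    Illustrative sketch only; eta and n_iter are arbitrary choices."""
    rng = np.random.default_rng(seed)
    w = rng.standard_normal(z.shape[0])
    w /= np.linalg.norm(w)
    for _ in range(n_iter):
        y = w @ z                                    # current one-unit source estimate
        kurt = np.mean(y ** 4) - 3.0 * np.mean(y ** 2) ** 2
        # Gradient of |kurtosis| w.r.t. w (up to a constant factor), valid for whitened z
        grad = np.sign(kurt) * (z @ (y ** 3) / z.shape[1] - 3.0 * np.dot(w, w) * w)
        w = w + eta * grad                           # discrete-time gradient step
        w /= np.linalg.norm(w)                       # keep w on the unit sphere
    return w

Projecting w back onto the unit sphere after each step mirrors common practice for kurtosis-based extraction; the paper's contribution, per the abstract, is a condition on the learning rate under which such a discrete-time iteration provably converges.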

Copyright information

© 2006 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Ye, M., Li, X., Yang, C., Gao, Z. (2006). Convergence Analysis of a Discrete-Time Single-Unit Gradient ICA Algorithm. In: Wang, J., Yi, Z., Zurada, J.M., Lu, BL., Yin, H. (eds) Advances in Neural Networks - ISNN 2006. ISNN 2006. Lecture Notes in Computer Science, vol 3971. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11759966_168

  • DOI: https://doi.org/10.1007/11759966_168

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-34439-1

  • Online ISBN: 978-3-540-34440-7

  • eBook Packages: Computer Science, Computer Science (R0)
