An Effective Method to Improve Convergence for Sequential Blind Source Separation

  • Conference paper
Advances in Natural Computation (ICNC 2005)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 3610)


Abstract

Based on the conventional natural gradient algorithm (NGA) and the equivariant adaptive separation via independence (EASI) algorithm, a novel sign algorithm for on-line blind separation of independent sources is presented. A sign operator for adapting the separation model is obtained from the derivation of a generalized dynamic separation model. A variable step-size sign algorithm rooted in NGA is also derived to better track the dynamics of the input signals and the unmixing matrix. The proposed algorithms are appealing in practice due to their computational simplicity. Experimental results verify their superior convergence performance over the conventional NGA and EASI algorithms in both stationary and non-stationary environments.
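The abstract does not give the paper's exact update rule, but the idea of applying a sign operator to a natural-gradient adaptation can be sketched as follows. This is a hypothetical illustration, not the authors' algorithm: the cubic score nonlinearity `phi(y) = y**3`, the fixed step size `mu`, and the elementwise placement of `sign()` on the gradient term are all assumptions chosen to make a minimal runnable demo.

```python
import numpy as np

def sign_nga_step(W, x, mu=0.002):
    """One on-line sign-NGA update (hypothetical sketch).

    Applies an elementwise sign() to the standard natural-gradient
    term (I - phi(y) y^T) W, with an assumed cubic score
    phi(y) = y**3 (suitable for sub-Gaussian sources).
    """
    y = W @ x                                   # current source estimates
    phi = y ** 3                                # assumed score nonlinearity
    grad = (np.eye(W.shape[0]) - np.outer(phi, y)) @ W  # natural gradient
    return W + mu * np.sign(grad)               # sign update: fixed-size steps

# Toy demo: unmix two linearly mixed uniform (sub-Gaussian) sources.
rng = np.random.default_rng(0)
S = rng.uniform(-1.0, 1.0, size=(2, 20000))     # independent sources
A = np.array([[1.0, 0.6], [0.4, 1.0]])          # unknown mixing matrix
X = A @ S                                       # observed mixtures

W = np.eye(2)
for t in range(X.shape[1]):
    W = sign_nga_step(W, X[:, t])
# After adaptation, W @ A should approach a scaled permutation matrix.
```

The practical appeal mentioned in the abstract is visible here: the sign operator replaces a multiplication-heavy gradient step with fixed-magnitude increments per matrix entry, which bounds each update and simplifies hardware implementation. A variable step-size variant, as the abstract describes, would replace the constant `mu` with one adapted to the gradient dynamics (in the spirit of Mathews and Xie's gradient-adaptive step size, reference 5).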


References

  1. Cichocki, A., Amari, S.: Adaptive Blind Signal and Image Processing: Learning Algorithms and Applications. John Wiley & Sons, Chichester (2002)

  2. Chambers, J.A., Jafari, M.G., McLaughlin, S.: Variable step-size EASI algorithm for sequential blind source separation. Electronics Letters, 393–394 (2004)

  3. Douglas, S.C., Cichocki, A.: On-line step-size selection for training of adaptive systems. IEEE Signal Processing Magazine (6), 45–46 (1997)

  4. Georgiev, P., Cichocki, A., Amari, S.: On some extensions of the natural gradient algorithm. In: Proc. ICA, pp. 581–585 (2001)

  5. Mathews, V.J., Xie, Z.: A stochastic gradient adaptive filter with gradient adaptive step size. IEEE Transactions on Signal Processing 41(6), 2075–2087 (1993)

  6. Cardoso, J.-F., Laheld, B.H.: Equivariant adaptive source separation. IEEE Transactions on Signal Processing 44, 3017–3030 (1996)

  7. Amari, S.: Natural gradient works efficiently in learning. Neural Computation 10, 251–276 (1998)

  8. Amari, S., Douglas, S.C.: Why natural gradient? In: Proc. IEEE International Conference on Acoustics, Speech, and Signal Processing, Seattle, WA, May 1998, vol. II, pp. 1213–1216 (1998)

Copyright information

© 2005 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Yuan, L., Sang, E., Wang, W., Chambers, J.A. (2005). An Effective Method to Improve Convergence for Sequential Blind Source Separation. In: Wang, L., Chen, K., Ong, Y.S. (eds) Advances in Natural Computation. ICNC 2005. Lecture Notes in Computer Science, vol 3610. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11539087_22

  • DOI: https://doi.org/10.1007/11539087_22

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-28323-2

  • Online ISBN: 978-3-540-31853-8

  • eBook Packages: Computer Science (R0)
