Abstract
This paper extends Comon’s classical identifiability theorem for Independent Component Analysis (ICA) to mixtures containing several Gaussian sources. We show, through an original and constructive proof, that within the conventional mutual information minimization framework the separation of all non-Gaussian sources is always achievable (up to scaling factors and permutations). In particular, we prove that a suitably designed optimization framework seamlessly handles both the case of a single Gaussian source in the mixture (all sources can be separated) and the case of multiple Gaussian signals mixed with non-Gaussian signals (only the non-Gaussian sources can be extracted).
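The claim can be illustrated numerically. The sketch below is not the paper's algorithm: it is a minimal deflationary FastICA-style fixed-point iteration with a tanh nonlinearity (all variable names, source distributions, and parameters are our own assumptions), showing that two non-Gaussian sources mixed with one Gaussian source are recovered from the whitened mixtures up to sign and permutation.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20000

# Toy setup: two non-Gaussian sources and one Gaussian source, unit variance.
s1 = rng.uniform(-np.sqrt(3), np.sqrt(3), n)   # sub-Gaussian (uniform)
s2 = rng.laplace(0.0, 1.0 / np.sqrt(2), n)     # super-Gaussian (Laplace)
s3 = rng.standard_normal(n)                    # Gaussian
S = np.vstack([s1, s2, s3])

A = rng.normal(size=(3, 3))                    # random mixing matrix
X = A @ S                                      # observed mixtures

# Whiten the mixtures so the search reduces to orthogonal directions.
X = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(X @ X.T / n)
Z = E @ np.diag(d ** -0.5) @ E.T @ X           # Z Z^T / n ~ identity

def fastica_deflation(Z, n_comp, iters=200):
    """Extract components one by one with Gram-Schmidt deflation."""
    W = np.zeros((n_comp, Z.shape[0]))
    for i in range(n_comp):
        w = rng.normal(size=Z.shape[0])
        w /= np.linalg.norm(w)
        for _ in range(iters):
            wx = w @ Z
            g, gp = np.tanh(wx), 1.0 - np.tanh(wx) ** 2
            w_new = (Z * g).mean(axis=1) - gp.mean() * w
            w_new -= W[:i].T @ (W[:i] @ w_new)   # deflate against found rows
            w_new /= np.linalg.norm(w_new)
            converged = abs(abs(w_new @ w) - 1.0) < 1e-9
            w = w_new
            if converged:
                break
        W[i] = w
    return W

Y = fastica_deflation(Z, 3) @ Z                # estimated components

def best_corr(s, Y):
    """Best absolute correlation of a source with any extracted component."""
    return max(abs(np.corrcoef(s, y)[0, 1]) for y in Y)

print(round(best_corr(s1, Y), 2), round(best_corr(s2, Y), 2))
```

With this seed the two non-Gaussian sources each match one extracted component almost perfectly, while the Gaussian source is only determined up to the residual subspace, consistent with the theorem.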
References
Comon, P.: Independent component analysis, a new concept? Signal Processing 36 (1994) 287–314
Cruces, S., Cichocki, A., Castedo, L.: An iterative inversion approach to blind source separation. IEEE Trans. Neural Networks 11 (2000) 1423–1437
Cruces, S., Cichocki, A., Amari, S.: The minimum entropy and cumulant based contrast functions for blind source extraction. In Mira, J., Prieto, A., eds.: Bio-Inspired Applications of Connectionism, Lecture Notes in Computer Science, Springer-Verlag. [6th International Work-Conference on Artificial and Natural Neural Networks (IWANN’2001)]. Volume II, Granada, Spain (2001) 786–793
Cruces, S., Cichocki, A., Amari, S.: Criteria for the simultaneous blind extraction of arbitrary groups of sources. In Lee, T.W., Jung, T.W., Makeig, S., Sejnowski, T.J., eds.: Proceedings of the 3rd International Conference on Independent Component Analysis and Blind Signal Separation, San Diego, California, USA (2001) 740–745
Cardoso, J.F.: Blind signal separation: statistical principles. Proceedings of the IEEE. Special issue on blind identification and estimation 86 (1998) 2009–2025
Edelman, A., Arias, T.A., Smith, S.T.: The geometry of algorithms with orthogonality constraints. SIAM J. Matrix Anal. Appl. 20 (1999) 303–353
Obradovic, D., Deco, G.: Information maximization and independent component analysis: Is there a difference? Neural Computation 10 (1998) 2085–2101
Papoulis, A.: Probability, Random Variables, and Stochastic Processes. WCB/McGraw-Hill (1991)
Copyright information
© 2002 Springer-Verlag Berlin Heidelberg
Cite this paper
Boscolo, R., Pan, H., Roychowdhury, V.P. (2002). Beyond Comon’s Identifiability Theorem for Independent Component Analysis. In: Dorronsoro, J.R. (eds) Artificial Neural Networks — ICANN 2002. ICANN 2002. Lecture Notes in Computer Science, vol 2415. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-46084-5_181
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-44074-1
Online ISBN: 978-3-540-46084-8