Abstract
Gaussian mixture models (GMMs) are widely used to model complex distributions. The parameters of a GMM are usually estimated in a maximum likelihood (ML) framework. A practical deficiency of ML fitting of GMMs is its poor performance on high-dimensional data, since a much larger sample size is needed to attain the numerical accuracy achievable in low dimensions. In this paper we propose a method for fitting GMMs based on the projection pursuit (PP) strategy. By means of simulations we show that the proposed method outperforms ML fitting of GMMs for small training-set sizes.
This work was supported in part by the Paul Ivanier Center for Robotics and Production Management, Ben-Gurion University of the Negev, Israel.
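The abstract contrasts the proposed projection pursuit fitting with standard maximum likelihood fitting of GMMs. The following is a minimal, illustrative sketch (not the authors' code) of the ML baseline referred to above: EM-based GMM fitting on a small, high-dimensional synthetic sample using scikit-learn. The dimension, sample size, and mixture settings are assumptions chosen purely for illustration.

```python
# Illustrative sketch: maximum-likelihood GMM fitting via EM (scikit-learn),
# the baseline whose small-sample behaviour the paper's PP approach aims to improve.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
d, n_train = 20, 100          # high dimension, small training set (illustrative values)

# Two-component synthetic mixture: components differ only in their mean.
X = np.vstack([
    rng.normal(loc=0.0, scale=1.0, size=(n_train // 2, d)),
    rng.normal(loc=2.0, scale=1.0, size=(n_train // 2, d)),
])

gmm = GaussianMixture(n_components=2, covariance_type='full', random_state=0)
gmm.fit(X)                    # EM-based ML estimation of weights, means, covariances
print("average log-likelihood:", gmm.score(X))
print("BIC:", gmm.bic(X))     # model-selection criterion mentioned in the keywords
```

With full covariance matrices, each component requires estimating on the order of d(d+1)/2 parameters from only n_train/2 samples, which is exactly the small-sample, high-dimensional regime the abstract identifies as problematic for ML fitting.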
Keywords
- Bayesian Information Criterion
- Gaussian Mixture Model
- Latent Variable Model
- Training Sample Size
Copyright information
© 2002 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Aladjem, M. (2002). Projection Pursuit Fitting Gaussian Mixture Models. In: Caelli, T., Amin, A., Duin, R.P.W., de Ridder, D., Kamel, M. (eds) Structural, Syntactic, and Statistical Pattern Recognition. SSPR/SPR 2002. Lecture Notes in Computer Science, vol 2396. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-70659-3_41
DOI: https://doi.org/10.1007/3-540-70659-3_41
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-44011-6
Online ISBN: 978-3-540-70659-5
eBook Packages: Springer Book Archive