Abstract
In this paper we address the problem of estimating the parameters of a Gaussian mixture model. Although the EM (Expectation-Maximization) algorithm yields the maximum-likelihood solution, it has two well-known drawbacks: (i) it requires a careful initialization of the parameters; and (ii) the optimal number of kernels in the mixture may be unknown beforehand. We propose a criterion based on the entropy of the pdf (probability density function) associated with each kernel to measure the quality of a given mixture model, and a modification of the classical EM algorithm to find the optimal number of kernels in the mixture. We test this method on synthetic and real data and compare the results with those obtained with the classical EM algorithm with a fixed number of kernels.
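The abstract describes an entropy-based criterion for judging the quality of each kernel in a Gaussian mixture fitted by EM. The paper's exact criterion and kernel-splitting rule are not reproduced here, so the following is only a minimal illustrative sketch, not the authors' algorithm: a plain EM fit for a one-dimensional Gaussian mixture, plus the closed-form differential entropy of each fitted kernel, H = ½ log(2πeσ²), which is the maximum entropy any density with that variance can attain.

```python
import numpy as np

def em_gmm(x, k, iters=200):
    """Plain EM for a 1-D Gaussian mixture (illustrative sketch)."""
    # Deterministic initialization: spread the means over data quantiles.
    mu = np.quantile(x, np.linspace(0.1, 0.9, k))
    var = np.full(k, x.var())
    w = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: responsibilities r[n, j] = P(kernel j | x_n).
        d = x[:, None] - mu[None, :]
        p = w * np.exp(-0.5 * d ** 2 / var) / np.sqrt(2.0 * np.pi * var)
        r = p / p.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and variances.
        n = r.sum(axis=0)
        w = n / len(x)
        mu = (r * x[:, None]).sum(axis=0) / n
        var = (r * (x[:, None] - mu[None, :]) ** 2).sum(axis=0) / n
    return w, mu, var

def gaussian_entropy(var):
    """Closed-form entropy of a 1-D Gaussian: 0.5 * log(2*pi*e*var)."""
    return 0.5 * np.log(2.0 * np.pi * np.e * var)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    x = np.concatenate([rng.normal(0.0, 1.0, 500), rng.normal(6.0, 1.0, 500)])
    w, mu, var = em_gmm(x, 2)
    for j in range(2):
        print(f"kernel {j}: w={w[j]:.2f} mu={mu[j]:.2f} "
              f"H={gaussian_entropy(var[j]):.3f}")
```

Comparing this closed-form (maximum-entropy) value against an empirical entropy estimate of the samples a kernel is responsible for indicates how Gaussian that kernel's data actually are; a large gap suggests the kernel is modeling non-Gaussian data and could be split, which is the kind of decision the paper's modified EM automates.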
Keywords
- Maximum Entropy
- Gaussian Mixture Model
- Entropy Maximization Approach
- Fuse Kernel
- Neural Processing Letter
Copyright information
© 2003 Springer-Verlag Berlin Heidelberg
Cite this paper
Peñalver, A., Sáez, J.M., Escolano, F. (2003). An Entropy Maximization Approach to Optimal Model Selection in Gaussian Mixtures. In: Sanfeliu, A., Ruiz-Shulcloper, J. (eds) Progress in Pattern Recognition, Speech and Image Analysis. CIARP 2003. Lecture Notes in Computer Science, vol 2905. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-24586-5_53
DOI: https://doi.org/10.1007/978-3-540-24586-5_53
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-20590-6
Online ISBN: 978-3-540-24586-5