Abstract
The Gaussian mixture is a powerful statistical tool that has been widely used in information processing and data analysis. However, its model selection, i.e., choosing the number of Gaussians in the mixture, remains a difficult problem. Fortunately, the recently established Bayesian Ying-Yang (BYY) harmony function provides an efficient criterion for model selection in Gaussian mixture modeling. In this paper, we propose a BYY split-and-merge EM algorithm for Gaussian mixtures that maximizes the BYY harmony function by dynamically splitting or merging the unsuited Gaussians in the mixture estimated by the EM algorithm at each stage. The experiments demonstrate that this BYY split-and-merge EM algorithm performs both model selection and parameter estimation efficiently for Gaussian mixture modeling.
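To make the idea concrete, the sketch below shows how a BYY-style harmony value can rank candidate numbers of components: fit a mixture by plain EM for each candidate k, then pick the k with the highest sample harmony H = (1/N) Σ_t Σ_j p(j|x_t) log(α_j q(x_t|θ_j)). This is only an illustrative one-dimensional toy under our own assumptions, not the authors' split-and-merge procedure; all function names are ours.

```python
import numpy as np

rng = np.random.default_rng(0)

def normal_pdf(x, mu, var):
    # Univariate Gaussian density, broadcast over components.
    return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2.0 * np.pi * var)

def em_gmm_1d(x, k, iters=100):
    # Plain EM for a 1-D Gaussian mixture with k components.
    mu = np.quantile(x, np.linspace(0.1, 0.9, k))   # spread initial means over the data
    var = np.full(k, np.var(x))
    w = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: posterior responsibilities p(j | x_t), shape (n, k).
        dens = w * normal_pdf(x[:, None], mu, var)
        post = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, variances.
        nk = post.sum(axis=0)
        w = nk / len(x)
        mu = (post * x[:, None]).sum(axis=0) / nk
        var = (post * (x[:, None] - mu) ** 2).sum(axis=0) / nk
        var = np.maximum(var, 1e-6)                 # guard against degenerate components
    return w, mu, var

def byy_harmony(x, w, mu, var):
    # Sample BYY harmony: (1/N) sum_t sum_j p(j|x_t) log(w_j q(x_t|j)).
    dens = w * normal_pdf(x[:, None], mu, var)
    post = dens / dens.sum(axis=1, keepdims=True)
    return (post * np.log(np.maximum(dens, 1e-300))).sum() / len(x)

# Two well-separated clusters; the harmony should peak at k = 2,
# penalizing both under- and over-fitting candidates.
x = np.concatenate([rng.normal(-3.0, 1.0, 300), rng.normal(3.0, 1.0, 300)])
scores = {k: byy_harmony(x, *em_gmm_1d(x, k)) for k in range(1, 5)}
best_k = max(scores, key=scores.get)
```

Extra components lower the harmony because their mixing weights shrink (the log α_j term) and their posteriors become ambiguous, which is the intuition behind merging redundant Gaussians and splitting poorly fitting ones.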
© 2008 Springer-Verlag Berlin Heidelberg
Cite this paper
Li, L., Ma, J. (2008). A BYY Split-and-Merge EM Algorithm for Gaussian Mixture Learning. In: Sun, F., Zhang, J., Tan, Y., Cao, J., Yu, W. (eds) Advances in Neural Networks - ISNN 2008. ISNN 2008. Lecture Notes in Computer Science, vol 5263. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-87732-5_67
Print ISBN: 978-3-540-87731-8
Online ISBN: 978-3-540-87732-5