
Comparative Analysis on Convergence Rates of The EM Algorithm and Its Two Modifications for Gaussian Mixtures

Published in: Neural Processing Letters

Abstract

For Gaussian mixtures, a comparative analysis is made of the convergence rates of the Expectation-Maximization (EM) algorithm and two of its modifications. The first is a variant of the EM algorithm (denoted VEM) that uses the old value of the mean vectors, instead of the latest updated one, when updating the covariance matrices. The second, called the Momentum EM algorithm (MEM), is obtained by adding a momentum term to the EM updating equation. Upper bounds on their convergence rates are obtained, extending and modifying those given in Xu & Jordan (1996). It is shown that the EM algorithm and VEM are equivalent in their local convergence behavior and rates, and that MEM can speed up the convergence of the EM algorithm if a suitable amount of momentum is added. Moreover, a theoretical guide on how to choose the momentum is proposed, and a possible approach for further speeding up the convergence is suggested.
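As a concrete illustration of the updates the abstract describes, the sketch below implements a standard EM step for a one-dimensional Gaussian mixture and wraps it with a momentum term in the MEM style. The function names, the momentum coefficient `gamma`, and the 1-D restriction are illustrative assumptions for this sketch, not the paper's notation or its general multivariate setting.

```python
import numpy as np

def em_step(X, w, mu, var):
    """One EM iteration for a K-component 1-D Gaussian mixture."""
    K = len(w)
    # E-step: weighted component densities and posterior responsibilities
    dens = np.stack([w[k] * np.exp(-(X - mu[k]) ** 2 / (2 * var[k]))
                     / np.sqrt(2 * np.pi * var[k]) for k in range(K)])
    r = dens / dens.sum(axis=0)
    # M-step: closed-form updates
    Nk = r.sum(axis=1)
    w_new = Nk / len(X)
    mu_new = (r * X).sum(axis=1) / Nk
    # (The VEM variant described in the abstract would use the old mu here,
    # rather than mu_new, when updating the variances.)
    var_new = (r * (X - mu_new[:, None]) ** 2).sum(axis=1) / Nk
    return w_new, mu_new, var_new

def mem_step(X, theta, theta_prev, gamma=0.3):
    """Momentum EM (MEM) sketch: theta_next = EM(theta) + gamma * (theta - theta_prev)."""
    em_new = em_step(X, *theta)
    return tuple(t_em + gamma * (t - t_old)
                 for t_em, t, t_old in zip(em_new, theta, theta_prev))
```

Setting `gamma=0` recovers plain EM; the paper's contribution includes a theoretical guide for choosing the amount of momentum, which this sketch leaves as a user-supplied constant.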


References

  1. A.P. Dempster, N.M. Laird and D.B. Rubin, “Maximum likelihood from incomplete data via the EM algorithm”, Journal of the Royal Statistical Society, Series B, Vol. 39, pp. 1–38, 1977.


  2. J. Ma, L. Xu and M.I. Jordan, “Asymptotic Convergence Rate of the EM Algorithm for Gaussian Mixtures”, preprint, submitted to a journal, 1996.

  3. X.L. Meng, “On the rate of convergence of the ECM algorithm”, Ann. Statist., Vol. 22, pp. 326–339, 1994.


  4. I. Meilijson, “A fast improvement to the EM algorithm on its own terms”, Journal of the Royal Statistical Society, Series B, Vol. 51, pp. 127–138, 1989.


  5. B.C. Peters and H.F. Walker, “An iterative procedure for obtaining maximum-likelihood estimates of the parameters for a mixture of normal distributions”, SIAM J. Applied Mathematics, Vol. 35, pp. 362–378, 1978.


  6. R.A. Redner and H.F. Walker, “Mixture densities, maximum likelihood, and the EM algorithm”, SIAM Review, Vol. 26, pp. 195–239, 1984.


  7. L. Xu and M.I. Jordan, “On convergence properties of the EM algorithm for Gaussian mixtures”, Neural Computation, Vol. 8, No. 1, pp. 129–151, 1996.




Cite this article

Xu, L. Comparative Analysis on Convergence Rates of The EM Algorithm and Its Two Modifications for Gaussian Mixtures. Neural Processing Letters 6, 69–76 (1997). https://doi.org/10.1023/A:1009627306313
