Sequential EM for Unsupervised Adaptive Gaussian Mixture Model Based Classifier

  • Conference paper
Machine Learning and Data Mining in Pattern Recognition (MLDM 2009)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 5632)

Abstract

In this paper we present a sequential expectation-maximization (EM) algorithm for the unsupervised adaptation of a Gaussian mixture model (GMM) based classifier. The goal is to adapt the GMM to cope with non-stationarity in the data being classified and thereby preserve classification accuracy. Experimental results on synthetic data show that the method can learn time-varying statistical features in the data by adapting the GMM online. To control the adaptation and ensure the stability of the adapted model, we introduce an index that detects when the adaptation is likely to fail.
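The paper's exact update rule and stability index are behind the paywall, but the general idea of sequential (online) EM for a GMM can be sketched generically: for each incoming sample, compute the posterior responsibilities (E-step) and nudge the mixture parameters toward the single-sample sufficient statistics with a small learning rate (M-step), in the spirit of stepwise EM. The class `OnlineGMM`, the step size `step`, and the diagonal-covariance restriction below are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

class OnlineGMM:
    """Diagonal-covariance GMM adapted with a stepwise (online) EM rule.

    A generic sketch of sequential EM; the stability index and the exact
    update schedule used in the paper are NOT reproduced here.
    """

    def __init__(self, means, variances, weights, step=0.05):
        self.mu = np.asarray(means, dtype=float)       # (K, D) component means
        self.var = np.asarray(variances, dtype=float)  # (K, D) diagonal variances
        self.pi = np.asarray(weights, dtype=float)     # (K,)  mixing weights
        self.step = step                               # learning rate (assumed constant)

    def responsibilities(self, x):
        # E-step for one sample: posterior p(k | x) under the current model.
        diff = x - self.mu
        log_p = (np.log(self.pi)
                 - 0.5 * np.sum(np.log(2.0 * np.pi * self.var), axis=1)
                 - 0.5 * np.sum(diff ** 2 / self.var, axis=1))
        log_p -= log_p.max()          # subtract max for numerical stability
        r = np.exp(log_p)
        return r / r.sum()

    def update(self, x):
        # Sequential M-step: blend old parameters with the statistics of
        # the current sample, weighted by its responsibility.
        x = np.asarray(x, dtype=float)
        r = self.responsibilities(x)
        eta = self.step
        self.pi = (1.0 - eta) * self.pi + eta * r
        for k in range(len(self.pi)):
            g = eta * r[k]
            diff = x - self.mu[k]
            self.mu[k] = self.mu[k] + g * diff
            self.var[k] = (1.0 - g) * self.var[k] + g * diff ** 2
        return int(r.argmax())        # index of the winning component
```

Feeding the model a stream whose class-conditional mean has drifted (e.g. from 0 to 1) pulls the corresponding component toward the new statistics, which is the kind of non-stationarity the abstract targets; a fixed, supervised-trained GMM would instead degrade.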




Copyright information

© 2009 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Awwad Shiekh Hasan, B., Gan, J.Q. (2009). Sequential EM for Unsupervised Adaptive Gaussian Mixture Model Based Classifier. In: Perner, P. (ed.) Machine Learning and Data Mining in Pattern Recognition. MLDM 2009. Lecture Notes in Computer Science (LNAI), vol. 5632. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-03070-3_8

  • DOI: https://doi.org/10.1007/978-3-642-03070-3_8

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-03069-7

  • Online ISBN: 978-3-642-03070-3

  • eBook Packages: Computer Science (R0)
