Abstract
This paper presents a new algorithm for unsupervised incremental learning based on a Bayesian framework. The algorithm, called IGMM (Incremental Gaussian Mixture Model), creates and continually adjusts a Gaussian mixture model consistent with all sequentially presented data. IGMM is particularly useful for on-line incremental clustering of data streams, as encountered in the domains of mobile robotics and animats. It builds an incremental knowledge model of the domain consisting of primitive concepts involving all observed variables. We present preliminary results obtained on synthetic data, consider practical issues such as convergence properties, and discuss future developments.
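The abstract does not spell out IGMM's update equations, so the following is only a minimal, hypothetical sketch of how an incremental Gaussian mixture learner of this general kind can be organized: each arriving point either updates existing components through responsibility-weighted recursive statistics, or spawns a new component when no existing one explains it well (a common novelty criterion). The class name IncrementalGMM, the novelty threshold, and the specific recursive updates are illustrative assumptions, not the authors' published equations.

```python
import numpy as np

class IncrementalGMM:
    """Illustrative sketch of an incremental Gaussian mixture learner.

    NOT the paper's exact IGMM: the threshold test and the recursive
    covariance update below are common textbook choices, used here only
    to show the one-point-at-a-time structure the abstract describes.
    """

    def __init__(self, dim, novelty=1e-3, init_var=1.0):
        self.dim = dim
        self.novelty = novelty    # spawn threshold on likelihood (assumption)
        self.init_var = init_var  # variance of a freshly created component
        self.means, self.covs, self.counts = [], [], []

    def _pdf(self, x, mean, cov):
        # Multivariate Gaussian density evaluated at x.
        d = x - mean
        norm = np.sqrt(((2 * np.pi) ** self.dim) * np.linalg.det(cov))
        return np.exp(-0.5 * d @ np.linalg.inv(cov) @ d) / norm

    def update(self, x):
        x = np.asarray(x, dtype=float)
        if not self.means:                 # first point ever: create a component
            self._spawn(x)
            return
        likes = np.array([self._pdf(x, m, c)
                          for m, c in zip(self.means, self.covs)])
        if likes.max() < self.novelty:     # nothing explains x: new component
            self._spawn(x)
            return
        counts = np.array(self.counts)
        post = (counts / counts.sum()) * likes
        post /= post.sum()                 # responsibilities p(j | x)
        for j, r in enumerate(post):       # recursive, Robbins-Monro-style updates
            self.counts[j] += r
            eta = r / self.counts[j]
            self.means[j] = self.means[j] + eta * (x - self.means[j])
            d = x - self.means[j]
            # Approximate recursive covariance estimate (illustrative form).
            self.covs[j] = (1 - eta) * self.covs[j] + eta * np.outer(d, d)

    def _spawn(self, x):
        self.means.append(x.copy())
        self.covs.append(self.init_var * np.eye(self.dim))
        self.counts.append(1.0)

# Usage: stream synthetic 2-D points from two clusters, one at a time.
rng = np.random.default_rng(0)
model = IncrementalGMM(dim=2)
for _ in range(500):
    center = np.zeros(2) if rng.random() < 0.5 else np.full(2, 5.0)
    model.update(center + rng.normal(scale=0.5, size=2))
print(len(model.means), "components discovered")
```

The key design property this sketch shares with any on-line mixture learner is that each observation is processed once and then discarded: only the sufficient statistics per component (count, mean, covariance) are retained, which is what makes the approach suitable for unbounded data streams.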
Copyright information
© 2010 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Engel, P.M., Heinen, M.R. (2010). Incremental Learning of Multivariate Gaussian Mixture Models. In: da Rocha Costa, A.C., Vicari, R.M., Tonidandel, F. (eds) Advances in Artificial Intelligence – SBIA 2010. Lecture Notes in Computer Science, vol 6404. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-16138-4_9
DOI: https://doi.org/10.1007/978-3-642-16138-4_9
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-16137-7
Online ISBN: 978-3-642-16138-4
eBook Packages: Computer Science (R0)