Fitting Gaussian Mixture Models Using Cooperative Particle Swarm Optimization

  • Conference paper
  • In: Swarm Intelligence (ANTS 2020)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 12421)

Abstract

Recently, a particle swarm optimization (PSO) algorithm was used to fit a Gaussian mixture model (GMM). However, this algorithm incorporates an additional step in the optimization process, which increases its complexity and scales poorly to large numbers of components and large datasets. This study proposes a cooperative PSO approach that improves the scalability and reduces the complexity of the existing PSO approach, and demonstrates its effectiveness on a number of clustering problems in comparison with the expectation-maximization (EM) algorithm and the existing PSO approach.
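
To make the cooperative idea concrete, the following is a minimal, self-contained sketch of cooperative PSO applied to maximum-likelihood GMM fitting. It is not the authors' implementation; every name, constant, and design choice in it (the 1-D example data, the per-component parameter blocks, the inertia weight of 0.72 and acceleration coefficients of 1.49) is an illustrative assumption. Each mixture component's parameters are optimized by a separate sub-swarm, and a candidate block is evaluated by substituting it into a shared context vector before computing the full data log-likelihood.

```python
import numpy as np

def gmm_log_likelihood(params, data, K):
    """Log-likelihood of 1-D data under a K-component GMM.

    params is a flat vector of K blocks [mean, log-std, weight-logit]."""
    p = params.reshape(K, 3)
    mu, sigma = p[:, 0], np.exp(p[:, 1])
    w = np.exp(p[:, 2] - p[:, 2].max())
    w /= w.sum()                                   # mixture weights via softmax
    x = data[:, None]                              # shape (N, 1) against (K,)
    dens = w * np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    return np.log(dens.sum(axis=1) + 1e-300).sum()

def cooperative_pso_gmm(data, K, n_particles=20, iters=200,
                        w_inertia=0.72, c1=1.49, c2=1.49, seed=0):
    """Cooperative PSO sketch: one sub-swarm per mixture component."""
    rng = np.random.default_rng(seed)
    dim = 3                                        # parameters per component block
    pos = rng.uniform(-1.0, 1.0, size=(K, n_particles, dim))
    pos[:, :, 0] = rng.uniform(data.min(), data.max(), size=(K, n_particles))
    vel = np.zeros_like(pos)
    pbest, pbest_val = pos.copy(), np.full((K, n_particles), -np.inf)
    context = pos[:, 0, :].copy()                  # best block contributed by each sub-swarm
    gbest_val = np.full(K, -np.inf)

    def fitness(k, block):
        trial = context.copy()
        trial[k] = block                           # evaluate the block inside the shared context
        return gmm_log_likelihood(trial.ravel(), data, K)

    for _ in range(iters):
        for k in range(K):                         # each sub-swarm optimizes one component
            for i in range(n_particles):
                f = fitness(k, pos[k, i])
                if f > pbest_val[k, i]:
                    pbest_val[k, i], pbest[k, i] = f, pos[k, i].copy()
                if f > gbest_val[k]:
                    gbest_val[k], context[k] = f, pos[k, i].copy()
            r1 = rng.random((n_particles, dim))
            r2 = rng.random((n_particles, dim))
            vel[k] = (w_inertia * vel[k]           # standard inertia-weight PSO update
                      + c1 * r1 * (pbest[k] - pos[k])
                      + c2 * r2 * (context[k] - pos[k]))
            pos[k] += vel[k]
    return context, gmm_log_likelihood(context.ravel(), data, K)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    data = np.concatenate([rng.normal(-2.0, 0.5, 300), rng.normal(3.0, 1.0, 300)])
    best, ll = cooperative_pso_gmm(data, K=2)
    print("log-likelihood:", ll)
    print("means:", best[:, 0], "stds:", np.exp(best[:, 1]))
```

In this sketch the cooperative decomposition keeps each sub-swarm's search space at a fixed three dimensions regardless of K, rather than having a single swarm search all 3K parameters at once; the specific decomposition used by the authors may differ.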

Notes

  1. Also referred to as components.

  2. The angle of a rotation in a plane described by two axes.

  3. For D dimensions there are \(\tau \) planes wherein a rotation can be applied.

  4. A matrix that applies a Givens rotation (see the sketch after these notes).

  5. A GMM assuming K components.

  6. Dua, D. and Graff, C. (2019). UCI Machine Learning Repository [http://archive.ics.uci.edu/ml]. Irvine, CA: University of California, School of Information and Computer Science.

  7. U(a, b) denotes a uniform distribution over the interval (a, b).
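
Notes 2 to 4 refer to the Givens-rotation encoding of a covariance matrix. The sketch below is an assumption about that parameterization rather than an excerpt from the paper: a D-dimensional symmetric positive-definite covariance is assembled from D eigenvalues and one rotation angle per axis plane, of which there are D(D-1)/2 (the count these notes appear to denote by \(\tau \)). Function names are illustrative.

```python
import numpy as np

def givens(D, i, j, theta):
    """D x D Givens rotation acting in the plane spanned by axes i and j."""
    G = np.eye(D)
    c, s = np.cos(theta), np.sin(theta)
    G[i, i] = G[j, j] = c
    G[i, j], G[j, i] = -s, s
    return G

def covariance_from_angles(eigvals, angles):
    """Assemble a covariance as R diag(eigvals) R^T, where R is a product of
    Givens rotations, one per axis plane (D*(D-1)/2 angles for D dimensions)."""
    D = len(eigvals)
    planes = [(i, j) for i in range(D) for j in range(i + 1, D)]
    assert len(angles) == len(planes), "expected D*(D-1)/2 rotation angles"
    R = np.eye(D)
    for (i, j), theta in zip(planes, angles):
        R = R @ givens(D, i, j, theta)
    return R @ np.diag(eigvals) @ R.T

# Example: a 3-D covariance from 3 positive eigenvalues and 3 rotation angles.
Sigma = covariance_from_angles([2.0, 1.0, 0.5], [0.3, -0.7, 1.1])
print(np.linalg.eigvalsh(Sigma))   # recovers [0.5, 1.0, 2.0]
```

Optimizing eigenvalues and rotation angles instead of raw matrix entries guarantees that every candidate covariance a particle proposes is symmetric positive definite, provided the eigenvalues are kept positive.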

Acknowledgements

The authors would like to thank the Centre for High Performance Computing for the use of their resources to run the simulations in this study, as well as the UCI Machine Learning Repository for the real-world datasets used.

Author information

Corresponding author

Correspondence to Andries P. Engelbrecht.

Copyright information

© 2020 Springer Nature Switzerland AG

About this paper

Cite this paper

Cilliers, H., Engelbrecht, A.P. (2020). Fitting Gaussian Mixture Models Using Cooperative Particle Swarm Optimization. In: Dorigo, M., et al. (eds.) Swarm Intelligence. ANTS 2020. Lecture Notes in Computer Science, vol. 12421. Springer, Cham. https://doi.org/10.1007/978-3-030-60376-2_24

  • DOI: https://doi.org/10.1007/978-3-030-60376-2_24

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-60375-5

  • Online ISBN: 978-3-030-60376-2

  • eBook Packages: Computer Science; Computer Science (R0)
