Simultaneous Model Selection and Feature Selection via BYY Harmony Learning

  • Conference paper
Advances in Neural Networks – ISNN 2011 (ISNN 2011)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 6676)

Abstract

Model selection for Gaussian mixture learning on a given dataset is an important but difficult task, and in practical applications it also depends on feature or variable selection. In this paper, we propose a new learning algorithm for Gaussian mixtures that performs model selection and feature selection simultaneously (MSFS), based on the BYY harmony learning framework. Simulation experiments demonstrate that the proposed MSFS algorithm can solve the model selection and feature selection problems of Gaussian mixture learning on a given dataset simultaneously.
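
The MSFS algorithm itself is not reproduced in this excerpt. As a hedged illustration of the model-selection side of BYY harmony learning on Gaussian mixtures, the sketch below uses a hard-cut (winner-take-all) harmony rule, under which the harmony functional is roughly H = (1/N) Σ_t max_j ln[α_j q(x_t | θ_j)]; maximizing H drives the mixing proportions α_j of redundant components toward zero, so they can be pruned away. The function name harmony_gmm, the winner-take-all update, and the pruning threshold prune_tol are assumptions made for illustration only, not the authors' algorithm, and the feature-selection mechanism of MSFS is omitted entirely.

```python
# Hedged sketch: hard-cut BYY harmony learning for a Gaussian mixture with
# automated model selection. NOT the paper's MSFS algorithm (which also
# selects features); it only illustrates how maximizing the harmony
# functional shrinks redundant components so they can be discarded.
import numpy as np
from scipy.stats import multivariate_normal

def harmony_gmm(X, k_max=10, n_iter=100, prune_tol=1e-3, seed=0):
    """Fit a GMM by hard-cut harmony learning, pruning weak components."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    # Start with k_max components centered on random data points.
    mu = X[rng.choice(n, k_max, replace=False)]
    cov = np.array([np.cov(X.T) + 1e-6 * np.eye(d)] * k_max)
    alpha = np.full(k_max, 1.0 / k_max)
    for _ in range(n_iter):
        k = len(alpha)
        # log alpha_j + log N(x | mu_j, Sigma_j) for every point/component.
        log_h = np.column_stack([
            np.log(alpha[j]) + multivariate_normal.logpdf(X, mu[j], cov[j])
            for j in range(k)
        ])
        # Hard-cut posterior: each point goes to its harmony-maximizing component.
        z = log_h.argmax(axis=1)
        for j in range(k):
            pts = X[z == j]
            if len(pts) > d:  # enough points to re-estimate parameters
                mu[j] = pts.mean(axis=0)
                cov[j] = np.cov(pts.T) + 1e-6 * np.eye(d)
            alpha[j] = len(pts) / n
        # Automated model selection: drop components whose weight collapsed.
        keep = alpha > prune_tol
        alpha, mu, cov = alpha[keep], mu[keep], cov[keep]
        alpha /= alpha.sum()
    return alpha, mu, cov

# Usage: data from 3 well-separated clusters, starting from k_max = 8.
X = np.vstack([np.random.default_rng(i).normal(4 * i, 1.0, (200, 2))
               for i in range(3)])
alpha, mu, cov = harmony_gmm(X, k_max=8)
print(f"selected {len(alpha)} components")  # typically 3 for this data
```

Starting deliberately over-parameterized (k_max larger than the true number of components) and letting the harmony dynamics annihilate the surplus is the automated-model-selection behavior shared by the gradient, annealing, and fixed-point BYY variants that this line of work builds on.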

Copyright information

© 2011 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Wang, H., Ma, J. (2011). Simultaneous Model Selection and Feature Selection via BYY Harmony Learning. In: Liu, D., Zhang, H., Polycarpou, M., Alippi, C., He, H. (eds) Advances in Neural Networks – ISNN 2011. ISNN 2011. Lecture Notes in Computer Science, vol 6676. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-21090-7_6

  • DOI: https://doi.org/10.1007/978-3-642-21090-7_6

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-21089-1

  • Online ISBN: 978-3-642-21090-7

  • eBook Packages: Computer Science, Computer Science (R0)
