
Exploring Margin for Dynamic Ensemble Selection

  • Conference paper
Rough Sets and Knowledge Technology (RSKT 2013)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 8171)


Abstract

How to effectively combine the outputs of base classifiers is one of the key issues in ensemble learning. This paper proposes a new dynamic ensemble selection algorithm: to predict a sample, only those base classifiers whose classification confidence on that sample is greater than or equal to a specified threshold are selected. Because margin is an important factor in the generalization performance of voting classifiers, the threshold is estimated by minimizing a margin loss. We analyze the proposed algorithm in detail and compare it with several other multiple-classifier fusion algorithms. The experimental results validate the effectiveness of our algorithm.
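
The selection rule and threshold estimate described in the abstract can be sketched in a few lines. The following Python sketch is illustrative only, not the authors' implementation: it assumes scikit-learn-style base classifiers exposing predict_proba with classes indexed 0..C-1, falls back to the full ensemble when no classifier clears the threshold (an assumed tie-breaking rule), and substitutes an exponential loss on the voting margin plus a simple grid search for the paper's margin-loss minimization, whose exact form the abstract does not give.

```python
import numpy as np


def select_and_vote(classifiers, x, threshold):
    """Keep only the base classifiers whose confidence on x is at
    least `threshold` and return their per-class vote counts.
    Falls back to the full ensemble when no classifier qualifies."""
    # one probability row per base classifier: shape (n_classifiers, n_classes)
    probas = np.array([clf.predict_proba(x.reshape(1, -1))[0]
                       for clf in classifiers])
    confident = probas.max(axis=1) >= threshold
    if not confident.any():          # no classifier is confident enough:
        confident[:] = True          # fall back to the whole ensemble (assumed rule)
    labels = probas[confident].argmax(axis=1)  # one hard vote per selected classifier
    return np.bincount(labels, minlength=probas.shape[1])


def predict(classifiers, x, threshold):
    """Majority vote among the dynamically selected classifiers."""
    return int(select_and_vote(classifiers, x, threshold).argmax())


def voting_margin(votes, y):
    """Voting margin on a labelled sample: (votes for the true class
    minus the largest vote for any other class) / total votes."""
    others = np.delete(votes, y)
    return (votes[y] - others.max()) / votes.sum()


def estimate_threshold(classifiers, X_val, y_val,
                       candidates=np.linspace(0.5, 1.0, 11)):
    """Choose the threshold minimizing an exponential margin loss on a
    validation set (one plausible margin loss; the paper's exact loss
    is not specified in the abstract)."""
    best_t, best_loss = candidates[0], np.inf
    for t in candidates:
        loss = sum(np.exp(-voting_margin(select_and_vote(classifiers, x, t), y))
                   for x, y in zip(X_val, y_val))
        if loss < best_loss:
            best_t, best_loss = t, loss
    return best_t
```

Given fitted base classifiers and a held-out validation split, one would call estimate_threshold once and then predict per test sample; the grid over [0.5, 1.0] is a purely illustrative choice, as is the exponential form of the margin loss.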




Copyright information

© 2013 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Li, L., Hu, Q., Wu, X., Yu, D. (2013). Exploring Margin for Dynamic Ensemble Selection. In: Lingras, P., Wolski, M., Cornelis, C., Mitra, S., Wasilewski, P. (eds) Rough Sets and Knowledge Technology. RSKT 2013. Lecture Notes in Computer Science (LNAI), vol 8171. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-41299-8_17


  • DOI: https://doi.org/10.1007/978-3-642-41299-8_17

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-41298-1

  • Online ISBN: 978-3-642-41299-8

  • eBook Packages: Computer Science, Computer Science (R0)
