A New Dynamic Ensemble Selection Method for Numeral Recognition

  • Conference paper
Multiple Classifier Systems (MCS 2007)

Part of the book series: Lecture Notes in Computer Science (LNIP, volume 4472)

Abstract

An ensemble of classifiers (EoC) has been shown to be effective in improving classifier performance, and ensemble selection is one of the most important issues in optimizing an EoC. A dynamic scheme selects a different ensemble for each test sample, but dynamic selection has been reported not to outperform static selection. We propose a dynamic selection scheme that exploits the oracle concept. The results suggest that the proposed scheme performs noticeably better than selection based on the popular majority voting error.
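To make the oracle idea concrete, the sketch below (Python with scikit-learn) shows one plausible reading of an oracle-based dynamic selection scheme: for each test sample, only those classifiers in the pool that correctly label all of its nearest neighbours in a validation set are allowed to vote. The digit dataset, the pool of random-subspace decision trees, the neighbourhood size k = 7, and helper names such as `select_ensemble` are illustrative assumptions, not the paper's exact configuration.

```python
import numpy as np
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neighbors import NearestNeighbors
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

# Split the digit data into training, validation and test sets.
X, y = load_digits(return_X_y=True)
X_train, X_rest, y_train, y_rest = train_test_split(X, y, test_size=0.5, random_state=0)
X_val, X_test, y_val, y_test = train_test_split(X_rest, y_rest, test_size=0.5, random_state=0)

# Build a pool of weak classifiers, each trained on a random subspace of the features.
n_classifiers, n_features = 30, X.shape[1]
subspaces = [rng.choice(n_features, size=32, replace=False) for _ in range(n_classifiers)]
pool = [DecisionTreeClassifier(max_depth=8, random_state=i).fit(X_train[:, s], y_train)
        for i, s in enumerate(subspaces)]

# Pre-compute each classifier's correctness on every validation sample.
val_correct = np.array([clf.predict(X_val[:, s]) == y_val for clf, s in zip(pool, subspaces)])

knn = NearestNeighbors(n_neighbors=7).fit(X_val)

def select_ensemble(x):
    """Keep classifiers that correctly label all k validation neighbours of x,
    relaxing k until at least one classifier survives (whole pool as fallback)."""
    _, idx = knn.kneighbors(x.reshape(1, -1))
    neighbours = idx[0]
    for k in range(len(neighbours), 0, -1):
        mask = val_correct[:, neighbours[:k]].all(axis=1)
        if mask.any():
            return np.flatnonzero(mask)
    return np.arange(n_classifiers)

def predict_one(x):
    # Majority vote among the dynamically selected classifiers.
    votes = [pool[i].predict(x[subspaces[i]].reshape(1, -1))[0] for i in select_ensemble(x)]
    return np.bincount(votes).argmax()

preds = np.array([predict_one(x) for x in X_test])
print("dynamic-selection accuracy:", (preds == y_test).mean())
```

Relaxing k when no classifier survives is one simple fallback; other fusion rules could be substituted at the voting step without changing the selection logic.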

Editor information

Michal Haindl, Josef Kittler, Fabio Roli

Copyright information

© 2007 Springer Berlin Heidelberg

About this paper

Cite this paper

Ko, A.H.R., Sabourin, R., de Souza Britto, A. (2007). A New Dynamic Ensemble Selection Method for Numeral Recognition. In: Haindl, M., Kittler, J., Roli, F. (eds.) Multiple Classifier Systems. MCS 2007. Lecture Notes in Computer Science, vol. 4472. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-72523-7_43

  • DOI: https://doi.org/10.1007/978-3-540-72523-7_43

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-72481-0

  • Online ISBN: 978-3-540-72523-7

  • eBook Packages: Computer Science, Computer Science (R0)
