Abstract
An ensemble of classifiers (EoC) has been shown to be effective in improving classification performance, and ensemble selection is one of the most important issues in optimizing an EoC. A dynamic scheme uses a different ensemble for each test sample, but dynamic selection has been reported not to outperform static selection. We propose a dynamic selection scheme that exploits the oracle concept. The results suggest that the proposed scheme clearly outperforms selection based on the popular majority voting error.
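The abstract does not spell out the method, but the core oracle idea in dynamic selection can be illustrated with a minimal sketch: for each test sample, keep only the classifiers that behave as "oracles" (i.e., classify correctly) on the sample's nearest validation neighbours. The function name, the nearest-neighbour rule, and the fallback behaviour below are assumptions for illustration, not taken from this page.

```python
import numpy as np

def oracle_based_select(x, X_val, correct, k=5):
    """Hedged sketch of oracle-based dynamic ensemble selection.

    x       : (d,) test sample
    X_val   : (n_val, d) validation samples
    correct : (n_val, n_clf) bool, correct[i, j] = classifier j
              classified validation sample i correctly
    Returns the indices of the selected classifiers.
    """
    # Find the k validation samples nearest to the test sample.
    dist = np.linalg.norm(X_val - x, axis=1)
    neighbors = np.argsort(dist)[:k]

    # Keep classifiers that are oracles for ALL k neighbours;
    # if none qualify, shrink the neighbourhood and retry.
    while k > 0:
        mask = correct[neighbors[:k]].all(axis=0)
        if mask.any():
            return np.flatnonzero(mask)
        k -= 1

    # No classifier is an oracle for even one neighbour:
    # fall back to the full ensemble.
    return np.arange(correct.shape[1])

# Toy usage: classifier 0 is the only oracle near the origin.
X_val = np.array([[0., 0.], [1., 0.], [0., 1.], [5., 5.]])
correct = np.array([[1, 1, 0],
                    [1, 0, 0],
                    [1, 1, 1],
                    [0, 0, 1]], dtype=bool)
selected = oracle_based_select(np.array([0., 0.]), X_val, correct, k=3)
```

The selected classifiers would then be combined, e.g. by majority voting, to label the test sample; the per-sample selection is what makes the scheme dynamic rather than static.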
Copyright information
© 2007 Springer Berlin Heidelberg
Cite this paper
Ko, A.HR., Sabourin, R., de Souza Britto, A. (2007). A New Dynamic Ensemble Selection Method for Numeral Recognition. In: Haindl, M., Kittler, J., Roli, F. (eds) Multiple Classifier Systems. MCS 2007. Lecture Notes in Computer Science, vol 4472. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-72523-7_43
Print ISBN: 978-3-540-72481-0
Online ISBN: 978-3-540-72523-7