Abstract
How to combine the outputs of base classifiers effectively is one of the key issues in ensemble learning. This paper proposes a new dynamic ensemble selection algorithm: to predict a sample, it selects only those base classifiers whose classification confidence on that sample is greater than or equal to a specified threshold. Since the margin is an important factor in the generalization performance of voting classifiers, the threshold is estimated by minimizing a margin loss. We analyze the proposed algorithm in detail and compare it with several other multiple-classifier fusion algorithms. Experimental results validate the effectiveness of our algorithm.
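As a concrete illustration of the selection rule and threshold fitting described above, the following is a minimal Python sketch. The bagged decision trees, the max-posterior confidence measure, the fallback to the full ensemble when no classifier is confident enough, the exponential form of the margin loss, and the threshold grid are all illustrative assumptions; the paper's exact definitions may differ.

```python
import numpy as np
from collections import Counter
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Illustrative sketch only: the paper's exact confidence measure and margin
# loss are not reproduced here. We assume bagged trees as base classifiers,
# confidence = max posterior probability, and an exponential margin loss.

def bag_trees(X, y, n_estimators=25, seed=0):
    """Bagging: fit each tree on a bootstrap resample of the training set."""
    rng = np.random.default_rng(seed)
    trees = []
    for _ in range(n_estimators):
        idx = rng.integers(0, len(X), size=len(X))
        trees.append(DecisionTreeClassifier(max_depth=5).fit(X[idx], y[idx]))
    return trees

def select_and_vote(trees, x, threshold):
    """Vote with only the classifiers confident enough on this sample."""
    x = x.reshape(1, -1)
    selected = [t for t in trees
                if t.predict_proba(x)[0].max() >= threshold]
    if not selected:              # assumption: fall back to the full ensemble
        selected = trees
    votes = Counter(int(t.predict(x)[0]) for t in selected)
    return votes.most_common(1)[0][0], selected

def voting_margin(selected, x, y):
    """Margin = (votes for true label - strongest wrong label) / #voters."""
    votes = Counter(int(t.predict(x.reshape(1, -1))[0]) for t in selected)
    correct = votes.get(int(y), 0)
    wrong = max((c for lbl, c in votes.items() if lbl != int(y)), default=0)
    return (correct - wrong) / len(selected)

def fit_threshold(trees, X_val, y_val, grid=np.linspace(0.5, 0.95, 10)):
    """Pick the threshold minimizing an exponential margin loss (assumed form)."""
    def loss(th):
        margins = [voting_margin(select_and_vote(trees, x, th)[1], x, y)
                   for x, y in zip(X_val, y_val)]
        return np.mean(np.exp(-np.asarray(margins)))
    return min(grid, key=loss)

X, y = make_classification(n_samples=600, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X_tr, y_tr, test_size=0.4,
                                            random_state=0)

trees = bag_trees(X_tr, y_tr)
theta = fit_threshold(trees, X_val, y_val)
preds = [select_and_vote(trees, x, theta)[0] for x in X_te]
print(f"threshold={theta:.2f}  accuracy={np.mean(np.asarray(preds) == y_te):.3f}")
```

In this sketch the threshold is a single global value fitted on a held-out validation set; a per-sample or per-region threshold would be a natural variant of the same idea.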
Cite this paper
Li, L., Hu, Q., Wu, X., Yu, D.: Exploring Margin for Dynamic Ensemble Selection. In: Lingras, P., Wolski, M., Cornelis, C., Mitra, S., Wasilewski, P. (eds.) Rough Sets and Knowledge Technology (RSKT 2013). LNCS, vol. 8171. Springer, Berlin, Heidelberg (2013). https://doi.org/10.1007/978-3-642-41299-8_17