Abstract
Real AdaBoost ensembles are remarkably effective at solving classification problems. This effectiveness comes from progressively constructing learners that pay increasing attention to samples that are difficult to classify.
However, this emphasis can become excessive. In particular, when the problem to solve is asymmetric or contains imbalanced outliers, even previously proposed modifications of the basic algorithm are not as effective as desired.
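The emphasis mechanism in question is the Real AdaBoost weight update, which multiplies each sample's weight by the exponential of its (signed) margin. The following minimal sketch (toy data and variable names are ours, not from the paper) shows how a persistently misclassified outlier comes to dominate the emphasis distribution:

```python
import numpy as np

def real_adaboost_weights(y, f, w):
    # One Real AdaBoost emphasis update: w_i <- w_i * exp(-y_i * f(x_i)),
    # then renormalize. Samples misclassified with high confidence
    # (y_i * f_i strongly negative) see their weights grow exponentially.
    w_new = w * np.exp(-y * f)
    return w_new / w_new.sum()

# Toy data (ours): the last sample acts as an outlier that the weak
# learner keeps misclassifying with high confidence.
y = np.array([+1.0, +1.0, -1.0, -1.0, +1.0])   # labels in {-1, +1}
f = np.array([0.8, 0.6, -0.7, -0.9, -1.5])     # real-valued learner outputs
w = np.full(5, 0.2)                            # initial uniform weights
for _ in range(3):
    w = real_adaboost_weights(y, f, w)
# After a few rounds the outlier's weight dominates the distribution.
```

In this toy run the outlier ends up carrying more than 90% of the total emphasis, which is the kind of excessive focus the paper aims to temper.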
In this paper, we introduce a simple modification that uses the neighborhood concept to reduce these drawbacks. Experimental results confirm the potential of the proposed scheme.
The main conclusions of our work and some suggestions for further research along this line close the paper.
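As an illustration of the neighborhood idea, one simple possibility is to average each sample's emphasis weight over its nearest neighbors, so that an isolated outlier cannot monopolize the emphasis. This is our own sketch under that assumption, not necessarily the authors' exact formulation:

```python
import numpy as np

def smooth_weights(X, w, k=2):
    # Hypothetical neighborhood smoothing (illustrative only): replace
    # each sample's emphasis weight with the average over its k nearest
    # neighbors (self included), then renormalize to a distribution.
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)  # pairwise distances
    nn = np.argsort(d, axis=1)[:, :k]      # indices of the k nearest samples
    w_s = w[nn].mean(axis=1)               # neighborhood-averaged weights
    return w_s / w_s.sum()

# Toy data (ours): one isolated sample carries almost all the emphasis;
# averaging over neighborhoods dilutes its weight.
X = np.array([[0.0], [0.1], [0.2], [5.0], [5.1]])
w = np.array([0.01, 0.01, 0.01, 0.96, 0.01])
w_s = smooth_weights(X, w, k=2)
```

After smoothing, the outlier's weight is shared with its neighborhood, so the maximum emphasis drops well below its original value while the weights still sum to one.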
This work has been partly supported by grant TIN 2011-24533 (Spanish MEC).
© 2013 Springer-Verlag Berlin Heidelberg
Cite this paper
Ahachad, A., Omari, A., Figueiras-Vidal, A.R. (2013). Smoothed Emphasis for Boosting Ensembles. In: Rojas, I., Joya, G., Gabestany, J. (eds) Advances in Computational Intelligence. IWANN 2013. Lecture Notes in Computer Science, vol 7902. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-38679-4_36
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-38678-7
Online ISBN: 978-3-642-38679-4