Abstract
This paper presents recent results on extending the well-known machine-learning ensemble method, boosting. The main idea is to vary the “weak” base classifier at each step of the method, using the classifier that performs best on the data presented in that iteration. We show that the solution is sensitive to the loss function used, and that the exponential loss function is detrimental to the performance of this kind of boosting. An approach that uses a logistic loss function performs better, but tends to overfit as the number of iterations grows. We show that this drawback can be overcome with a resampling technique taken from research on learning from imbalanced data.
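The core idea described above — re-selecting the best-performing base classifier at every boosting round — can be sketched as follows. This is an illustrative toy, not the authors' algorithm: the names (`Stump`, `boost_dynamic`) and the stump-only candidate pool are our own, and it uses the standard exponential-loss AdaBoost update rather than the logistic variant the paper favours.

```python
import math

class Stump:
    """One-feature threshold classifier predicting in {-1, +1}."""
    def __init__(self, feature, threshold, polarity):
        self.feature, self.threshold, self.polarity = feature, threshold, polarity

    def predict(self, x):
        return self.polarity if x[self.feature] > self.threshold else -self.polarity

def candidate_pool(X):
    # Hypothetical candidate set: stumps over every feature/threshold/polarity.
    # The paper's pool would instead mix different classifier families.
    for f in range(len(X[0])):
        for t in sorted({x[f] for x in X}):
            for p in (+1, -1):
                yield Stump(f, t, p)

def boost_dynamic(X, y, rounds=10):
    n = len(X)
    w = [1.0 / n] * n                    # example weights
    ensemble = []                        # (alpha, classifier) pairs
    for _ in range(rounds):
        # Re-choose the base classifier each round: pick the candidate
        # with the lowest weighted error on the current reweighted data.
        best, best_err = None, float("inf")
        for h in candidate_pool(X):
            err = sum(wi for wi, xi, yi in zip(w, X, y) if h.predict(xi) != yi)
            if err < best_err:
                best, best_err = h, err
        best_err = max(best_err, 1e-10)  # avoid log(0) below
        if best_err >= 0.5:
            break                        # no candidate beats chance
        alpha = 0.5 * math.log((1 - best_err) / best_err)
        ensemble.append((alpha, best))
        # Exponential-loss reweighting: upweight misclassified examples.
        w = [wi * math.exp(-alpha * yi * best.predict(xi))
             for wi, xi, yi in zip(w, X, y)]
        total = sum(w)
        w = [wi / total for wi in w]
    return ensemble

def predict(ensemble, x):
    return 1 if sum(a * h.predict(x) for a, h in ensemble) >= 0 else -1

# A 1-D "interval" problem no single stump can solve; three boosting
# rounds combine three different stumps into a perfect classifier.
X = [[1], [2], [3], [4], [5]]
y = [-1, 1, 1, 1, -1]
model = boost_dynamic(X, y, rounds=3)
```

Each round the pool is rescanned against the updated weights, so the ensemble can switch base classifiers as misclassified examples gain weight — the mechanism the paper builds on.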
Copyright information
© 2012 Springer-Verlag Berlin Heidelberg
Cite this paper
de Souza, E.N., Matwin, S. (2012). Improvements to AdaBoost Dynamic. In: Kosseim, L., Inkpen, D. (eds.) Advances in Artificial Intelligence. Canadian AI 2012. Lecture Notes in Computer Science, vol. 7310. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-30353-1_26
DOI: https://doi.org/10.1007/978-3-642-30353-1_26
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-30352-4
Online ISBN: 978-3-642-30353-1
eBook Packages: Computer Science, Computer Science (R0)