Improvements to AdaBoost Dynamic

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 7310)

Abstract

This paper presents recent results on extending boosting, the well-known machine learning ensemble method. The main idea is to vary the “weak” base classifier at each step of the method, using the classifier that performs “best” on the data presented in that iteration. We show that the solution is sensitive to the loss function used, and that the exponential loss function is detrimental to the performance of this kind of boosting. An approach that uses a logistic loss function performs better, but tends to overfit as the number of iterations grows. We show that this drawback can be overcome with a resampling technique taken from research on learning from imbalanced data.
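The abstract describes the core loop: at every boosting round, a pool of candidate base learners is evaluated on the current (reweighted or resampled) training data, and the best-performing one is kept for that round. The sketch below illustrates that general idea under stated assumptions; it is not the authors' implementation. The candidate pool, the resample flag, and the helper names (dynamic_boost, predict) are illustrative choices, and the weight update shown is the standard exponential AdaBoost rule, with comments marking where the logistic-loss variant and the resampling step discussed in the paper would slot in.

    # Minimal sketch (assumed implementation, not the authors' exact algorithm) of
    # "dynamic" boosting: each round, every candidate base learner is trained on the
    # current weighted (or resampled) data, and the one with the lowest weighted
    # error is kept for the ensemble.
    import numpy as np
    from sklearn.base import clone
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.naive_bayes import GaussianNB
    from sklearn.linear_model import LogisticRegression

    def dynamic_boost(X, y, n_rounds=50, resample=False, seed=0):
        """y must take values in {-1, +1}. Returns a list of (alpha, model) pairs."""
        rng = np.random.default_rng(seed)
        candidates = [DecisionTreeClassifier(max_depth=1), GaussianNB(),
                      LogisticRegression(max_iter=200)]   # illustrative candidate pool
        n = len(y)
        w = np.full(n, 1.0 / n)                            # example weights
        ensemble = []
        for _ in range(n_rounds):
            if resample:
                # weighted resampling (imbalanced-data style) instead of reweighting
                idx = rng.choice(n, size=n, replace=True, p=w)
                Xt, yt, wt = X[idx], y[idx], np.full(n, 1.0 / n)
            else:
                Xt, yt, wt = X, y, w
            # keep the candidate with the lowest weighted training error this round
            best, best_err, best_pred = None, np.inf, None
            for cand in candidates:
                model = clone(cand).fit(Xt, yt, sample_weight=wt)
                pred = model.predict(X)
                err = np.sum(w * (pred != y))
                if err < best_err:
                    best, best_err, best_pred = model, err, pred
            best_err = np.clip(best_err, 1e-10, 1 - 1e-10)
            alpha = 0.5 * np.log((1 - best_err) / best_err)
            # exponential-loss weight update; a logistic-loss update would grow
            # the weights of misclassified examples more slowly
            w *= np.exp(-alpha * y * best_pred)
            w /= w.sum()
            ensemble.append((alpha, best))
        return ensemble

    def predict(ensemble, X):
        score = sum(alpha * model.predict(X) for alpha, model in ensemble)
        return np.sign(score)

As a usage example, dynamic_boost(X_train, y_train, resample=True) would train the ensemble with the resampling variant, and predict(ensemble, X_test) would return the weighted-vote prediction.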





Copyright information

© 2012 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

de Souza, E.N., Matwin, S. (2012). Improvements to AdaBoost Dynamic. In: Kosseim, L., Inkpen, D. (eds.) Advances in Artificial Intelligence. Canadian AI 2012. Lecture Notes in Computer Science, vol. 7310. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-30353-1_26


  • DOI: https://doi.org/10.1007/978-3-642-30353-1_26

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-30352-4

  • Online ISBN: 978-3-642-30353-1

  • eBook Packages: Computer Science (R0)
