
Smoothed Emphasis for Boosting Ensembles

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 7902)

Abstract

Real AdaBoost ensembles have exceptional capabilities for successfully solving classification problems. This capability stems from progressively constructing learners that pay more attention to the samples that are difficult to classify.
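The abstract gives no formulas, so the following is a minimal sketch of the emphasis mechanism it refers to, using Schapire and Singer's confidence-rated (Real AdaBoost) weight update. The one-dimensional data, the fixed median-split stump, and the function name are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def real_adaboost_emphasis(X, y, n_rounds=5, eps=1e-6):
    """Toy Real AdaBoost on a 1-D feature with a fixed threshold stump.
    Returns the final sample weights, which concentrate on the examples
    the ensemble keeps misclassifying.  Labels y must be in {-1, +1}."""
    n = len(y)
    w = np.full(n, 1.0 / n)                  # start from uniform emphasis
    for _ in range(n_rounds):
        thr = np.median(X)                   # illustrative stump: split at the median
        f = np.zeros(n)
        for side in (X < thr, X >= thr):
            if side.any():
                # weighted probability of class +1 within this region
                p = np.sum(w[side] * (y[side] == 1)) / np.sum(w[side])
                p = np.clip(p, eps, 1.0 - eps)
                f[side] = 0.5 * np.log(p / (1.0 - p))   # confidence-rated output
        w *= np.exp(-y * f)                  # grow weights where y and f disagree
        w /= w.sum()                         # renormalize to a distribution
    return w
```

Running this on a small set containing one hard-to-classify sample shows that sample's weight growing toward dominance across rounds, which is precisely the excessive emphasis the paper sets out to temper.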

However, the corresponding emphasis can be excessive. In particular, when the problem to be solved is asymmetric or contains imbalanced outliers, even the previously proposed modifications of the basic algorithm are not as effective as desired.

In this paper, we introduce a simple modification that uses the neighborhood concept to reduce the above drawbacks. Experimental results confirm the potential of the proposed scheme.
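The paper's exact neighborhood rule is not spelled out on this page, so the sketch below illustrates one plausible reading of "smoothed emphasis": replace each sample's boosting weight by the average over its k nearest neighbors, so that an isolated outlier cannot monopolize the emphasis. The function name, the choice of k, and the Euclidean neighborhood are all assumptions for illustration.

```python
import numpy as np

def smooth_emphasis(X, w, k=3):
    """Hypothetical neighborhood smoothing of boosting emphasis weights:
    each sample's weight is replaced by the mean weight over its k nearest
    neighbors (itself included).  X: (n,) or (n, d) features; w: (n,) weights."""
    X = np.atleast_2d(np.asarray(X, dtype=float))
    if X.shape[0] != len(w):                 # a 1-D input arrives as (1, n)
        X = X.T
    # pairwise Euclidean distances, then the k closest indices per row
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    nbrs = np.argsort(d, axis=1)[:, :k]      # self is always the nearest neighbor
    w_s = np.asarray(w)[nbrs].mean(axis=1)   # average weight over each neighborhood
    return w_s / w_s.sum()                   # renormalize to a distribution
```

Applied after each boosting round, such a step spreads the weight of an outlier over its neighborhood instead of letting it spike, at the cost of one extra hyperparameter (k) to select.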

The paper closes with the main conclusions of our work and some suggestions for further research along this line.

This work has been partly supported by grant TIN 2011-24533 (Spanish MEC).




Copyright information

© 2013 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Ahachad, A., Omari, A., Figueiras-Vidal, A.R. (2013). Smoothed Emphasis for Boosting Ensembles. In: Rojas, I., Joya, G., Cabestany, J. (eds.) Advances in Computational Intelligence. IWANN 2013. Lecture Notes in Computer Science, vol. 7902. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-38679-4_36


  • DOI: https://doi.org/10.1007/978-3-642-38679-4_36

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-38678-7

  • Online ISBN: 978-3-642-38679-4

  • eBook Packages: Computer Science (R0)
