
Boosting via Approaching Optimal Margin Distribution

Conference paper

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 9077)

Abstract

Margin distribution is crucial to AdaBoost. In this paper, we propose a new boosting method that utilizes the Emargin bound to approach the optimal margin distribution. We first define the \(k^*\)-optimization margin distribution, which enjoys a sharper Emargin bound than that of AdaBoost. We then present two boosting algorithms, KM-Boosting and MD-Boosting, both of which approximately approach the \(k^*\)-optimization margin distribution by exploiting the relation between the \(k\)th margin bound and the Emargin bound. Finally, we show that boosting on the \(k^*\)-optimization margin distribution is sound and efficient. In particular, MD-Boosting almost surely admits a sharper bound than AdaBoost while requiring only slightly more computation, which makes it effective for redundancy reduction without losing much accuracy.
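The exact KM-Boosting and MD-Boosting update rules are not reproduced on this page, so the following is only a minimal sketch, assuming standard AdaBoost with decision stumps, instrumented to record the normalized margin distribution \(y_i F(x_i) / \sum_t \alpha_t\), which is the quantity the Emargin and \(k\)th margin bounds are stated in terms of. The stump learner and the function names here are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def fit_stump(X, y, w):
    """Illustrative weak learner: exhaustive threshold stump.
    Returns (weighted error, feature index, threshold, sign)."""
    best = None
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j]):
            for sign in (1.0, -1.0):
                pred = sign * np.where(X[:, j] <= thr, 1.0, -1.0)
                err = np.sum(w[pred != y])
                if best is None or err < best[0]:
                    best = (err, j, thr, sign)
    return best

def stump_predict(X, j, thr, sign):
    return sign * np.where(X[:, j] <= thr, 1.0, -1.0)

def adaboost_with_margins(X, y, T=50):
    """Standard AdaBoost on labels y in {-1, +1}, instrumented to
    return the sorted normalized margins y_i * F(x_i) / sum(alpha)."""
    n = X.shape[0]
    w = np.full(n, 1.0 / n)          # example weights D_t
    F = np.zeros(n)                  # unnormalized ensemble score
    alphas, stumps = [], []
    for _ in range(T):
        err, j, thr, sign = fit_stump(X, y, w)
        err = max(err, 1e-12)
        if err >= 0.5:               # weak-learning assumption violated
            break
        alpha = 0.5 * np.log((1.0 - err) / err)
        pred = stump_predict(X, j, thr, sign)
        F += alpha * pred
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()                 # renormalize the distribution
        alphas.append(alpha)
        stumps.append((j, thr, sign))
    total = np.sum(alphas)
    margins = y * F / total if total > 0 else np.zeros(n)
    return stumps, alphas, np.sort(margins)
```

The sorted margins returned here form the empirical margin CDF; a margin-distribution method in the spirit of MD-Boosting can be read as reweighting rounds so that this CDF approaches the \(k^*\)-optimization margin distribution, rather than merely maximizing the minimum margin.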



Author information

Correspondence to Shizhong Liao.


Copyright information

© 2015 Springer International Publishing Switzerland

About this paper

Cite this paper

Liu, C., Liao, S. (2015). Boosting via Approaching Optimal Margin Distribution. In: Cao, T., Lim, EP., Zhou, ZH., Ho, TB., Cheung, D., Motoda, H. (eds) Advances in Knowledge Discovery and Data Mining. PAKDD 2015. Lecture Notes in Computer Science, vol. 9077. Springer, Cham. https://doi.org/10.1007/978-3-319-18038-0_53


  • DOI: https://doi.org/10.1007/978-3-319-18038-0_53

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-18037-3

  • Online ISBN: 978-3-319-18038-0

  • eBook Packages: Computer Science (R0)
