
Margin-Based Ranking Meets Boosting in the Middle

  • Conference paper
Learning Theory (COLT 2005)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 3559)


Abstract

We present several results related to ranking. We give a general margin-based bound for ranking based on the L  ∞  covering number of the hypothesis space. Our bound suggests that algorithms that maximize the ranking margin generalize well.
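To make the quantity in the bound concrete: the ranking margin of a scoring function is the smallest score gap over all crucial positive–negative pairs. The sketch below is a minimal illustration in plain Python (the scores and index sets are hypothetical, and the function name is ours, not the paper's):

```python
def ranking_margin(scores, pos_idx, neg_idx):
    """Minimum score gap over all crucial (positive, negative) pairs.

    The margin is positive exactly when every positive instance is
    scored above every negative one; margin-based bounds improve as
    this gap grows (relative to the scale of the scores).
    """
    return min(scores[i] - scores[k] for i in pos_idx for k in neg_idx)

# Hypothetical scores from a combined hypothesis; instances 0 and 1
# should be ranked above instances 2 and 3.
scores = [2.0, 1.5, 0.3, -0.4]
print(ranking_margin(scores, pos_idx=[0, 1], neg_idx=[2, 3]))  # → 1.2
```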

We then describe a new algorithm, Smooth Margin Ranking, that precisely converges to a maximum ranking-margin solution. The algorithm is a modification of RankBoost, analogous to Approximate Coordinate Ascent Boosting.
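For context, each round of RankBoost maintains a distribution over crucial pairs, picks a step size for the current weak ranker, and exponentially reweights the pairs; Smooth Margin Ranking modifies the step-size choice in this loop. Below is a minimal sketch of a standard RankBoost round for a {0, 1}-valued weak ranker (our own illustrative code, not the paper's algorithm):

```python
import math

def rankboost_round(D, h):
    """One RankBoost-style round for a {0, 1}-valued weak ranker.

    D: dict mapping a crucial pair (i, k) -> weight (weights sum to 1),
       where instance i should be ranked above instance k.
    h: list of weak-ranker values, one per instance.
    Returns the step size alpha and the reweighted distribution.
    """
    # Weighted correlation of h with the desired ordering; r lies in (-1, 1).
    r = sum(w * (h[i] - h[k]) for (i, k), w in D.items())
    alpha = 0.5 * math.log((1 + r) / (1 - r))
    # Down-weight pairs h orders correctly, keep or up-weight the rest.
    new_D = {(i, k): w * math.exp(-alpha * (h[i] - h[k]))
             for (i, k), w in D.items()}
    z = sum(new_D.values())  # normalization constant
    return alpha, {pair: w / z for pair, w in new_D.items()}

# Crucial pairs: 0 and 1 should outrank 2 and 3; uniform initial weights.
pairs = [(0, 2), (0, 3), (1, 2), (1, 3)]
D = {p: 1.0 / len(pairs) for p in pairs}
alpha, D = rankboost_round(D, h=[1, 1, 0, 1])  # h fails to separate instance 3
```

Smooth Margin Ranking keeps this pairwise reweighting but derives the step size from a smooth approximation of the ranking margin, which is what yields convergence to a maximum ranking-margin solution.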

We also prove a remarkable property of AdaBoost: under very natural conditions, AdaBoost maximizes the exponentiated loss associated with the AUC and achieves the same AUC as RankBoost. This explains the empirical observations made by Cortes and Mohri, and Caruana and Niculescu-Mizil, about the excellent performance of AdaBoost as a ranking algorithm, as measured by the AUC.
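The two quantities in this claim are easy to state concretely: the AUC is the fraction of correctly ordered positive–negative pairs, and the associated exponentiated loss (the objective RankBoost minimizes) sums exp(−(f(x_i) − f(x_k))) over crucial pairs. A small sketch with hypothetical scores:

```python
import math

def auc(scores, pos_idx, neg_idx):
    """AUC: fraction of (positive, negative) pairs ordered correctly; ties count 1/2."""
    correct = sum(1.0 if scores[i] > scores[k] else
                  0.5 if scores[i] == scores[k] else 0.0
                  for i in pos_idx for k in neg_idx)
    return correct / (len(pos_idx) * len(neg_idx))

def exp_ranking_loss(scores, pos_idx, neg_idx):
    """Exponentiated pairwise loss: sum over crucial pairs of exp(-(f(x_i) - f(x_k)))."""
    return sum(math.exp(scores[k] - scores[i])
               for i in pos_idx for k in neg_idx)

scores = [2.0, 1.5, 0.3, -0.4]
print(auc(scores, [0, 1], [2, 3]))  # → 1.0 (all pairs correctly ordered)
print(exp_ranking_loss(scores, [0, 1], [2, 3]))
```

Driving the exponentiated loss down pushes every pairwise gap up, which is why minimizing it tends to produce high AUC.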

This work is partially supported by NSF grant CCR-0325463.



References

  1. Agarwal, S., Graepel, T., Herbrich, R., Har-Peled, S., Roth, D.: Generalization bounds for the area under the ROC curve. Journal of Machine Learning Research 6, 393–425 (2005)


  2. Bousquet, O.: New approaches to statistical learning theory. Annals of the Institute of Statistical Mathematics 55(2), 371–389 (2003)


  3. Caruana, R., Niculescu-Mizil, A.: An empirical comparison of supervised learning algorithms using different performance metrics. Technical Report TR2005-1973, Cornell University (2005)


  4. Collins, M., Schapire, R.E., Singer, Y.: Logistic regression, AdaBoost and Bregman distances. Machine Learning 48(1/2/3) (2002)


  5. Cortes, C., Mohri, M.: AUC optimization vs. error rate minimization. Advances in Neural Information Processing Systems 16 (2004)


  6. Cucker, F., Smale, S.: On the mathematical foundations of learning. Bulletin of the American Mathematical Society 39, 1–49 (2002)


  7. Freund, Y., Iyer, R., Schapire, R.E., Singer, Y.: An efficient boosting algorithm for combining preferences. Machine Learning: Proceedings of the Fifteenth International Conference (1998)


  8. Freund, Y., Schapire, R.E.: Adaptive game playing using multiplicative weights. Games and Economic Behavior 29, 79–103 (1999)


  9. Friedman, J., Hastie, T., Tibshirani, R.: Additive logistic regression: A statistical view of boosting. The Annals of Statistics 38(2), 337–374 (2000)


  10. Koltchinskii, V., Panchenko, D.: Empirical margin distributions and bounding the generalization error of combined classifiers. The Annals of Statistics 30(1) (February 2002)


  11. Lafferty, J.D., Pietra, S.D., Pietra, V.D.: Statistical learning algorithms based on Bregman distances. In: Proceedings of the Canadian Workshop on Information Theory (1997)


  12. Rudin, C., Daubechies, I., Schapire, R.E.: The dynamics of AdaBoost: Cyclic behavior and convergence of margins. Journal of Machine Learning Research 5, 1557–1595 (2004)


  13. Rudin, C., Schapire, R.E., Daubechies, I.: Analysis of boosting algorithms using the smooth margin function: A study of three algorithms. Submitted (2004)


  14. Rudin, C., Schapire, R.E., Daubechies, I.: Boosting based on a smooth margin. In: Proceedings of the Sixteenth Annual Conference on Computational Learning Theory, pp. 502–517 (2004)


  15. Schapire, R.E., Freund, Y., Bartlett, P., Lee, W.S.: Boosting the margin: A new explanation for the effectiveness of voting methods. The Annals of Statistics 26(5), 1651–1686 (1998)




Copyright information

© 2005 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Rudin, C., Cortes, C., Mohri, M., Schapire, R.E. (2005). Margin-Based Ranking Meets Boosting in the Middle. In: Auer, P., Meir, R. (eds) Learning Theory. COLT 2005. Lecture Notes in Computer Science, vol 3559. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11503415_5


  • DOI: https://doi.org/10.1007/11503415_5

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-26556-6

  • Online ISBN: 978-3-540-31892-7

  • eBook Packages: Computer Science (R0)
