Abstract
We present several results related to ranking. We give a general margin-based bound for ranking based on the L∞ covering number of the hypothesis space. Our bound suggests that algorithms that maximize the ranking margin generalize well.
We then describe a new algorithm, Smooth Margin Ranking, that precisely converges to a maximum ranking-margin solution. The algorithm is a modification of RankBoost, analogous to Approximate Coordinate Ascent Boosting.
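As a rough sketch (not the paper's exact formulation), the ranking margin of a scoring function is the smallest normalized score gap over all positive/negative pairs; the names `ranking_margin`, `scores_pos`, `scores_neg`, and the normalization constant are illustrative assumptions:

```python
def ranking_margin(scores_pos, scores_neg, norm=1.0):
    """Smallest gap f(x+) - f(x-) over all (positive, negative) pairs,
    divided by a normalization term (e.g. the l1 norm of the combined
    hypothesis weights). Illustrative sketch, not the paper's definition."""
    return min(sp - sn for sp in scores_pos for sn in scores_neg) / norm
```

A margin-maximizing ranker pushes this minimum gap as high as possible, which is the quantity the generalization bound above is stated in terms of.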
We also prove a remarkable property of AdaBoost: under very natural conditions, AdaBoost minimizes the exponentiated loss associated with the AUC and achieves the same AUC as RankBoost. This explains the empirical observations made by Cortes and Mohri, and Caruana and Niculescu-Mizil, about the excellent performance of AdaBoost as a ranking algorithm, as measured by the AUC.
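To make the AUC connection concrete: the AUC is the fraction of (positive, negative) pairs a scoring function ranks correctly, and RankBoost's objective is the exponentiated pairwise loss over those same pairs. A minimal illustration (function and variable names here are assumptions, not the paper's notation):

```python
import math

def auc(scores_pos, scores_neg):
    """Fraction of (positive, negative) pairs ranked correctly,
    counting ties as half-correct."""
    correct = sum(
        1.0 if sp > sn else 0.5 if sp == sn else 0.0
        for sp in scores_pos
        for sn in scores_neg
    )
    return correct / (len(scores_pos) * len(scores_neg))

def exp_ranking_loss(scores_pos, scores_neg):
    """Exponentiated pairwise ranking loss, summed over all
    (positive, negative) pairs: exp(-(f(x+) - f(x-)))."""
    return sum(
        math.exp(-(sp - sn))
        for sp in scores_pos
        for sn in scores_neg
    )
```

Driving the exponentiated loss down forces most pairwise gaps to be positive, which is why minimizing it tends to produce a high AUC.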
This work is partially supported by NSF grant CCR-0325463.
References
Agarwal, S., Graepel, T., Herbrich, R., Har-Peled, S., Roth, D.: Generalization bounds for the area under the ROC curve. Journal of Machine Learning Research 6, 393–425 (2005)
Bousquet, O.: New approaches to statistical learning theory. Annals of the Institute of Statistical Mathematics 55(2), 371–389 (2003)
Caruana, R., Niculescu-Mizil, A.: An empirical comparison of supervised learning algorithms using different performance metrics. Technical Report TR2005-1973, Cornell University (2005)
Collins, M., Schapire, R.E., Singer, Y.: Logistic regression, AdaBoost and Bregman distances. Machine Learning 48(1/2/3) (2002)
Cortes, C., Mohri, M.: AUC optimization vs. error rate minimization. Advances in Neural Information Processing Systems 16 (2004)
Cucker, F., Smale, S.: On the mathematical foundations of learning. Bull. Amer. Math. Soc. 39, 1–49 (2002)
Freund, Y., Iyer, R., Schapire, R.E., Singer, Y.: An efficient boosting algorithm for combining preferences. Machine Learning: Proceedings of the Fifteenth International Conference (1998)
Freund, Y., Schapire, R.E.: Adaptive game playing using multiplicative weights. Games and Economic Behavior 29, 79–103 (1999)
Friedman, J., Hastie, T., Tibshirani, R.: Additive logistic regression: A statistical view of boosting. The Annals of Statistics 38(2), 337–374 (2000)
Koltchinskii, V., Panchenko, D.: Empirical margin distributions and bounding the generalization error of combined classifiers. The Annals of Statistics 30(1) (February 2002)
Lafferty, J.D., Della Pietra, S., Della Pietra, V.: Statistical learning algorithms based on Bregman distances. In: Proceedings of the Canadian Workshop on Information Theory (1997)
Rudin, C., Daubechies, I., Schapire, R.E.: The dynamics of AdaBoost: Cyclic behavior and convergence of margins. Journal of Machine Learning Research 5, 1557–1595 (2004)
Rudin, C., Schapire, R.E., Daubechies, I.: Analysis of boosting algorithms using the smooth margin function: A study of three algorithms. Submitted (2004)
Rudin, C., Schapire, R.E., Daubechies, I.: Boosting based on a smooth margin. In: Proceedings of the Sixteenth Annual Conference on Computational Learning Theory, pp. 502–517 (2004)
Schapire, R.E., Freund, Y., Bartlett, P., Lee, W.S.: Boosting the margin: A new explanation for the effectiveness of voting methods. The Annals of Statistics 26(5), 1651–1686 (1998)
© 2005 Springer-Verlag Berlin Heidelberg
Cite this paper
Rudin, C., Cortes, C., Mohri, M., Schapire, R.E. (2005). Margin-Based Ranking Meets Boosting in the Middle. In: Auer, P., Meir, R. (eds) Learning Theory. COLT 2005. Lecture Notes in Computer Science(), vol 3559. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11503415_5
Print ISBN: 978-3-540-26556-6
Online ISBN: 978-3-540-31892-7