Definition
AdaBoost is an algorithm that builds a strong classifier as an additive combination of weak classifiers. The weak classifiers are incorporated sequentially, one at a time, so that each addition further reduces the empirical exponential loss of the combination.
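The additive scheme above can be sketched concretely. In the standard formulation, the combined classifier is f(x) = sum_t alpha_t * h_t(x), and each round greedily reduces the empirical exponential loss sum_i exp(-y_i f(x_i)); minimizing it yields the familiar weight alpha_t = (1/2) log((1 - err_t) / err_t) and multiplicative reweighting of the training examples. The following is an illustrative sketch, not a reference implementation; the function names and the choice of decision stumps as weak learners are assumptions for the example:

```python
import numpy as np

def adaboost_train(X, y, n_rounds=20):
    """Illustrative AdaBoost sketch with decision-stump weak learners.

    X: (n, d) feature matrix; y: labels in {-1, +1}.
    Returns a list of stumps (feature, threshold, polarity, alpha).
    Names and structure are hypothetical, chosen for this example.
    """
    n, d = X.shape
    w = np.full(n, 1.0 / n)            # start from uniform example weights
    ensemble = []
    for _ in range(n_rounds):
        # Exhaustively pick the stump with the smallest weighted error.
        best, best_err = None, np.inf
        for j in range(d):
            for thr in np.unique(X[:, j]):
                for pol in (1, -1):
                    pred = pol * np.where(X[:, j] >= thr, 1, -1)
                    err = w[pred != y].sum()
                    if err < best_err:
                        best_err, best = err, (j, thr, pol)
        err = max(best_err, 1e-12)     # clamp to avoid division by zero
        if err >= 0.5:                 # no weak learner beats chance: stop
            break
        alpha = 0.5 * np.log((1 - err) / err)
        j, thr, pol = best
        pred = pol * np.where(X[:, j] >= thr, 1, -1)
        # Reweight: misclassified examples gain weight, correct ones lose it.
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()
        ensemble.append((j, thr, pol, alpha))
    return ensemble

def adaboost_predict(ensemble, X):
    """Sign of the weighted vote of all stumps."""
    score = np.zeros(X.shape[0])
    for j, thr, pol, alpha in ensemble:
        score += alpha * pol * np.where(X[:, j] >= thr, 1, -1)
    return np.sign(score)
```

On a linearly separable toy set, e.g. `X = [[0],[1],[2],[3],[4],[5]]` with labels `[-1,-1,-1,1,1,1]`, a single stump at threshold 3 already classifies all points correctly, and the ensemble reproduces the labels.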
Background
Boosting is a procedure that combines several classifiers with weak performance into a single classifier with arbitrarily high performance [1, 2]; it was originally introduced by Robert Schapire in the machine learning community [3]. AdaBoost is a popular implementation of boosting for binary classification [4]. The enthusiasm that boosting, and AdaBoost in particular, generated in machine learning is captured by Breiman's remark [1] that AdaBoost with trees is the "best off-the-shelf classifier in the world." In practice, much of the popularity of AdaBoost is due to both its performance, which is in the same league as support vector machines [5], and its algorithmic simplicity. In the computer...
References
Friedman JH, Hastie T, Tibshirani R (2000) Additive logistic regression: a statistical view of boosting. Ann Stat 28(2): 337–374
Hastie T, Tibshirani R, Friedman J (2001) The elements of statistical learning. Springer, New York
Schapire RE (1990) The strength of weak learnability. Mach Learn 5(2):197–227
Freund Y, Schapire RE (1996) Experiments with a new boosting algorithm. In: Proceedings of the 13th international conference on machine learning, Bari, pp 148–156
Vapnik V (1995) The nature of statistical learning theory. Springer, New York
Viola P, Jones M (2001) Robust real-time object detection. In: Proceedings of IEEE workshop on statistical and computational theories of vision, Vancouver, Canada
Papageorgiou CP, Oren M, Poggio T (1998) A general framework for object detection. In: International conference on computer vision, Bombay, pp 555–562
Sun Y, Li J, Hager W (2004) Two new regularized AdaBoost algorithms. In: Machine learning and applications, Louisville, pp 41–48
Schapire RE, Singer Y (1998) Improved boosting algorithms using confidence-rated predictions. In: Computational learning theory, Springer, New York, pp 80–91
© 2014 Springer Science+Business Media New York
Favaro, P., Vedaldi, A. (2014). AdaBoost. In: Ikeuchi, K. (eds) Computer Vision. Springer, Boston, MA. https://doi.org/10.1007/978-0-387-31439-6_663
Print ISBN: 978-0-387-30771-8
Online ISBN: 978-0-387-31439-6