ABSTRACT
We propose a mixed structure for building cascades of AdaBoost classifiers, in which several strong classifiers are trained in parallel for each layer. The structure allows for rapid training and guarantees high hit rates without changing the original classifier thresholds. We implemented and tested the approach on two datasets from the UCI repository [1], comparing the results of binary classifiers under three different structures: standard AdaBoost, a cascade classifier with threshold adjustments, and the proposed structure.
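To make the layered structure concrete, the following is a minimal sketch of a cascade whose layers each hold several parallel strong classifiers. It assumes an OR-style combination rule within a layer (a sample survives a layer if any of its classifiers accepts it), which keeps the hit rate high without lowering any individual classifier's threshold; the exact combination rule used by the authors is not specified in the abstract, so this rule, and the toy threshold classifiers standing in for boosted ensembles, are illustrative assumptions.

```python
def cascade_predict(layers, sample):
    """Return True (positive) if the sample passes every layer of the cascade."""
    for classifiers in layers:  # each layer: a list of parallel strong classifiers
        # OR rule (assumed): the sample is rejected only if ALL classifiers
        # in the layer reject it, so true positives are rarely discarded.
        if not any(clf(sample) for clf in classifiers):
            return False  # rejected early, as in a standard cascade
    return True


# Toy strong classifiers on a single scalar feature, standing in for
# trained AdaBoost ensembles (hypothetical thresholds for illustration).
layers = [
    [lambda x: x > 0.2, lambda x: x > 0.1],   # layer 1: cheap, permissive
    [lambda x: x > 0.5, lambda x: x > 0.45],  # layer 2: stricter
]

print(cascade_predict(layers, 0.6))   # True: passes both layers
print(cascade_predict(layers, 0.05))  # False: rejected at layer 1
```

As in the Viola-Jones cascade [4], most negatives are discarded by the early layers, so average evaluation cost stays low; the parallel classifiers per layer replace the per-layer threshold tuning that a conventional cascade requires.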
- A. Asuncion and D. Newman. UCI Machine Learning Repository. University of California, Irvine, School of Information and Computer Sciences, 2007. http://www.ics.uci.edu/~mlearn/MLRepository.html
- Y. Freund and R. E. Schapire. A short introduction to boosting. Journal of Japanese Society for Artificial Intelligence, 14(5):771--780, 1999.
- R. Lienhart, L. Liang, and A. Kuranov. A detector tree of boosted classifiers for real-time object detection and tracking. In Proc. ICME 2003, pages 277--280. IEEE, 2003.
- P. Viola and M. Jones. Robust real-time face detection. International Journal of Computer Vision, 57:137--154, 2004.