Abstract
The task of region classification is to construct class regions that contain the correct classes of the objects being classified with an error probability ε ∈ [0,1]. To turn a point classifier into a region classifier, the conformal framework is employed [11,14]. However, to apply the framework we need to design a non-conformity function: a function that estimates how unusual an instance is with respect to the point classifier used.
This paper introduces a new non-conformity function for AdaBoost. The function has two main advantages over the only existing non-conformity function for AdaBoost. First, it reduces the time complexity of computing class regions by a factor equal to the size of the training data. Second, it yields statistically better class regions.
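The conformal step described above can be sketched as follows: a candidate class is kept in the region whenever its p-value, computed from non-conformity scores of held-out calibration examples, exceeds the error level ε. This is a minimal, generic sketch of the conformal framework only; the scores below are hypothetical, and the paper's actual AdaBoost-specific non-conformity function is not reproduced here.

```python
def in_region(cal_scores, test_score, epsilon):
    """Conformal inclusion test for one candidate class.

    cal_scores: non-conformity scores of calibration examples
                (produced by some non-conformity function; placeholder here).
    test_score: non-conformity score of the test object under the
                candidate class label.
    A class stays in the region iff its p-value exceeds epsilon.
    """
    n = len(cal_scores)
    # Fraction of calibration scores at least as non-conforming,
    # counting the test object itself (the "+1" terms).
    p_value = (sum(s >= test_score for s in cal_scores) + 1) / (n + 1)
    return p_value > epsilon

# Toy usage with hypothetical scores: a candidate whose score is low
# relative to the calibration scores is retained in the class region.
cal = [0.1, 0.3, 0.5, 0.7, 0.9]
print(in_region(cal, 0.2, 0.1))   # low non-conformity -> included
print(in_region(cal, 0.95, 0.5))  # high non-conformity -> excluded
```

Under exchangeability, this construction guarantees that the true class is excluded with probability at most ε, which is what makes the choice of non-conformity function a question of efficiency (region size) rather than validity.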
References
Asuncion, A., Newman, D.J.: UCI machine learning repository (2007)
Bishop, C.M.: Pattern Recognition and Machine Learning. Springer, Heidelberg (2006)
Caprile, B., Furlanello, C., Merler, S.: Highlighting hard patterns via AdaBoost weights evolution. In: Roli, F., Kittler, J. (eds.) MCS 2002. LNCS (LNAI), vol. 2364, pp. 72–80. Springer, Heidelberg (2002)
Freund, Y., Schapire, R.E.: Experiments with a new boosting algorithm. In: International Conference on Machine Learning, pp. 148–156 (1996)
Hastie, T., Tibshirani, R., Friedman, J.: The Elements of Statistical Learning: Data Mining, Inference, and Prediction. Springer, Heidelberg (2002)
Holte, R.C.: Very simple classification rules perform well on most commonly used datasets. Machine Learning 11, 63–91 (1993)
Niculescu-Mizil, A., Caruana, R.: Obtaining calibrated probabilities from boosting. In: Proceedings of the 21st Conference in Uncertainty in Artificial Intelligence, Edinburgh, Scotland, July 26-29, pp. 413–418 (2005)
Proedrou, K.: Rigorous Measures of Confidence for Pattern Recognition and Regression. PhD thesis, Royal Holloway College, University of London, UK (2003)
Rätsch, G., Onoda, T., Müller, K.-R.: Soft margins for AdaBoost. Machine Learning 42(3), 287–320 (2001)
Schapire, R.: The boosting approach to machine learning: An overview. In: MSRI Workshop on Nonlinear Estimation and Classification, Berkeley, CA (March 2001)
Shafer, G., Vovk, V.: A tutorial on conformal prediction. Journal of Machine Learning Research 9, 371–421 (2008)
Smirnov, E.N., Vanderlooy, S., Sprinkhuizen-Kuyper, I.G.: Meta-typicalness approach to reliable classification. In: Proceedings of the 17th European Conference on Artificial Intelligence, Riva del Garda, Italy, August 28 - September 1, pp. 810–811. IOS Press, Amsterdam (2006)
Vanderlooy, S., van der Maaten, L., Sprinkhuizen-Kuyper, I.: Off-line learning with transductive confidence machines: an empirical evaluation. In: Perner, P. (ed.) MLDM 2007. LNCS (LNAI), vol. 4571, pp. 310–323. Springer, Heidelberg (2007)
Vovk, V., Gammerman, A., Shafer, G.: Algorithmic learning in a random world. Springer, Heidelberg (2005)
Zadrozny, B., Elkan, C.: Obtaining calibrated probability estimates from decision trees and naive Bayesian classifiers. In: Proceedings of the 18th International Conference on Machine Learning, pp. 609–616. Morgan Kaufmann, San Francisco (2001)
© 2009 Springer-Verlag Berlin Heidelberg
Moed, M., Smirnov, E.N. (2009). Efficient AdaBoost Region Classification. In: Perner, P. (ed.) Machine Learning and Data Mining in Pattern Recognition. MLDM 2009. Lecture Notes in Computer Science, vol. 5632. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-03070-3_10
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-03069-7
Online ISBN: 978-3-642-03070-3