Abstract
Conformal predictors, introduced by [13], build prediction sets by exploiting a notion of conformity of a new data point with previously observed data. In the classification setting, conformal predictors naturally address the problem of classification with reject option. In the present paper, we propose a novel method for constructing confidence sets, inspired both by conformal prediction and by classification with reject option. An important property of these confidence sets is that, when there are several observations to label, they control the proportion of the data that we choose to label. Moreover, we introduce a notion of risk adapted to classification with reject option, and we show that, for this risk, the risk of our confidence set converges to the risk of the confidence set based on the Bayes classifier.
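To fix ideas, the following is a minimal illustrative sketch (not the authors' construction) of how a conformal-style prediction set can encode a reject option in binary classification: nonconformity scores on a hypothetical calibration set determine a threshold, and the prediction set for a new point contains every label whose score falls below it. A set containing both labels amounts to a rejection. All names and the toy model below are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def prob_class1(x):
    # Stand-in for an estimated conditional probability P(Y=1 | X=x);
    # here a logistic model, purely for illustration.
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical calibration data: features and labels drawn from the model.
x_cal = rng.normal(size=200)
y_cal = (rng.random(200) < prob_class1(x_cal)).astype(int)

# Nonconformity score: 1 minus the estimated probability of the true label.
p1 = prob_class1(x_cal)
scores = np.where(y_cal == 1, 1.0 - p1, p1)

alpha = 0.1  # target miscoverage level
k = int(np.ceil((len(scores) + 1) * (1 - alpha)))
threshold = np.sort(scores)[min(k, len(scores)) - 1]

def prediction_set(x):
    # Keep every label whose nonconformity score is at most the threshold.
    p = prob_class1(x)
    return [y for y, s in ((0, p), (1, 1.0 - p)) if s <= threshold]

# A point near the decision boundary typically receives both labels
# (a rejection), while a clearly classified point receives a singleton.
print(prediction_set(0.0), prediction_set(4.0))
```

The split-conformal threshold guarantees, under exchangeability, that the set contains the true label with probability at least 1 - alpha; the size of the set then plays the role of the reject option.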
References
Audibert, J.Y., Tsybakov, A.: Fast learning rates for plug-in classifiers. Ann. Statist. 35(2), 608–633 (2007)
Bartlett, P., Wegkamp, M.: Classification with a reject option using a hinge loss. J. Mach. Learn. Res. 9, 1823–1840 (2008)
Chow, C.K.: On optimum error and reject trade-off. IEEE Transactions on Information Theory 16, 41–46 (1970)
Dvoretzky, A., Kiefer, J., Wolfowitz, J.: Asymptotic minimax character of the sample distribution function and of the classical multinomial estimator. Ann. Math. Statist. 27, 642–669 (1956)
Freund, Y., Mansour, Y., Schapire, R.: Generalization bounds for averaged classifiers. Ann. Statist. 32(4), 1698–1722 (2004)
Györfi, L., Kohler, M., Krzyżak, A., Walk, H.: A distribution-free theory of nonparametric regression. Springer Series in Statistics. Springer, New York (2002)
Grandvalet, Y., Rakotomamonjy, A., Keshet, J., Canu, S.: Support Vector Machines with a Reject Option. In: Advances in Neural Information Processing Systems (NIPS 2008), vol. 21, pp. 537–544. MIT Press (2009)
Herbei, R., Wegkamp, M.: Classification with reject option. Canad. J. Statist. 34(4), 709–721 (2006)
Massart, P.: The tight constant in the Dvoretzky-Kiefer-Wolfowitz inequality. Ann. Probab. 18(3), 1269–1283 (1990)
Nadeem, M., Zucker, J.D., Hanczar, B.: Accuracy-Rejection Curves (ARCs) for Comparing Classification Methods with a Reject Option. In: MLSB, pp. 65–81 (2010)
Vapnik, V.: Statistical learning theory. Adaptive and Learning Systems for Signal Processing, Communications, and Control. John Wiley & Sons Inc., New York (1998)
Vovk, V., Gammerman, A., Saunders, C.: Machine-learning applications of algorithmic randomness. In: Proceedings of the 16th International Conference on Machine Learning, pp. 444–453 (1999)
Vovk, V., Gammerman, A., Shafer, G.: Algorithmic learning in a random world. Springer, New York (2005)
Wang, J., Shen, X., Pan, W.: On transductive support vector machines. In: Prediction and discovery, Contemp. Math., Amer. Math. Soc., Providence, RI, vol. 443, pp. 7–19 (2007)
Wegkamp, M., Yuan, M.: Support vector machines with a reject option. Bernoulli 17(4), 1368–1385 (2011)
Copyright information
© 2015 Springer International Publishing Switzerland
Cite this paper
Denis, C., Hebiri, M. (2015). Confidence Sets for Classification. In: Gammerman, A., Vovk, V., Papadopoulos, H. (eds) Statistical Learning and Data Sciences. SLDS 2015. Lecture Notes in Computer Science, vol. 9047. Springer, Cham. https://doi.org/10.1007/978-3-319-17091-6_25
Print ISBN: 978-3-319-17090-9
Online ISBN: 978-3-319-17091-6