Abstract
In this paper we propose a new algorithm, implemented using Support Vector machines, that provides confidence and credibility values for predictions in multi-class pattern recognition problems. Previously proposed algorithms for this task are computationally intensive and practical only for small data sets. We present a method which overcomes these limitations and can deal with larger data sets, such as the US Postal Service database. The confidence and credibility measures given by the algorithm are shown empirically to reflect the quality of its predictions, and are comparable to those given by the less computationally efficient method. In addition, the overall performance of the algorithm is shown to be comparable to that of other techniques (such as standard Support Vector machines), which give only flat predictions and do not provide the extra confidence/credibility measures.
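To make the confidence/credibility terminology concrete, the sketch below shows how such measures are conventionally derived in the transductive framework this work builds on: each candidate label for a test example is assigned a p-value, credibility is taken as the largest p-value, and confidence as one minus the second largest. This is an illustrative Python sketch, not the paper's implementation; the function name and the example p-values are assumptions, and the SVM-based computation of the p-values themselves is not shown.

def confidence_and_credibility(p_values):
    """Given a dict mapping candidate labels to p-values, return the
    predicted label together with its confidence and credibility."""
    # Rank candidate labels by p-value, largest first.
    ranked = sorted(p_values.items(), key=lambda kv: kv[1], reverse=True)
    (best_label, best_p), (_, second_p) = ranked[0], ranked[1]
    prediction = best_label        # label with the largest p-value
    credibility = best_p           # how well the best label conforms to the data
    confidence = 1.0 - second_p    # how strongly all alternative labels are rejected
    return prediction, confidence, credibility

# Hypothetical example: a 10-class digit problem where label "7" fits best.
p_vals = {str(d): 0.01 for d in range(10)}
p_vals["7"], p_vals["1"] = 0.85, 0.04
print(confidence_and_credibility(p_vals))   # e.g. ('7', 0.96, 0.85)

Under this reading, a high confidence with low credibility flags a test example that fits no label well, which is the kind of diagnostic information a flat prediction cannot convey.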
Copyright information
© 2000 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Saunders, C., Gammerman, A., Vovk, V. (2000). Computationally Efficient Transductive Machines. In: Arimura, H., Jain, S., Sharma, A. (eds) Algorithmic Learning Theory. ALT 2000. Lecture Notes in Computer Science, vol 1968. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-40992-0_25
DOI: https://doi.org/10.1007/3-540-40992-0_25
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-41237-3
Online ISBN: 978-3-540-40992-2