Abstract
In this paper, we prove a general leave-one-out style cross-validation bound for kernel methods. We apply this bound to some classification and regression problems, and compare the results with previously known bounds. One aspect of our analysis is that the derived expected generalization bounds reflect both approximation (bias) and learning (variance) properties of the underlying kernel methods. We are thus able to demonstrate the universality of certain learning formulations.
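To make the object of study concrete: a leave-one-out cross-validation estimate for a kernel method is computed by removing one sample at a time, refitting, and measuring the error on the held-out point. The sketch below is not the paper's bound itself, only an illustration of the leave-one-out procedure applied to kernel ridge regression; the RBF kernel, the regularization parameter `lam`, and the synthetic data are assumptions chosen for the example.

```python
import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    # Gaussian (RBF) kernel matrix between the rows of X and Z.
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def loo_error(X, y, lam=0.1, gamma=1.0):
    # Leave-one-out squared error of kernel ridge regression:
    # for each i, fit on all points except i, then predict y_i.
    n = len(y)
    errs = []
    for i in range(n):
        mask = np.arange(n) != i
        K = rbf_kernel(X[mask], X[mask], gamma)
        # Ridge solution: alpha = (K + lam I)^{-1} y on the n-1 kept points.
        alpha = np.linalg.solve(K + lam * np.eye(n - 1), y[mask])
        k_i = rbf_kernel(X[i:i + 1], X[mask], gamma)[0]
        errs.append((k_i @ alpha - y[i]) ** 2)
    return float(np.mean(errs))

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(30, 1))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.normal(size=30)
print(loo_error(X, y))
```

Bounds of the kind the abstract describes control the expectation of this leave-one-out error, which is what ties the generalization guarantee to both the bias and the variance of the kernel estimator.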
© 2001 Springer-Verlag Berlin Heidelberg
Zhang, T. (2001). A Leave-One-out Cross Validation Bound for Kernel Methods with Applications in Learning. In: Helmbold, D., Williamson, B. (eds) Computational Learning Theory. COLT 2001. Lecture Notes in Computer Science, vol 2111. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-44581-1_28
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-42343-0
Online ISBN: 978-3-540-44581-4