
A Leave-One-out Cross Validation Bound for Kernel Methods with Applications in Learning

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 2111)

Abstract

In this paper, we prove a general leave-one-out style cross-validation bound for kernel methods. We apply this bound to some classification and regression problems and compare the results with previously known bounds. One aspect of our analysis is that the derived expected generalization bounds reflect both the approximation (bias) and learning (variance) properties of the underlying kernel methods. We are thus able to demonstrate the universality of certain learning formulations.
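
The leave-one-out estimate that such a bound controls is easy to state concretely. The following is a minimal sketch, assuming kernel ridge regression with a Gaussian (RBF) kernel as the underlying kernel method; the bandwidth gamma, regularization constant lam, and the synthetic data are arbitrary choices for illustration and are not taken from the paper.

    import numpy as np

    def rbf_kernel(X, Z, gamma=1.0):
        # Gaussian kernel matrix: K[i, j] = exp(-gamma * ||x_i - z_j||^2)
        sq = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)
        return np.exp(-gamma * sq)

    def loo_squared_error(X, y, gamma=1.0, lam=0.1):
        # Average squared leave-one-out error of kernel ridge regression:
        # for each i, fit on all points except (x_i, y_i), test on (x_i, y_i).
        n = len(y)
        errors = []
        for i in range(n):
            keep = np.arange(n) != i
            K = rbf_kernel(X[keep], X[keep], gamma)
            # Ridge solution on the reduced sample:
            # alpha = (K + lam * (n - 1) * I)^{-1} y
            alpha = np.linalg.solve(K + lam * (n - 1) * np.eye(n - 1), y[keep])
            k_i = rbf_kernel(X[i:i + 1], X[keep], gamma)[0]
            errors.append((k_i @ alpha - y[i]) ** 2)
        return float(np.mean(errors))

    rng = np.random.default_rng(0)
    X = rng.uniform(-1.0, 1.0, size=(40, 1))
    y = np.sin(3.0 * X[:, 0]) + 0.1 * rng.standard_normal(40)
    print(f"leave-one-out mean squared error: {loo_squared_error(X, y):.4f}")

Note that computing the estimate this way retrains the model once per sample, so it costs n fits on n - 1 points each.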




Copyright information

© 2001 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Zhang, T. (2001). A Leave-One-out Cross Validation Bound for Kernel Methods with Applications in Learning. In: Helmbold, D., Williamson, B. (eds) Computational Learning Theory. COLT 2001. Lecture Notes in Computer Science (LNAI), vol 2111. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-44581-1_28


  • DOI: https://doi.org/10.1007/3-540-44581-1_28


  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-42343-0

  • Online ISBN: 978-3-540-44581-4

