
A Leave-K-Out Cross-Validation Scheme for Unsupervised Kernel Regression

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 4132)

Abstract

We show how to employ leave-K-out cross-validation in Unsupervised Kernel Regression, a recent method for learning nonlinear manifolds. This generalizes an existing regularization scheme, yielding more flexibility at no additional computational cost. We demonstrate our method on both toy and real data.
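For a concrete picture of the idea the abstract describes, the sketch below computes a leave-K-out reconstruction error for Unsupervised Kernel Regression: the data are reconstructed by a Nadaraya-Watson estimator over optimised latent coordinates, with each point reconstructed only from points outside its own cross-validation block. The function name, the Gaussian kernel choice, and the block encoding are illustrative assumptions, not the authors' implementation.

import numpy as np

def ukr_loo_k_error(Y, X, blocks, h=1.0):
    # Leave-K-out UKR reconstruction error (hedged sketch).
    #   Y      : (N, d) observed data
    #   X      : (q, N) latent coordinates (optimised elsewhere)
    #   blocks : length-N integer labels; each point is reconstructed
    #            without the points that share its block
    #   h      : kernel bandwidth
    # Pairwise squared distances in latent space, Gaussian kernel.
    D2 = np.sum((X[:, :, None] - X[:, None, :]) ** 2, axis=0)
    K = np.exp(-D2 / (2.0 * h ** 2))
    # Zero out same-block contributions; this also removes the
    # diagonal, so singleton blocks recover ordinary leave-one-out.
    mask = blocks[:, None] != blocks[None, :]
    Km = K * mask
    # Row-normalise to Nadaraya-Watson weights and reconstruct.
    P = Km / np.maximum(Km.sum(axis=1, keepdims=True), 1e-12)
    Y_hat = P @ Y
    return np.mean(np.sum((Y - Y_hat) ** 2, axis=1))

# Illustrative usage: noisy circle, 1-D latent space, blocks of K = 5.
N = 200
t = np.linspace(0.0, 1.0, N)
Y = np.c_[np.cos(2 * np.pi * t), np.sin(2 * np.pi * t)]
Y += 0.05 * np.random.randn(N, 2)
X = t[None, :]
blocks = np.arange(N) // 5
err = ukr_loo_k_error(Y, X, blocks, h=0.05)

With blocks = np.arange(N), every block is a singleton and the expression reduces to the standard leave-one-out UKR error; coarser blockings exclude more neighbours from each reconstruction, which appears to be the added flexibility the abstract refers to.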




Copyright information

© 2006 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Klanke, S., Ritter, H. (2006). A Leave-K-Out Cross-Validation Scheme for Unsupervised Kernel Regression. In: Kollias, S., Stafylopatis, A., Duch, W., Oja, E. (eds.) Artificial Neural Networks – ICANN 2006. Lecture Notes in Computer Science, vol. 4132. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11840930_44


  • DOI: https://doi.org/10.1007/11840930_44

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-38871-5

  • Online ISBN: 978-3-540-38873-9

  • eBook Packages: Computer Science, Computer Science (R0)
