Abstract
We show how to employ leave-K-out cross-validation in Unsupervised Kernel Regression, a recent method for learning nonlinear manifolds. We thereby generalize an existing regularization method, yielding more flexibility without additional computational cost. We demonstrate our method on both toy and real data.
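The core idea can be illustrated with a minimal sketch. In Unsupervised Kernel Regression, each observed sample is reconstructed from the others via a Nadaraya-Watson estimator over latent coordinates; leave-one-out cross-validation regularizes this by excluding a point from its own reconstruction, and leave-K-out generalizes this to excluding the K nearest latent neighbors. The following is a hypothetical illustration, not the authors' implementation; the function name, the Gaussian kernel, and the `k_out` parameter are assumptions for the sketch.

```python
import numpy as np

def leave_k_out_reconstruction(X, Y, bandwidth=1.0, k_out=1):
    """Nadaraya-Watson reconstruction of each row of Y from the others.

    X : (n, q) latent coordinates, Y : (n, d) observed data.
    For each sample, the k_out nearest latent points (including the
    point itself) are excluded from the kernel average, which is the
    leave-K-out idea sketched here (illustrative, not the paper's code).
    """
    n = X.shape[0]
    # Pairwise squared distances in latent space.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    W = np.exp(-0.5 * d2 / bandwidth ** 2)
    # Zero the weights of the k_out nearest latent points per row
    # (the point itself has distance 0 and is always among them).
    nearest = np.argsort(d2, axis=1)[:, :k_out]
    for i in range(n):
        W[i, nearest[i]] = 0.0
    W /= W.sum(axis=1, keepdims=True)  # normalize remaining weights
    return W @ Y

def reconstruction_error(X, Y, bandwidth=1.0, k_out=1):
    """Leave-K-out cross-validation error, the quantity one would
    minimize over the latent coordinates X in this sketch."""
    R = leave_k_out_reconstruction(X, Y, bandwidth, k_out)
    return ((Y - R) ** 2).sum()
```

With `k_out=1` this reduces to ordinary leave-one-out cross-validation; larger `k_out` removes more neighbors and thus regularizes more strongly, at no extra computational cost beyond masking kernel weights.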
Copyright information
© 2006 Springer-Verlag Berlin Heidelberg
Cite this paper
Klanke, S., Ritter, H. (2006). A Leave-K-Out Cross-Validation Scheme for Unsupervised Kernel Regression. In: Kollias, S., Stafylopatis, A., Duch, W., Oja, E. (eds) Artificial Neural Networks – ICANN 2006. ICANN 2006. Lecture Notes in Computer Science, vol 4132. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11840930_44
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-38871-5
Online ISBN: 978-3-540-38873-9