Abstract
Embedding high-dimensional patterns in low-dimensional latent spaces is a challenging task. In this paper, we introduce re-sampling strategies for leaving local optima in the data space reconstruction error (DSRE) minimization process of unsupervised kernel regression (UKR). To this end, we concentrate on a hybrid UKR variant that combines iterative solution construction with gradient-descent-based optimization. Patterns with high reconstruction errors are removed from the manifold and re-sampled via Gaussian sampling. The re-sampling variants differ in the pattern reconstruction errors they consider, the number of re-sampled patterns, and their termination conditions. The re-sampling process with UKR can also improve ISOMAP embeddings. Experiments on typical benchmark data sets illustrate the capability of these strategies to leave local optima.
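The core quantities from the abstract can be sketched in a few lines: a Nadaraya-Watson reconstruction of the patterns from their latent positions, the resulting DSRE, and a Gaussian re-sampling step that replaces the latent positions of the worst-reconstructed patterns. This is a minimal illustration, not the paper's implementation: the leave-one-out weighting, the fixed `bandwidth`, and the choice to centre the Gaussian on the latent position of a pattern's nearest data-space neighbour are all assumptions made here for concreteness.

```python
import numpy as np

def ukr_reconstruct(X, Y, bandwidth=0.5):
    """Nadaraya-Watson reconstruction of patterns Y from latent positions X.

    X: (N, q) latent coordinates, Y: (N, d) high-dimensional patterns.
    Leave-one-out weights exclude the trivial self-reconstruction
    (a common choice in UKR variants; bandwidth is a free parameter here).
    """
    diff = X[:, None, :] - X[None, :, :]                  # (N, N, q)
    K = np.exp(-0.5 * (diff ** 2).sum(axis=2) / bandwidth ** 2)
    np.fill_diagonal(K, 0.0)                              # leave-one-out
    W = K / np.maximum(K.sum(axis=1, keepdims=True), 1e-12)
    return W @ Y                                          # (N, d)

def dsre(X, Y, bandwidth=0.5):
    """Total and per-pattern data space reconstruction error."""
    residual = Y - ukr_reconstruct(X, Y, bandwidth)
    per_pattern = (residual ** 2).sum(axis=1)
    return per_pattern.sum(), per_pattern

def gaussian_resample(X, Y, n_resample=3, sigma=0.1, bandwidth=0.5, rng=None):
    """Re-sample latent positions of the worst-reconstructed patterns.

    The n_resample patterns with the highest per-pattern DSRE receive new
    latent positions drawn from a Gaussian centred on the latent position
    of their nearest data-space neighbour. This centring is one plausible
    reading of "Gaussian sampling"; the paper's exact scheme may differ.
    """
    rng = np.random.default_rng() if rng is None else rng
    X = X.copy()
    _, errors = dsre(X, Y, bandwidth)
    worst = np.argsort(errors)[-n_resample:]
    for i in worst:
        d = ((Y - Y[i]) ** 2).sum(axis=1)
        d[i] = np.inf
        j = int(np.argmin(d))                             # nearest data-space neighbour
        X[i] = rng.normal(loc=X[j], scale=sigma)          # Gaussian re-sampling
    return X
```

In a full optimization loop one would alternate such re-sampling steps with gradient descent on the DSRE, accepting (or sometimes rejecting) the re-sampled configuration according to the termination conditions the paper varies across its strategies.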
Copyright information
© 2014 Springer International Publishing Switzerland
Cite this paper
Lückehe, D., Kramer, O. (2014). Leaving Local Optima in Unsupervised Kernel Regression. In: Wermter, S., et al. Artificial Neural Networks and Machine Learning – ICANN 2014. ICANN 2014. Lecture Notes in Computer Science, vol 8681. Springer, Cham. https://doi.org/10.1007/978-3-319-11179-7_18
Print ISBN: 978-3-319-11178-0
Online ISBN: 978-3-319-11179-7