Leaving Local Optima in Unsupervised Kernel Regression

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 8681)

Abstract

Embedding high-dimensional patterns in low-dimensional latent spaces is a challenging task. In this paper, we introduce re-sampling strategies for leaving local optima during minimization of the data space reconstruction error (DSRE) in unsupervised kernel regression (UKR). To this end, we concentrate on a hybrid UKR variant that combines iterative solution construction with gradient-descent-based optimization. Patterns with high reconstruction errors are removed from the manifold and re-sampled via Gaussian sampling. The re-sampling variants differ in the pattern reconstruction errors they consider, the number of patterns they re-sample, and their termination conditions. The re-sampling process with UKR can also improve ISOMAP embeddings. Experiments on typical benchmark data sets illustrate the capability of these strategies to leave local optima.
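The core idea of the abstract can be sketched in a few lines: reconstruct each pattern with a Nadaraya-Watson estimator over latent points, identify the patterns with the highest reconstruction error, and re-sample their latent positions from a Gaussian. The following is a minimal sketch, not the authors' implementation; the function names, the leave-one-out kernel, the bandwidth `h`, the spread `sigma`, and the accept-if-DSRE-improves rule are illustrative assumptions.

```python
import numpy as np

def dsre(X, Y, h=0.5):
    """Per-pattern data space reconstruction errors of a UKR manifold.
    X: latent positions (n, q); Y: observed patterns (n, d)."""
    # Gaussian kernel matrix between latent points
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-d2 / (2.0 * h ** 2))
    # leave-one-out: exclude the trivial self-reconstruction
    np.fill_diagonal(K, 0.0)
    # Nadaraya-Watson reconstruction of every pattern
    F = (K / K.sum(axis=1, keepdims=True)) @ Y
    return ((Y - F) ** 2).sum(axis=1)

def resample_worst(X, Y, k=3, sigma=0.1, rng=None):
    """Re-sample the latent positions of the k patterns with the highest
    reconstruction error from a Gaussian around the latent mean; keep the
    move only if the total DSRE decreases (a hypothetical acceptance rule)."""
    rng = np.random.default_rng() if rng is None else rng
    errs = dsre(X, Y)
    worst = np.argsort(errs)[-k:]
    X_new = X.copy()
    X_new[worst] = X.mean(axis=0) + sigma * rng.standard_normal((k, X.shape[1]))
    if dsre(X_new, Y).sum() < errs.sum():
        return X_new  # accepted: re-sampling left the local optimum
    return X          # rejected: keep the current embedding
```

In a full pipeline, such a re-sampling step would alternate with gradient-descent DSRE minimization and could be repeated until a termination condition (e.g., a fixed number of rejected moves) is met.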





Copyright information

© 2014 Springer International Publishing Switzerland

About this paper

Cite this paper

Lückehe, D., Kramer, O. (2014). Leaving Local Optima in Unsupervised Kernel Regression. In: Wermter, S., et al. Artificial Neural Networks and Machine Learning – ICANN 2014. ICANN 2014. Lecture Notes in Computer Science, vol 8681. Springer, Cham. https://doi.org/10.1007/978-3-319-11179-7_18

  • DOI: https://doi.org/10.1007/978-3-319-11179-7_18

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-11178-0

  • Online ISBN: 978-3-319-11179-7

  • eBook Packages: Computer Science, Computer Science (R0)
