On Evolutionary Approaches to Unsupervised Nearest Neighbor Regression

  • Conference paper
Applications of Evolutionary Computation (EvoApplications 2012)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 7248)

Abstract

The detection of structures in high-dimensional data plays an important role in machine learning. Recently, we proposed a fast iterative strategy for non-linear dimensionality reduction based on an unsupervised formulation of K-nearest neighbor regression. As the unsupervised nearest neighbor (UNN) optimization problem does not allow the computation of derivatives, the use of direct search methods is reasonable. In this paper we introduce evolutionary optimization approaches for learning UNN embeddings. Two continuous variants are based on the CMA-ES and employ regularization, either by restricting the latent domain or by penalizing the extension in latent space. A combinatorial variant embeds the latent variables on a grid and performs stochastic swaps. We compare the results on artificial dimensionality reduction problems.
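
As a concrete illustration of the UNN objective and of the combinatorial variant, the following is a minimal Python/NumPy sketch, not the paper's exact implementation: each pattern is reconstructed as the mean of the K patterns whose latent positions are nearest to its own, the latent positions live on a one-dimensional grid, and random swaps are accepted whenever they do not increase the reconstruction error. All names and parameters are illustrative; the continuous CMA-ES variants would instead optimize real-valued latent positions of the same error, regularized by restricting the latent domain or penalizing its extension.

```python
# Hypothetical sketch of unsupervised nearest neighbor (UNN) regression with a
# combinatorial swap search. Illustrative only, not the authors' exact algorithm.
import numpy as np


def unn_error(Y, latent, K=2):
    """Data-space reconstruction error of a latent embedding.

    Y      : (N, d) array of high-dimensional patterns
    latent : (N,) latent coordinates (here: positions on a 1-D grid)
    K      : number of nearest neighbors in latent space

    Each y_i is reconstructed as the mean of the K patterns whose latent
    positions are closest to latent[i] (leaving out i itself).
    """
    N = len(Y)
    err = 0.0
    for i in range(N):
        dist = np.abs(latent - latent[i])
        dist[i] = np.inf                       # leave-one-out
        nn = np.argsort(dist)[:K]              # K nearest latent neighbors
        err += np.sum((Y[i] - Y[nn].mean(axis=0)) ** 2)
    return err


def unn_swap_search(Y, K=2, iters=2000, seed=0):
    """Combinatorial variant: assign the N patterns to N grid positions at
    random, then accept random pairwise swaps that do not increase the error."""
    rng = np.random.default_rng(seed)
    N = len(Y)
    latent = rng.permutation(N).astype(float)  # grid positions 0..N-1
    best = unn_error(Y, latent, K)
    for _ in range(iters):
        i, j = rng.choice(N, size=2, replace=False)
        latent[i], latent[j] = latent[j], latent[i]
        e = unn_error(Y, latent, K)
        if e <= best:
            best = e
        else:                                  # undo the swap
            latent[i], latent[j] = latent[j], latent[i]
    return latent, best


if __name__ == "__main__":
    # Toy dimensionality reduction problem: noisy points along a curve.
    rng = np.random.default_rng(1)
    t = np.sort(rng.uniform(0, 3 * np.pi, 60))
    Y = np.c_[t, np.sin(t)] + 0.05 * rng.normal(size=(60, 2))
    latent, err = unn_swap_search(Y, K=2)
    print("final reconstruction error:", err)
```

Accepting only non-increasing swaps makes this a simple hill climber; replacing the hard acceptance rule or the swap operator with other stochastic moves would give related combinatorial search variants.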

Copyright information

© 2012 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Kramer, O. (2012). On Evolutionary Approaches to Unsupervised Nearest Neighbor Regression. In: Di Chio, C., et al. (eds.) Applications of Evolutionary Computation. EvoApplications 2012. Lecture Notes in Computer Science, vol 7248. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-29178-4_35

  • DOI: https://doi.org/10.1007/978-3-642-29178-4_35

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-29177-7

  • Online ISBN: 978-3-642-29178-4

  • eBook Packages: Computer Science, Computer Science (R0)
