Abstract
Detecting structure in high-dimensional data plays an important role in machine learning. Recently, we proposed a fast iterative strategy for non-linear dimensionality reduction based on the unsupervised formulation of K-nearest neighbor (KNN) regression. As the unsupervised nearest neighbor (UNN) optimization problem does not admit the computation of derivatives, direct search methods are a reasonable choice. In this paper we introduce evolutionary optimization approaches for learning UNN embeddings. Two continuous variants are based on the CMA-ES: one regularizes by restricting the latent domain, the other by penalizing extension in latent space. A combinatorial variant embeds the latent variables on a grid and performs stochastic swaps. We compare the results on artificial dimensionality reduction problems.
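To make the combinatorial variant concrete, the following is a minimal sketch of a stochastic swap search over grid embeddings. It is an illustration under stated assumptions, not the paper's implementation: it assumes a one-dimensional latent grid, a leave-one-out KNN reconstruction error measured in data space, and greedy acceptance of improving swaps; the names unn_error and unn_swap_search, and all parameter defaults, are hypothetical.

    import numpy as np

    def unn_error(pos, Y, k=2):
        # Leave-one-out KNN regression error of an embedding: pos[i] is
        # the integer 1-D latent grid coordinate assigned to pattern Y[i];
        # neighborhoods are taken in latent space, errors in data space.
        n = len(Y)
        err = 0.0
        for i in range(n):
            d = np.abs(pos - pos[i]).astype(float)
            d[i] = np.inf                      # leave pattern i out
            nbrs = np.argsort(d)[:k]           # K nearest latent neighbors
            err += np.sum((Y[i] - Y[nbrs].mean(axis=0)) ** 2)
        return err / n

    def unn_swap_search(Y, k=2, iters=5000, seed=0):
        # Combinatorial search: start from a random assignment of patterns
        # to grid positions, then keep swapping two positions whenever the
        # swap lowers the reconstruction error.
        rng = np.random.default_rng(seed)
        n = len(Y)
        pos = rng.permutation(n)               # random initial grid assignment
        best = unn_error(pos, Y, k)
        for _ in range(iters):
            i, j = rng.choice(n, size=2, replace=False)
            pos[i], pos[j] = pos[j], pos[i]    # try swapping two latent slots
            e = unn_error(pos, Y, k)
            if e < best:
                best = e                       # keep an improving swap
            else:
                pos[i], pos[j] = pos[j], pos[i]  # otherwise undo it
        return pos, best

Run on, e.g., a noisy 3-D S-curve sample Y of shape (n, 3), the returned pos orders the patterns along the latent grid so that latent neighbors tend to be data-space neighbors. Note that each swap evaluation here costs O(n^2); the iterative strategy and the continuous CMA-ES variants discussed in the paper address efficiency and continuous latent spaces, respectively.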
Copyright information
© 2012 Springer-Verlag Berlin Heidelberg
Cite this paper
Kramer, O. (2012). On Evolutionary Approaches to Unsupervised Nearest Neighbor Regression. In: Di Chio, C., et al. Applications of Evolutionary Computation. EvoApplications 2012. Lecture Notes in Computer Science, vol 7248. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-29178-4_35
DOI: https://doi.org/10.1007/978-3-642-29178-4_35
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-29177-7
Online ISBN: 978-3-642-29178-4