Abstract
Unsupervised regression is a dimensionality reduction method that allows embedding high-dimensional patterns in low-dimensional latent spaces. In the line of research on iterative unsupervised regression, numerous methodological variants have been proposed in the recent past. This work extends the set of methods with evolutionary embeddings. We propose to use a \((1+\lambda)\)-ES with Rechenberg mutation strength control to iteratively embed patterns and show that the learned manifolds achieve a lower data space reconstruction error than embeddings generated with naive Gaussian sampling. Further, we introduce a hybrid optimization approach that alternates gradient descent and the iterative evolutionary embeddings. Experimental comparisons on artificial test data sets confirm the expectation that the hybrid approach is superior or at least competitive to established methods such as principal component analysis and Hessian locally linear embedding.
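As a minimal sketch of the evolutionary component described above (not the authors' implementation): the function name one_plus_lambda_es, the simplified per-generation variant of Rechenberg's 1/5th success rule, and the generic objective argument are assumptions for illustration; in the paper's setting the objective would be the data space reconstruction error of a candidate latent position.

```python
import numpy as np

def one_plus_lambda_es(objective, x0, lam=10, sigma=1.0, iters=200, a=0.85, rng=None):
    """Minimal (1+lambda)-ES with a simple Rechenberg-style mutation strength control.

    objective: maps a candidate (e.g. a latent position) to an error value to minimize.
    x0:        initial candidate, 1-D array.
    lam:       number of offspring per generation.
    sigma:     initial mutation strength.
    a:         adaptation factor (0 < a < 1) for the success rule.
    """
    rng = np.random.default_rng() if rng is None else rng
    parent = np.asarray(x0, dtype=float)
    parent_fit = objective(parent)
    for _ in range(iters):
        # Gaussian mutation: lam offspring sampled around the parent.
        offspring = parent + sigma * rng.standard_normal((lam, parent.size))
        fits = np.array([objective(o) for o in offspring])
        best = int(fits.argmin())
        success = fits[best] < parent_fit
        if success:
            parent, parent_fit = offspring[best], fits[best]
        # Rechenberg-style control: enlarge sigma after a successful generation,
        # shrink it otherwise (a simplified per-generation 1/5th success rule).
        sigma = sigma / a if success else sigma * a
    return parent, parent_fit
```

For instance, objective could wrap the reconstruction error of the Nadaraya-Watson estimator sketched in the appendix below.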
References
Kramer, O.: Dimensionality reduction by unsupervised nearest neighbor regression. In: International Conference on Machine Learning and Applications (ICMLA), pp. 275–278. IEEE (2011)
Kramer, O.: Unsupervised nearest neighbors with kernels. In: Glimm, B., Krüger, A. (eds.) KI 2012: Advances in Artificial Intelligence. LNCS, vol. 7526, pp. 97–106. Springer, Heidelberg (2012)
Meinicke, P., Klanke, S., Memisevic, R., Ritter, H.: Principal surfaces from unsupervised kernel regression. IEEE Trans. Pattern Anal. Mach. Intell. 27, 1379–1391 (2005)
Smola, A.J., Mika, S., Schölkopf, B., Williamson, R.C.: Regularized principal manifolds. J. Mach. Learn. Res. 1, 179–209 (2001)
Lawrence, N.D.: Probabilistic non-linear principal component analysis with Gaussian process latent variable models. J. Mach. Learn. Res. 6, 1783–1816 (2005)
Tan, S., Mavrovouniotis, M.: Reducing data dimensionality through optimizing neural network inputs. AIChE J. 41, 1471–1479 (1995)
Kramer, O.: A particle swarm embedding algorithm for nonlinear dimensionality reduction. In: Dorigo, M., Birattari, M., Blum, C., Christensen, A.L., Engelbrecht, A.P., Groß, R., Stützle, T. (eds.) ANTS 2012. LNCS, vol. 7461, pp. 1–12. Springer, Heidelberg (2012)
Nourashrafeddin, S., Arnold, D., Milios, E.E.: An evolutionary subspace clustering algorithm for high-dimensional data. In: Proceedings of the Annual Conference on Genetic and Evolutionary Computation (GECCO), pp. 1497–1498 (2012)
Vahdat, A., Heywood, M.I., Zincir-Heywood, A.N.: Bottom-up evolutionary subspace clustering. In: IEEE Congress on Evolutionary Computation, pp. 1–8 (2010)
Nadaraya, E.: On estimating regression. Theory Probab. Appl. 10, 186–190 (1964)
Rechenberg, I.: Cybernetic solution path of an experimental problem. In: Ministry of Aviation, UK, Royal Aircraft Establishment (1965)
Klanke, S., Ritter, H.: Variants of unsupervised kernel regression: general cost functions. Neurocomputing 70, 1289–1303 (2007)
Nocedal, J., Wright, S.J.: Numerical Optimization. Springer, New York (2000)
Jolliffe, I.T.: Principal Component Analysis. Springer Series in Statistics. Springer, New York (1986)
Tenenbaum, J.B., Silva, V.D., Langford, J.C.: A global geometric framework for nonlinear dimensionality reduction. Science 290, 2319–2323 (2000)
Roweis, S.T., Saul, L.K.: Nonlinear dimensionality reduction by locally linear embedding. Science 290, 2323–2326 (2000)
Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, E.: Scikit-learn: machine learning in Python. J. Mach. Learn. Res. 12, 2825–2830 (2011)
A Gradient Descent
HybUKR requires the gradient of the Nadaraya-Watson estimator, which, in its standard form, is defined as

\( \mathbf{f}(\mathbf{x}; \mathbf{X}) = \sum_{i=1}^{N} \mathbf{y}_i \, \frac{K(\mathbf{x} - \mathbf{x}_i)}{\sum_{j=1}^{N} K(\mathbf{x} - \mathbf{x}_j)}, \)

with latent positions \(\mathbf{X} = (\mathbf{x}_1, \ldots, \mathbf{x}_N)\), high-dimensional patterns \(\mathbf{y}_1, \ldots, \mathbf{y}_N\), and kernel function \(K\). For the Gaussian kernel \( K(\mathbf{z}) = \exp\bigl(-\tfrac{\|\mathbf{z}\|^2}{2h^2}\bigr) \) with bandwidth \(h\), the derivative of the kernel function is

\( \frac{\partial K(\mathbf{z})}{\partial \mathbf{z}} = -\frac{\mathbf{z}}{h^2}\, K(\mathbf{z}), \)

so the gradient of \(\mathbf{f}\) with respect to a latent position follows from the quotient rule.
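A minimal numerical sketch of these two quantities follows; it is not the authors' implementation, and the function names, the unnormalized Gaussian kernel (whose normalization cancels in the estimator), and the bandwidth parameter h are assumptions for illustration.

```python
import numpy as np

def nadaraya_watson(x, X, Y, h=1.0):
    """Nadaraya-Watson estimate f(x; X) with a Gaussian kernel.

    x: latent query point, shape (q,)
    X: latent positions,   shape (N, q)
    Y: data-space patterns, shape (N, d)
    h: kernel bandwidth
    """
    diffs = x - X                                        # (N, q)
    k = np.exp(-0.5 * np.sum(diffs**2, axis=1) / h**2)   # Gaussian kernel values, (N,)
    return (k @ Y) / k.sum()                             # kernel-weighted mean, (d,)

def nadaraya_watson_jacobian(x, X, Y, h=1.0):
    """Jacobian df/dx of the estimate above, shape (d, q).

    Uses the Gaussian kernel derivative dK_i/dx = -(x - x_i)/h^2 * K_i
    and the quotient rule on f(x) = sum_i K_i y_i / sum_j K_j.
    """
    diffs = x - X
    k = np.exp(-0.5 * np.sum(diffs**2, axis=1) / h**2)
    s = k.sum()
    f = (k @ Y) / s                                      # f(x; X), shape (d,)
    dk = -(diffs / h**2) * k[:, None]                    # kernel gradients, (N, q)
    return (Y.T @ dk - np.outer(f, dk.sum(axis=0))) / s
```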
Copyright information
© 2015 Springer International Publishing Switzerland
About this paper
Cite this paper
Lückehe, D., Kramer, O. (2015). Alternating Optimization of Unsupervised Regression with Evolutionary Embeddings. In: Mora, A., Squillero, G. (eds.) Applications of Evolutionary Computation. EvoApplications 2015. Lecture Notes in Computer Science, vol. 9028. Springer, Cham. https://doi.org/10.1007/978-3-319-16549-3_38
DOI: https://doi.org/10.1007/978-3-319-16549-3_38
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-16548-6
Online ISBN: 978-3-319-16549-3
eBook Packages: Computer Science (R0)