Synonyms
Non-linear dimension reduction methods
Definition
Dimension reduction refers to the problem of constructing a meaningful low-dimensional representation of high-dimensional data. A dimension reduction technique is generally associated with a map from a high-dimensional input space to a low-dimensional output space. If this map is nonlinear, the technique is called a nonlinear dimension reduction technique.
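To make the distinction concrete, the following minimal sketch (not part of the original entry) contrasts a linear map with a nonlinear one on the classic Swiss-roll data set. It assumes scikit-learn is available; the data set, estimator choices (PCA versus Isomap), and parameter values are illustrative assumptions rather than the entry's prescribed method.

```python
# Hedged illustration: linear vs. nonlinear dimension reduction (assumes scikit-learn).
from sklearn.datasets import make_swiss_roll
from sklearn.decomposition import PCA
from sklearn.manifold import Isomap

# 1,500 points sampled from a 2-D manifold embedded nonlinearly in R^3.
X, color = make_swiss_roll(n_samples=1500, random_state=0)

# Linear dimension reduction: an orthogonal (linear) projection from R^3 to R^2.
X_pca = PCA(n_components=2).fit_transform(X)

# Nonlinear dimension reduction: Isomap learns a nonlinear map from R^3 to R^2
# that approximately preserves geodesic distances along the manifold.
X_iso = Isomap(n_neighbors=10, n_components=2).fit_transform(X)

print(X.shape, X_pca.shape, X_iso.shape)  # (1500, 3) (1500, 2) (1500, 2)
```

In this sketch, the linear projection can only flatten the roll by discarding one coordinate axis, whereas the nonlinear map recovers the two intrinsic coordinates of the underlying manifold.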
Introduction
Dimension reduction is the construction of a meaningful low-dimensional representation of high-dimensional data. Because many real-world applications produce large volumes of high-dimensional data (such as climate patterns, stellar spectra, or gene distributions), dimension reduction is a fundamental problem in many scientific fields. From the perspective of pattern recognition, dimension reduction is an effective means of avoiding the “curse of dimensionality” and improving the computational efficiency of pattern matching.