ABSTRACT
Over the years, many dimensionality reduction algorithms have been proposed to learn the structure of high-dimensional data by linearly or non-linearly transforming it into a low-dimensional space. Some techniques preserve the local structure of the data, while others aim to preserve its global structure. In this paper, we propose a linear dimensionality reduction technique that characterizes both the local and global properties of the data: we first apply the k-means algorithm to the original data, and then find a projection that simultaneously maximizes the between-cluster scatter (a global criterion) and minimizes the within-cluster scatter (a local criterion), thereby preserving both the local and global structure of the data. Low computational complexity and structure preservation are the two main advantages of the proposed technique. Experiments on both artificial and real data sets show the effectiveness of the proposed algorithm in visualization and classification tasks.
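The pipeline described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name `kmeans_discriminant_maps` and all parameter choices are mine, and the simultaneous maximize/minimize objective is approximated here by the standard generalized eigenvalue formulation of the scatter-ratio problem, with a small ridge term added in case the within-cluster scatter is singular.

```python
import numpy as np
from scipy.linalg import eigh
from sklearn.cluster import KMeans

def kmeans_discriminant_maps(X, n_clusters=3, n_components=2):
    """Sketch of the described approach: cluster the data with k-means,
    then project onto directions that spread the cluster means apart
    (between-cluster scatter) while keeping each cluster compact
    (within-cluster scatter). Names and defaults are illustrative."""
    labels = KMeans(n_clusters=n_clusters, n_init=10,
                    random_state=0).fit_predict(X)
    d = X.shape[1]
    mean = X.mean(axis=0)
    Sb = np.zeros((d, d))  # between-cluster scatter (global structure)
    Sw = np.zeros((d, d))  # within-cluster scatter (local structure)
    for c in range(n_clusters):
        Xc = X[labels == c]
        mc = Xc.mean(axis=0)
        diff = (mc - mean)[:, None]
        Sb += len(Xc) * diff @ diff.T
        Sw += (Xc - mc).T @ (Xc - mc)
    # Generalized eigenproblem Sb w = lambda Sw w; regularize Sw so it
    # stays positive definite, then keep the top eigenvectors.
    Sw += 1e-6 * np.eye(d)
    _, evecs = eigh(Sb, Sw)
    W = evecs[:, ::-1][:, :n_components]
    return X @ W, W
```

Because k-means supplies the pseudo-labels, the projection can be computed without any class information, which is what keeps the overall cost low relative to supervised discriminant methods.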
Index Terms
- k-means discriminant maps for data visualization and classification