Abstract
Dimension reduction is an important task in machine learning. Locally Linear Embedding (LLE) and Isometric Mapping (ISOMAP) are two representative manifold learning methods for dimension reduction, but both share a significant shortcoming: each preserves only one specific feature of the underlying dataset after dimension reduction, while ignoring other meaningful features. In this paper, we propose a new method, called Global and Local feature Preserving Embedding (GLPE), to address this problem. GLPE preserves both the neighborhood relationships and the global pairwise distances of high-dimensional datasets. Experiments on both artificial and real-life datasets validate the effectiveness of the proposed method.
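The abstract describes combining the two features that LLE and ISOMAP each preserve alone: local neighborhood reconstructions and global pairwise distances. The full paper is not shown here, so the sketch below is *not* the authors' GLPE formulation; it is one plausible, minimal way to blend the two classic spectral objectives, assuming a weighted combination of the classical-MDS Gram matrix (global distances) and the LLE alignment matrix (local neighborhoods), with `alpha`, `glpe_sketch`, and all helper names being illustrative inventions.

```python
import numpy as np

def pairwise_sq_dists(X):
    # Squared Euclidean distance matrix via the Gram-matrix identity.
    G = X @ X.T
    sq = np.diag(G)
    return np.maximum(sq[:, None] + sq[None, :] - 2 * G, 0.0)

def mds_gram(X):
    # Double-centered matrix B = -1/2 * J D^2 J from classical MDS;
    # its top eigenvectors give an embedding preserving global distances.
    n = X.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n
    return -0.5 * J @ pairwise_sq_dists(X) @ J

def lle_matrix(X, k=6, reg=1e-3):
    # M = (I - W)^T (I - W) from LLE; embeddings with small y^T M y
    # preserve each point's reconstruction from its k nearest neighbors.
    n = X.shape[0]
    D = pairwise_sq_dists(X)
    W = np.zeros((n, n))
    for i in range(n):
        idx = np.argsort(D[i])[1:k + 1]        # k nearest neighbors (skip self)
        Z = X[idx] - X[i]                      # neighborhood centered at x_i
        C = Z @ Z.T
        C += reg * np.trace(C) * np.eye(k)     # regularize for stability
        w = np.linalg.solve(C, np.ones(k))
        W[i, idx] = w / w.sum()                # reconstruction weights sum to 1
    I_W = np.eye(n) - W
    return I_W.T @ I_W

def glpe_sketch(X, d=2, alpha=0.5):
    # Hypothetical combined objective (an assumption, not the paper's method):
    # maximize y^T (alpha * B - (1 - alpha) * M) y, trading off global
    # distance preservation (B) against the local LLE penalty (M).
    A = alpha * mds_gram(X) - (1 - alpha) * lle_matrix(X)
    A = (A + A.T) / 2                          # symmetrize numerically
    vals, vecs = np.linalg.eigh(A)             # eigenvalues in ascending order
    return vecs[:, -d:][:, ::-1]               # top-d eigenvectors as embedding
```

With `alpha=1` the sketch reduces to classical MDS and with `alpha=0` to (unnormalized) LLE, which is the sense in which it interpolates between the two behaviors the abstract contrasts.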
References
Jolliffe, I.T.: Principal Component Analysis. Springer, New York (2002)
Cox, T.F., Cox, M.A.A.: Multidimensional scaling. Chapman and Hall, London (1994)
Roweis, S.T., Saul, L.K.: Nonlinear dimensionality reduction by locally linear embedding. Science 290, 2323–2326 (2000)
Saul, L.K., Roweis, S.T.: Think globally, fit locally: Unsupervised learning of low dimensional manifolds. The Journal of Machine Learning Research 4, 119–155 (2003)
Tenenbaum, J.B., De Silva, V., Langford, J.C.: A global geometric framework for nonlinear dimensionality reduction. Science 290, 2319–2323 (2000)
Belkin, M., Niyogi, P.: Laplacian Eigenmaps for Dimensionality Reduction and Data Representation. Neural Computation 15, 1373–1396 (2003)
Zhang, Z.Y., Zha, H.Y.: Principal Manifolds and Nonlinear Dimension Reduction via Local Tangent Space Alignment. SIAM Journal on Scientific Computing 26, 313–338 (2004)
Weinberger, K.Q., Saul, L.K.: An introduction to nonlinear dimensionality reduction by maximum variance unfolding. In: American Association for Artificial Intelligence (AAAI) (2006)
Donoho, D.L., Grimes, C.E.: Hessian eigenmaps: Locally linear embedding techniques for high-dimensional data. Proceedings of the National Academy of Sciences 100, 5591–5596 (2003)
Coifman, R.R., Lafon, S.: Diffusion Maps. Applied and Computational Harmonic Analysis 21, 5–30 (2006)
Schölkopf, B., Smola, A., Müller, K.-R.: Kernel principal component analysis. In: Gerstner, W., Hasler, M., Germond, A., Nicoud, J.-D. (eds.) ICANN 1997. LNCS, vol. 1327, pp. 583–588. Springer, Heidelberg (1997)
Schölkopf, B., Smola, A., Müller, K.R.: Kernel principal component analysis. In: Advances in Kernel Methods—Support Vector Learning, pp. 327–352 (1999)
Choi, H., Choi, S.: Kernel isomap. Electronics Letters 40, 1612–1613 (2004)
Choi, H., Choi, S.: Robust kernel isomap. Pattern Recognition 40(3), 853–862 (2007)
Wang, G.S.: Properties and construction methods of kernel in support vector machine. Computer Science 33(6), 172–174 (2006)
Ham, J., Lee, D., Mika, S., Schölkopf, B.: A Kernel View of the Dimensionality Reduction of Manifolds. In: Proceedings of the Twenty-First International Conference on Machine Learning, pp. 47–54. ACM, New York (2004)
Copyright information
© 2013 Springer-Verlag Berlin Heidelberg
Cite this paper
Tan, C., Chen, C., Guan, J. (2013). A Nonlinear Dimension Reduction Method with Both Distance and Neighborhood Preservation. In: Wang, M. (ed.) Knowledge Science, Engineering and Management. KSEM 2013. Lecture Notes in Computer Science, vol. 8041. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-39787-5_5
DOI: https://doi.org/10.1007/978-3-642-39787-5_5
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-39786-8
Online ISBN: 978-3-642-39787-5
eBook Packages: Computer Science (R0)