
A Nonlinear Dimension Reduction Method with Both Distance and Neighborhood Preservation

  • Conference paper
Knowledge Science, Engineering and Management (KSEM 2013)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 8041)

Abstract

Dimension reduction is an important task in the field of machine learning. Locally Linear Embedding (LLE) and Isometric Mapping (ISOMAP) are two representative manifold learning methods for dimension reduction. Both methods share a significant shortcoming: after dimension reduction, each preserves only one specific feature of the underlying dataset while ignoring other meaningful features. In this paper, we propose a new method to address this problem, called Global and Local feature Preserving Embedding (GLPE for short). GLPE preserves both the neighborhood relationships and the global pairwise distances of high-dimensional datasets. Experiments on both artificial and real-life datasets validate the effectiveness of the proposed method.
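The full text is paywalled, but the two criteria the abstract names can be illustrated independently of the paper. Below is a minimal NumPy sketch of classical multidimensional scaling (the global-distance-preserving core that ISOMAP applies to geodesic distances), followed by a check that k-nearest-neighbor sets survive the embedding. This is not the paper's GLPE algorithm, whose details are not available in this preview; it only demonstrates the two properties GLPE aims to preserve.

```python
import numpy as np

def classical_mds(D, d=2):
    """Embed points in d dimensions so pairwise distances approximate D.
    This is the distance-preserving step ISOMAP performs; it is NOT GLPE,
    only an illustration of the 'global pairwise distance' criterion."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    B = -0.5 * J @ (D ** 2) @ J              # double-centered Gram matrix
    w, V = np.linalg.eigh(B)                 # eigenvalues, ascending order
    idx = np.argsort(w)[::-1][:d]            # keep the top-d components
    return V[:, idx] * np.sqrt(np.maximum(w[idx], 0.0))

def knn_sets(D, k=5):
    """Index set of each point's k nearest neighbors (excluding itself) --
    the 'neighborhood relationship' criterion from the abstract."""
    return [set(np.argsort(row)[1:k + 1]) for row in D]

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 2))                                # toy 2-D data
D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)  # pairwise dists
Y = classical_mds(D, d=2)                                   # embed from D only
D_hat = np.linalg.norm(Y[:, None, :] - Y[None, :, :], axis=-1)

print(np.allclose(D, D_hat, atol=1e-6))  # are global distances preserved?
print(knn_sets(D) == knn_sets(D_hat))    # are neighborhoods preserved?
```

For genuinely 2-D data, classical MDS recovers the configuration up to a rigid transform, so both checks pass; on a curved manifold the two criteria conflict, which is the tension GLPE is designed to resolve.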



References

  1. Jolliffe, I.T.: Principal Component Analysis. Springer, New York (2002)

  2. Cox, T.F., Cox, M.A.A.: Multidimensional Scaling. Chapman and Hall, London (1994)

  3. Roweis, S.T., Saul, L.K.: Nonlinear dimensionality reduction by locally linear embedding. Science 290, 2323–2326 (2000)

  4. Saul, L.K., Roweis, S.T.: Think globally, fit locally: Unsupervised learning of low dimensional manifolds. The Journal of Machine Learning Research 4, 119–155 (2003)

  5. Tenenbaum, J.B., De Silva, V., Langford, J.C.: A global geometric framework for nonlinear dimensionality reduction. Science 290, 2319–2323 (2000)

  6. Belkin, M., Niyogi, P.: Laplacian eigenmaps for dimensionality reduction and data representation. Neural Computation 15, 1373–1396 (2003)

  7. Zhang, Z.Y., Zha, H.Y.: Principal manifolds and nonlinear dimension reduction via local tangent space alignment. SIAM Journal on Scientific Computing 26(1), 313–338 (2004)

  8. Weinberger, K.Q., Saul, L.K.: An introduction to nonlinear dimensionality reduction by maximum variance unfolding. In: Proceedings of the AAAI Conference on Artificial Intelligence (2006)

  9. Donoho, D.L., Grimes, C.: Hessian eigenmaps: Locally linear embedding techniques for high-dimensional data. Proceedings of the National Academy of Sciences 100, 5591–5596 (2003)

  10. Coifman, R.R., Lafon, S.: Diffusion maps. Applied and Computational Harmonic Analysis 21(1), 5–30 (2006)

  11. Schölkopf, B., Smola, A., Müller, K.-R.: Kernel principal component analysis. In: Gerstner, W., Hasler, M., Germond, A., Nicoud, J.-D. (eds.) ICANN 1997. LNCS, vol. 1327, pp. 583–588. Springer, Heidelberg (1997)

  12. Schölkopf, B., Smola, A., Müller, K.-R.: Kernel principal component analysis. In: Advances in Kernel Methods – Support Vector Learning, pp. 327–352. MIT Press (1999)

  13. Choi, H., Choi, S.: Kernel Isomap. Electronics Letters 40, 1612–1613 (2004)

  14. Choi, H., Choi, S.: Robust kernel Isomap. Pattern Recognition 40(3), 853–862 (2007)

  15. Wang, G.S.: Properties and construction methods of kernel in support vector machine. Computer Science 33(6), 172–174 (2006)

  16. Ham, J., Lee, D.D., Mika, S., Schölkopf, B.: A kernel view of the dimensionality reduction of manifolds. In: Proceedings of the Twenty-First International Conference on Machine Learning (ICML 2004), pp. 47–54 (2004)

  17. http://www.cs.nyu.edu/~roweis/data.html



Copyright information

© 2013 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Tan, C., Chen, C., Guan, J. (2013). A Nonlinear Dimension Reduction Method with Both Distance and Neighborhood Preservation. In: Wang, M. (ed.) Knowledge Science, Engineering and Management. KSEM 2013. Lecture Notes in Computer Science (LNAI), vol 8041. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-39787-5_5

Download citation

  • DOI: https://doi.org/10.1007/978-3-642-39787-5_5

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-39786-8

  • Online ISBN: 978-3-642-39787-5

  • eBook Packages: Computer Science (R0)
