
Stability of Dimensionality Reduction Methods Applied on Artificial Hyperspectral Images

  • Conference paper
Computer Vision and Graphics (ICCVG 2012)

Part of the book series: Lecture Notes in Computer Science (LNIP, volume 7594)


Abstract

Dimensionality reduction is a major challenge in many areas. In this work we address the problem of high-dimensional hyperspectral images, with the aim of preserving their information quality. This paper presents a stability study of non-parametric, unsupervised projection methods and band-selection methods used for dimensionality reduction, under different noise levels and different numbers of data points. Quality criteria based on the norm and on correlation are employed and yield good preservation of these artificial data in the reduced dimensions. The added value of these criteria is illustrated in the evaluation of the reduction's performance when considering the stability of the two categories of methods, band selection and projection. The performance of the methods is verified on artificial data sets for validation. A hybridization of Band Clustering (BandClust) with Multidimensional Scaling (MDS) is proposed in this paper to obtain better stability in dimensionality reduction. Examples are given to demonstrate the originality and relevance of the proposed hybridization (BandClust/MDS) and of the analysis carried out in this paper.
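As a rough illustration of the BandClust/MDS hybridization described in the abstract, the sketch below first groups the spectral bands of an artificial hyperspectral cube by inter-band correlation (a stand-in criterion; the paper's actual BandClust algorithm is not reproduced here), keeps one representative band per cluster, and then projects the band-reduced spectra with classical MDS. The norm- and correlation-based checks at the end are simplified proxies for the quality criteria mentioned in the abstract; the function names, the 0.95 threshold, and the synthetic data are illustrative assumptions, not the authors' implementation.

    # Hypothetical sketch of a BandClust/MDS-style hybridization (not the authors' code).
    import numpy as np

    rng = np.random.default_rng(0)

    # --- Artificial hyperspectral data: n_pixels spectra over n_bands, plus noise ---
    n_pixels, n_bands, noise_level = 200, 60, 0.05
    base = np.sin(np.linspace(0, 3 * np.pi, n_bands))              # smooth spectral shape
    abundances = rng.random((n_pixels, 1))                         # per-pixel scaling
    X = abundances * base + noise_level * rng.standard_normal((n_pixels, n_bands))

    # --- Step 1: correlation-based band clustering (stand-in for BandClust) ---
    corr = np.corrcoef(X.T)                                        # band-to-band correlation
    threshold, clusters, assigned = 0.95, [], set()
    for b in range(n_bands):
        if b in assigned:
            continue
        members = [j for j in range(n_bands)
                   if j not in assigned and corr[b, j] >= threshold]
        assigned.update(members)
        clusters.append(members)
    X_bands = np.column_stack([X[:, m].mean(axis=1) for m in clusters])  # one band per cluster

    # --- Step 2: classical (Torgerson) MDS on the band-reduced spectra ---
    def classical_mds(data, k=3):
        d2 = np.square(np.linalg.norm(data[:, None, :] - data[None, :, :], axis=-1))
        n = d2.shape[0]
        J = np.eye(n) - np.ones((n, n)) / n                        # centering matrix
        B = -0.5 * J @ d2 @ J                                      # double-centred Gram matrix
        vals, vecs = np.linalg.eigh(B)
        order = np.argsort(vals)[::-1][:k]
        return vecs[:, order] * np.sqrt(np.maximum(vals[order], 0.0))

    Y = classical_mds(X_bands, k=3)

    # --- Simplified norm- and correlation-based quality checks ---
    def pairwise(data):
        return np.linalg.norm(data[:, None, :] - data[None, :, :], axis=-1)

    D_orig, D_red = pairwise(X), pairwise(Y)
    iu = np.triu_indices(n_pixels, k=1)
    corr_quality = np.corrcoef(D_orig[iu], D_red[iu])[0, 1]        # distance correlation proxy
    norm_quality = np.linalg.norm(D_orig - D_red) / np.linalg.norm(D_orig)

    print(f"bands: {n_bands} -> {X_bands.shape[1]} -> MDS dimension {Y.shape[1]}")
    print(f"distance correlation ~ {corr_quality:.3f}, relative norm error ~ {norm_quality:.3f}")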




Copyright information

© 2012 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Khoder, J., Younes, R., Ouezdou, F.B. (2012). Stability of Dimensionality Reduction Methods Applied on Artificial Hyperspectral Images. In: Bolc, L., Tadeusiewicz, R., Chmielewski, L.J., Wojciechowski, K. (eds) Computer Vision and Graphics. ICCVG 2012. Lecture Notes in Computer Science, vol 7594. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-33564-8_56


  • DOI: https://doi.org/10.1007/978-3-642-33564-8_56

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-33563-1

  • Online ISBN: 978-3-642-33564-8

  • eBook Packages: Computer Science, Computer Science (R0)
