
Scaling cut criterion-based discriminant analysis for supervised dimension reduction

  • Regular Paper
  • Published in Knowledge and Information Systems

Abstract

Dimension reduction has long been a central problem in many applications of machine learning and pattern recognition. In this paper, scaling cut criterion-based supervised dimension reduction methods are proposed for data analysis. The scaling cut criterion removes the restrictive assumption that the data distribution of each class is homoscedastic Gaussian. To obtain a more reasonable mapping matrix and to reduce computational complexity, a local scaling cut criterion-based dimension reduction is proposed, which exploits a localization strategy on the input data: a localized \(k\)-nearest neighbor graph is introduced that relaxes the within-class variance and enlarges the between-class margin. Moreover, by kernelizing the scaling cut criterion and the local scaling cut criterion, both methods are extended to efficiently model the nonlinear variability of the data. Furthermore, an optimal dimension scaling cut criterion is proposed, which automatically selects the optimal target dimension for the reduction methods. The approaches have been tested on several datasets, and the results show better and more efficient performance than other linear and nonlinear dimension reduction techniques.
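The localized discriminant idea the abstract describes — tightening within-class spread while widening between-class margins over \(k\)-nearest-neighbor pairs, then solving for a projection — can be sketched as follows. This is an illustrative reconstruction under stated assumptions, not the paper's exact formulation: the function name `knn_scatter_dr`, the pairwise-scatter construction, and the generalized-eigenproblem solver are all choices made for the sketch.

```python
import numpy as np
from scipy.linalg import eigh

def knn_scatter_dr(X, y, k=3, d=1):
    """Illustrative localized discriminant reduction (a sketch, not the
    paper's method): accumulate within-class and between-class scatter
    over each point's k nearest same-class / other-class neighbors,
    then take top generalized eigenvectors of (Sb, Sw)."""
    n, D = X.shape
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # pairwise sq. dists
    Sw = np.zeros((D, D))
    Sb = np.zeros((D, D))
    for i in range(n):
        order = np.argsort(sq[i])[1:]                    # skip the point itself
        same = [j for j in order if y[j] == y[i]][:k]    # k nearest same-class
        diff = [j for j in order if y[j] != y[i]][:k]    # k nearest other-class
        for j in same:
            v = (X[i] - X[j])[:, None]
            Sw += v @ v.T                                # within-class scatter
        for j in diff:
            v = (X[i] - X[j])[:, None]
            Sb += v @ v.T                                # between-class scatter
    # maximize between- over within-class scatter via the generalized
    # symmetric eigenproblem Sb w = lambda (Sw + eps I) w
    vals, vecs = eigh(Sb, Sw + 1e-6 * np.eye(D))
    return vecs[:, ::-1][:, :d]                          # top-d eigenvectors

# usage: two compact classes in 2-D, reduced to a single direction
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.1, (20, 2)),
               rng.normal(0.0, 0.1, (20, 2)) + [2.0, 0.0]])
y = np.array([0] * 20 + [1] * 20)
W = knn_scatter_dr(X, y, k=3, d=1)
Z = X @ W                                                # projected data
```

Because the class means differ along the first axis only, the learned direction concentrates on that axis and the two classes stay well separated after projection.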


(Figures 1–8 appear in the full article; no captions are recoverable here.)



Acknowledgments

This work was supported by the National Basic Research Program of China (973 Program) (Grant 2013CB329402), the National Natural Science Foundation of China (Nos. 61272282, 61203303, and 61272279), the Program for New Century Excellent Talents in University (NCET-13-0948), and Fundamental Research Funds for the Central Universities (Grant K50511020011).

Author information


Corresponding author

Correspondence to Xiangrong Zhang.


About this article

Cite this article

Zhang, X., He, Y., Jiao, L. et al. Scaling cut criterion-based discriminant analysis for supervised dimension reduction. Knowl Inf Syst 43, 633–655 (2015). https://doi.org/10.1007/s10115-014-0744-0

