
Semi-supervised dimensionality reduction via sparse locality preserving projection

Published in Applied Intelligence.

Abstract

Dimensionality reduction for unbalanced semi-supervised problems is difficult because labeled samples are scarce. In this paper, we propose a new dimensionality reduction method for the unbalanced semi-supervised problem, called sparse locality preserving projection (SLPP). Previous approaches to semi-supervised dimensionality reduction either discard some unlabeled samples or fail to exploit the implicit discriminant information that unlabeled samples carry. In contrast, SLPP learns the optimal projection matrix by making full use of both the discriminant information and the geometric structure of the unlabeled samples. Specifically, after the number of labeled samples is increased by label propagation, SLPP preserves the geometric structure of the remaining unlabeled samples and their k-nearest neighbors. The resulting optimization problem reduces to a generalized eigenvalue problem that is easy to solve. Results on several data sets from the UCI machine learning repository and on two hyperspectral data sets demonstrate that SLPP is superior to other conventional dimensionality reduction methods.
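As a rough illustration of the machinery the abstract refers to, the sketch below builds a k-nearest-neighbor affinity graph and recovers locality preserving projection directions from a generalized eigenvalue problem. It is not the authors' SLPP objective (their sparsity term and label propagation step are not reproduced), and all function and parameter names (knn_affinity, lpp, k, sigma, reg) are illustrative assumptions.

```python
# Minimal sketch of an LPP-style projection learned from a k-NN affinity graph
# and solved as a generalized eigenvalue problem. This is NOT the paper's SLPP:
# the sparsity constraint and the label propagation step are omitted.
import numpy as np
from scipy.linalg import eigh
from scipy.spatial.distance import cdist

def knn_affinity(X, k=5, sigma=1.0):
    """Symmetric k-NN heat-kernel affinity matrix W."""
    n = X.shape[0]
    D = cdist(X, X, metric="sqeuclidean")
    W = np.zeros((n, n))
    for i in range(n):
        idx = np.argsort(D[i])[1:k + 1]          # k nearest neighbors, skipping the point itself
        W[i, idx] = np.exp(-D[i, idx] / (2 * sigma ** 2))
    return np.maximum(W, W.T)                    # symmetrize

def lpp(X, k=5, sigma=1.0, dim=2, reg=1e-6):
    """Return `dim` locality-preserving projection directions for X (n_samples x n_features)."""
    W = knn_affinity(X, k, sigma)
    Dg = np.diag(W.sum(axis=1))                  # degree matrix
    L = Dg - W                                   # graph Laplacian
    A = X.T @ L @ X                              # distortion of local neighborhoods
    B = X.T @ Dg @ X + reg * np.eye(X.shape[1])  # scale constraint, regularized for stability
    vals, vecs = eigh(A, B)                      # generalized eigenproblem A v = lambda B v
    return vecs[:, :dim]                         # smallest eigenvalues give the projection
```

A call such as Z = X @ lpp(X, k=5, dim=2) would give the low-dimensional embedding; the paper's SLPP additionally enforces sparsity and first enlarges the labeled set by label propagation, neither of which is modeled in this sketch.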




Acknowledgements

This work was supported by the National Natural Science Foundation of China (Nos. 11301535 and 11371365).

Author information


Correspondence to Junyan Tan.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Guo, H., Zou, H. & Tan, J. Semi-supervised dimensionality reduction via sparse locality preserving projection. Appl Intell 50, 1222–1232 (2020). https://doi.org/10.1007/s10489-019-01574-6

