
Unsupervised feature selection based on self-representation sparse regression and local similarity preserving

  • Original Article
  • Published in: International Journal of Machine Learning and Cybernetics

Abstract

Feature selection, as an indispensable data preprocessing method, has attracted considerable attention from researchers. In this paper, we propose a new feature selection model called unsupervised feature selection based on self-representation sparse regression and local similarity preserving (UFSRL). Specifically, UFSRL sparsely reconstructs the original data itself rather than fitting a low-dimensional embedding, and manifold learning is incorporated into the UFSRL model to preserve the local similarity of the data. Moreover, the l2,1/2-matrix norm is imposed on the coefficient matrix, which makes the proposed model sparse and robust to noise. To solve the proposed model, we design an effective iterative algorithm and present an analysis of its convergence. Extensive experiments are conducted on eight synthetic and real-world data sets, and the results of UFSRL are compared with those of six related feature selection algorithms. The experimental results show that UFSRL can effectively identify a discriminative feature subset while reconstructing the data sparsely, and that it outperforms several unsupervised feature selection algorithms in clustering performance.
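The abstract describes the model but not its equations. As one concrete reading, a UFSRL-style objective can be written as below; this is a hedged reconstruction from the abstract alone, so the trade-off parameters α and β, the coefficient matrix W, and the graph Laplacian L are illustrative assumptions rather than the paper's exact notation.

```latex
% One plausible UFSRL-style objective (assumed, not taken from the paper):
% X in R^{n x d} is the data, W in R^{d x d} the self-representation coefficients,
% L = D - S the Laplacian of a neighborhood similarity graph S.
\min_{W \in \mathbb{R}^{d \times d}}
  \|X - XW\|_F^2
  + \alpha \operatorname{Tr}\!\left(W^{\top} X^{\top} L X W\right)
  + \beta \sum_{i=1}^{d} \left\| w^{i} \right\|_2^{1/2}
```

Here w^i denotes the i-th row of W: the first term enforces self-representation (each feature is reconstructed from the others), the trace term preserves local similarity on the reconstructed data, and the last term is the l2,1/2-matrix norm raised to the power 1/2, which drives whole rows of W to zero so that features can be ranked by ||w^i||_2. Because this term is nonconvex and nonsmooth, models of this family are commonly solved by iteratively reweighted least squares; the following is a minimal sketch of that generic scheme under the assumptions above, not the authors' published algorithm.

```python
import numpy as np

def ufsrl_sketch(X, S, alpha=1.0, beta=1.0, n_iter=50, eps=1e-8):
    """Iteratively reweighted solver for the assumed UFSRL-style objective.

    X : (n, d) data matrix; S : (n, n) similarity graph (e.g., k-NN heat kernel).
    Returns the coefficient matrix W and feature indices ranked by row norms of W.
    """
    d = X.shape[1]
    L = np.diag(S.sum(axis=1)) - S                      # unnormalized graph Laplacian
    XtX, XtLX = X.T @ X, X.T @ L @ X
    W = np.linalg.solve(XtX + beta * np.eye(d), XtX)    # ridge-like initialization
    for _ in range(n_iter):
        row_norms = np.maximum(np.linalg.norm(W, axis=1), eps)
        # IRLS weights from the gradient of sum_i ||w^i||_2^{1/2}
        Dw = np.diag(1.0 / (4.0 * row_norms ** 1.5))
        W = np.linalg.solve(XtX + alpha * XtLX + beta * Dw, XtX)
    scores = np.linalg.norm(W, axis=1)                  # rank features by row norms
    return W, np.argsort(scores)[::-1]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((100, 20))
    # Crude dense heat-kernel similarity, for illustration only.
    D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    S = np.exp(-D2 / D2.mean())
    np.fill_diagonal(S, 0.0)
    W, ranking = ufsrl_sketch(X, S)
    print("top-5 feature indices:", ranking[:5])
```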



Acknowledgements

This work was partially supported by the National Natural Science Foundation of China under Grants 61773304 and 61371201, by the National Basic Research Program (973 Program) of China under Grant 2013CB329402, and by the Program for Cheung Kong Scholars and Innovative Research Team in University under Grant IRT_15R53.

Author information

Corresponding author

Correspondence to Ronghua Shang.

About this article

Cite this article

Shang, R., Chang, J., Jiao, L. et al. Unsupervised feature selection based on self-representation sparse regression and local similarity preserving. Int. J. Mach. Learn. & Cyber. 10, 757–770 (2019). https://doi.org/10.1007/s13042-017-0760-y
