
Feature selection using class-level regularized self-representation

Published in Applied Intelligence

Abstract

Feature selection aims to select representative features from an original high-dimensional feature set, and it has drawn much attention in real-world applications such as data mining and pattern recognition. This paper studies the feature selection problem from the viewpoint of feature self-representation. Traditionally, feature self-representation is performed only at the whole-data level, so its feature selection ability is limited by intra-class variations. To address this problem, we propose a new feature selection method, class-level regularized self-representation (CLRSR). In the proposed method, a class-level reconstruction term is designed to reduce the intra-class variations of samples from different categories. By jointly optimizing the whole-level and class-level reconstructions, CLRSR is able to select more discriminative and informative features. Moreover, an iterative algorithm is proposed to minimize the cost function of CLRSR, and its convergence is proven theoretically. Experimental evaluations on six benchmark datasets, in comparison with several state-of-the-art feature selection methods, verify the effectiveness and superiority of CLRSR.
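The idea described in the abstract can be illustrated with a minimal sketch. The code below is NOT the paper's implementation: it assumes a common form of regularized self-representation, minimizing a whole-level reconstruction error ||X - XW||_F^2 plus class-level terms ||X_c - X_c W||_F^2 for each class c and an L2,1 row-sparsity penalty on W, solved by iteratively reweighted least squares; features are then ranked by the row norms of W. The function name `clrsr_scores` and the weights `alpha`, `beta` are illustrative assumptions, not the authors' notation.

```python
import numpy as np

def clrsr_scores(X, y, alpha=1.0, beta=1.0, n_iter=50):
    """Sketch of class-level regularized self-representation (assumed form).

    Objective: ||X - XW||_F^2 + alpha * sum_c ||X_c - X_c W||_F^2
               + beta * ||W||_{2,1},
    minimized by iteratively reweighting the L2,1 term as tr(W^T D W).
    Returns a per-feature importance score (L2 norm of each row of W).
    """
    n, d = X.shape
    # Accumulate the whole-level and class-level Gram matrices.
    G = X.T @ X
    for c in np.unique(y):
        Xc = X[y == c]
        G = G + alpha * (Xc.T @ Xc)
    D = np.eye(d)  # diagonal reweighting matrix for the L2,1 penalty
    for _ in range(n_iter):
        # Closed-form update: (G + beta * D) W = G
        W = np.linalg.solve(G + beta * D, G)
        row_norms = np.sqrt((W ** 2).sum(axis=1)) + 1e-12
        D = np.diag(1.0 / (2.0 * row_norms))
    # Larger row norm -> the feature contributes more to reconstruction.
    return np.sqrt((W ** 2).sum(axis=1))
```

Under this assumed formulation, one would keep the k features with the largest scores; the actual CLRSR cost function, regularization weights, and convergence analysis are given in the paper itself and may differ from this sketch.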



Data Availability

Data and materials will be made available upon reasonable request.

Notes

  1. https://jundongl.github.io/scikit-feature/datasets.html (notes 1–4)

  2. http://archive.ics.uci.edu/ml/datasets.php (notes 5–6)


Acknowledgements

This work was supported in part by the National Natural Science Foundation of China (Nos. 41627804 and 41604130). We gratefully acknowledge this support.

Funding

This work was supported in part by the National Natural Science Foundation of China (Nos. 41627804 and 41604130).

Author information

Authors and Affiliations

Authors

Corresponding author

Correspondence to Zhenghua Lu.

Ethics declarations

The following declarations apply to this paper:

Competing interests

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Ethics approval and consent to participate

This work does not involve any ethical issues.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

Reprints and permissions

About this article


Cite this article

Lu, Z., Chu, Q. Feature selection using class-level regularized self-representation. Appl Intell 53, 13130–13144 (2023). https://doi.org/10.1007/s10489-022-04177-w

