
Unsupervised attribute reduction: improving effectiveness and efficiency

  • Original Article

International Journal of Machine Learning and Cybernetics

Abstract

Attribute reduction has proven effective in improving the performance of classifiers. Unlike the widely studied supervised attribute reduction, unsupervised attribute reduction faces two main challenges: meeting performance requirements and coping with high computational demand. Therefore, both the effectiveness of the selected attributes and the efficiency of searching for a qualified reduct are addressed in solving the problem of unsupervised attribute reduction. Firstly, an ensemble selector is introduced into forward greedy searching; the objective is to identify a more suitable attribute in each iteration of the search. Secondly, both sample-based and attribute-based acceleration mechanisms are introduced into our ensemble selector. The first stage derives a reduct with better performance, and the second stage speeds up the searching procedure. Finally, our approach is compared with several well-established attribute reduction methods over 16 UCI datasets. Comprehensive experiments clearly validate the superiority of our study from the perspectives of both effectiveness and efficiency.
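The abstract's two ideas — an ensemble selector inside forward greedy searching, plus sample- and attribute-based acceleration — can be illustrated with a minimal sketch. This is not the paper's algorithm: the function name, the voting scheme, and the pruning ratio are all illustrative assumptions; the paper's actual measures are rough-set based, whereas here a generic list of scoring functions stands in for the ensemble.

```python
import numpy as np

def ensemble_forward_greedy(X, scorers, k, sample_frac=0.5, rng=None):
    """Hypothetical sketch: forward greedy attribute selection with an
    ensemble of scoring functions plus two accelerations.

    X           : (n_samples, n_attrs) data matrix
    scorers     : list of functions score(X_subset) -> float (higher = better)
    k           : number of attributes to select
    sample_frac : fraction of samples evaluated per iteration
    """
    rng = np.random.default_rng(rng)
    n, m = X.shape
    reduct, candidates = [], list(range(m))
    while candidates and len(reduct) < k:
        # Sample-based acceleration: score on a random subset of rows.
        rows = rng.choice(n, size=max(1, int(sample_frac * n)), replace=False)
        votes = {a: 0 for a in candidates}
        for score in scorers:
            # Each ensemble member votes for the attribute it ranks best
            # when appended to the current reduct.
            best = max(candidates,
                       key=lambda a: score(X[np.ix_(rows, reduct + [a])]))
            votes[best] += 1
        winner = max(votes, key=votes.get)  # majority vote of the ensemble
        reduct.append(winner)
        candidates.remove(winner)
        # Attribute-based acceleration: keep only the better-voted half
        # of the remaining candidates for the next iteration.
        ranked = sorted(candidates, key=votes.get, reverse=True)
        candidates = ranked[: max(1, len(ranked) // 2)]
    return reduct
```

With a single variance-based scorer, the selector first picks the highest-variance column; the candidate pool then shrinks each round, which is what trades a small amount of search breadth for speed.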


Fig. 1



Acknowledgements

This work was supported by the Natural Science Foundation of China (Nos. 62076111, 62006128, 62006099, 61906078), the Key Laboratory of Oceanographic Big Data Mining & Application of Zhejiang Province (No. OBDMA202002, No. OBDMA202104), and the Natural Science Foundation of Jiangsu Provincial Colleges and Universities (No. 20KJB520010).

Corresponding author

Correspondence to Taihua Xu.


Cite this article

Gong, Z., Liu, Y., Xu, T. et al. Unsupervised attribute reduction: improving effectiveness and efficiency. Int. J. Mach. Learn. & Cyber. 13, 3645–3662 (2022). https://doi.org/10.1007/s13042-022-01618-3
