A Q-learning approach to attribute reduction

Abstract

Attribute reduction is a paradigm that covers both the theories and the techniques of selecting required attributes subject to rough-set-related constraints. Although various search strategies have been developed for this purpose, few of them take the rewards of identifying attributes into account. In this study, inspired by the popular reinforcement learning mechanism, a Q-learning based procedure is designed to search for qualified attributes and then construct the expected reduct. Specifically, a state is regarded as the temporary set of selected attributes, and an action is regarded as a variation of that temporary set produced by a random strategy. The reward is then obtained immediately and offers guidance for identifying the attributes with the greatest reward. Moreover, considering the random factors that emerge in our scheme, an ensemble device is also used to further improve the classification performance of the attributes selected in the reduct. Finally, comprehensive experiments over a total of 15 UCI datasets clearly validate the superiority of our study against 5 state-of-the-art approaches.
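
To make the state/action/reward mapping in the abstract concrete, the following is a minimal, illustrative sketch of a Q-learning style reduct search. It is not the authors' implementation: it assumes a classical positive-region dependency measure as the reward signal, an epsilon-greedy policy as the "random strategy", and hypothetical names such as `dependency`, `q_learning_reduct`, `episodes`, `alpha`, `discount`, and `eps`.

```python
# Illustrative sketch only: Q-learning over attribute subsets, with the gain in a
# rough-set positive-region dependency measure used as the immediate reward.
import random
from collections import defaultdict

def dependency(X, y, attrs):
    """gamma_B(D): fraction of objects in the positive region of the decision y
    under the indiscernibility relation induced by the attributes in attrs
    (nominal or pre-discretized data assumed)."""
    if not attrs:
        return 0.0
    blocks = defaultdict(list)
    for i, row in enumerate(X):
        blocks[tuple(row[a] for a in sorted(attrs))].append(i)
    consistent = sum(len(idx) for idx in blocks.values()
                     if len({y[i] for i in idx}) == 1)
    return consistent / len(X)

def q_learning_reduct(X, y, episodes=50, alpha=0.1, discount=0.9, eps=0.2):
    n_attrs = len(X[0])
    full_dep = dependency(X, y, range(n_attrs))
    Q = defaultdict(float)                       # Q[(state, action)] -> value
    best_state, best_dep = frozenset(), -1.0
    for _ in range(episodes):
        state, dep = frozenset(), 0.0
        while dep < full_dep and len(state) < n_attrs:
            candidates = [a for a in range(n_attrs) if a not in state]
            if random.random() < eps:            # exploration: random strategy
                action = random.choice(candidates)
            else:                                # exploitation: largest Q-value
                action = max(candidates, key=lambda a: Q[(state, a)])
            next_state = state | {action}
            next_dep = dependency(X, y, next_state)
            reward = next_dep - dep              # immediate reward: dependency gain
            future = max((Q[(next_state, a)] for a in range(n_attrs)
                          if a not in next_state), default=0.0)
            Q[(state, action)] += alpha * (reward + discount * future
                                           - Q[(state, action)])
            state, dep = next_state, next_dep
        # keep the smallest subset reaching the highest dependency seen so far
        if dep > best_dep or (dep == best_dep and len(state) < len(best_state)):
            best_state, best_dep = state, dep
    return sorted(best_state)
```

Running this procedure several times and combining the classifiers trained on the resulting reducts (for example, by majority voting) would correspond to the ensemble device mentioned above; this, too, is a hypothetical reading rather than the paper's exact design.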

Acknowledgment

This work was supported by the Natural Science Foundation of China (Nos. 62076111, 62006128, 62006099, 61906078), the Key Laboratory of Oceanographic Big Data Mining & Application of Zhejiang Province (Nos. OBDMA202104, OBDMA202002).

Author information

Corresponding author

Correspondence to Xibei Yang.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Liu, Y., Gong, Z., Liu, K. et al. A Q-learning approach to attribute reduction. Appl Intell 53, 3750–3765 (2023). https://doi.org/10.1007/s10489-022-03696-w
