
Fast attribute reduction by neighbor inconsistent pair selection for dynamic decision tables

  • Original Article
  • Published in: International Journal of Machine Learning and Cybernetics

Abstract

Attribute reduction reduces the dimensionality of data and improves the performance of data mining. As a reasonable representation of the relationships between samples, the neighbor inconsistent pair measures uncertainty in information systems. However, classical attribute reduction methods are static and therefore unsuitable for data that vary, and variation is inevitable in real-life scenarios, for example when the number of samples increases. It is thus essential to find an efficient method that reduces the dimensionality of a dataset while preserving classification accuracy. Motivated by these deficiencies, we focus on developing effective and efficient incremental methods that employ a neighbor inconsistent pair selection strategy for decision tables with object variations. First, some concepts related to rough sets, simplified decision tables and neighbor inconsistent pairs are introduced. Then, heuristic attribute reduction algorithms based on neighbor inconsistent pairs are designed for dynamic decision tables whose object sets vary. Next, a novel feature selection procedure, which we call incremental neighbor inconsistent pair selection, is proposed to update reducts for such dynamic decision tables. Finally, two incremental attribute reduction algorithms based on neighbor inconsistent pair selection are designed, and experiments on real datasets validate their effectiveness and benefits. The results indicate that our algorithms require minimal computing time while achieving the highest classification accuracy on at least ten of the thirteen datasets when compared with the comparative algorithms.
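The core mechanism described above can be illustrated with a toy sketch. Everything here is an illustrative assumption, not the paper's actual algorithm: the Chebyshev neighborhood, the radius `delta`, and all function names are hypothetical. The sketch counts neighbor inconsistent pairs (neighboring samples with different decision labels), greedily grows a reduct until it leaves no more inconsistent pairs than the full attribute set, and updates the count incrementally when new objects arrive by examining only pairs that involve a new object.

```python
from itertools import combinations

def neighbor_inconsistent_pairs(X, y, attrs, delta=0.2):
    """Count sample pairs that are neighbors on the attribute subset
    `attrs` (Chebyshev distance <= delta) yet carry different labels."""
    count = 0
    for i, j in combinations(range(len(X)), 2):
        if max(abs(X[i][a] - X[j][a]) for a in attrs) <= delta and y[i] != y[j]:
            count += 1
    return count

def greedy_reduct(X, y, n_attrs, delta=0.2):
    """Forward selection: repeatedly add the attribute giving the fewest
    neighbor inconsistent pairs, until the full-attribute count is matched."""
    full = neighbor_inconsistent_pairs(X, y, range(n_attrs), delta)
    reduct, remaining = [], set(range(n_attrs))
    while remaining:
        best = min(remaining,
                   key=lambda a: neighbor_inconsistent_pairs(X, y, reduct + [a], delta))
        reduct.append(best)
        remaining.discard(best)
        if neighbor_inconsistent_pairs(X, y, reduct, delta) == full:
            break
    return reduct

def incremental_count(X_old, y_old, X_new, y_new, attrs, old_count, delta=0.2):
    """Incremental update on object arrival: only pairs touching a new
    object are examined; pairs among old objects keep old_count."""
    X, y = X_old + X_new, y_old + y_new
    extra = 0
    for i in range(len(X_old), len(X)):
        for j in range(i):
            if max(abs(X[i][a] - X[j][a]) for a in attrs) <= delta and y[i] != y[j]:
                extra += 1
    return old_count + extra

# Toy decision table: attribute 0 alone separates the two classes,
# attribute 1 is constant (uninformative), attribute 2 is noisy.
X = [[0.0, 0.5, 0.1], [0.1, 0.5, 0.9], [0.9, 0.5, 0.1], [1.0, 0.5, 0.9]]
y = [0, 0, 1, 1]
print(greedy_reduct(X, y, 3))  # → [0]
```

The incremental step is where the claimed speed-up comes from: recomputing all pairs costs quadratic time in the total number of objects, whereas counting only pairs that involve new objects is proportional to the number of new objects times the table size.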

Fig. 1, Fig. 2, Fig. 3

Data availability

The data that support the findings of this study are available from the corresponding author upon reasonable request.


Acknowledgements

This work is supported by the National Natural Science Foundation of China (61976089), the Major Program of the National Social Science Foundation of China (20&ZD047), the Natural Science Foundation of Hunan Province (2021JJ30451, 2022JJ30397), and the Hunan Provincial Science & Technology Project Foundation (2018RS3065, 2018TP1018).

Author information


Corresponding author

Correspondence to Jianhua Dai.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Zhang, C., Liu, H., Lu, Z. et al. Fast attribute reduction by neighbor inconsistent pair selection for dynamic decision tables. Int. J. Mach. Learn. & Cyber. 15, 739–756 (2024). https://doi.org/10.1007/s13042-023-01931-5

