
Gift: granularity over specific-class for feature selection

  • Research Article
  • Published in: Artificial Intelligence Review

Abstract

As a fundamental construct of Granular Computing, information granulation sheds new light on feature selection. Although information granulation has been applied to feature selection effectively, existing feature selection methods lack a characterization of feature potential, i.e., whether a candidate feature has sufficient ability to distinguish different target variables; this ability is an important factor in evaluating feature importance. In view of this, a novel concept of granularity over specific-class is proposed from the perspective of information granulation. Essentially, such a granularity fuses intra-class and extra-class granularities, which makes it possible to exploit the discrimination ability of features. Accordingly, an intuitive yet effective framework named Gift, i.e., granularity over specific-class for feature selection, is proposed. Comprehensive experiments on 29 public datasets clearly validate the effectiveness of Gift compared with other feature selection strategies, especially on noisy data.
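The abstract's idea of fusing intra-class and extra-class granularities to score a feature's discrimination ability can be illustrated with a toy sketch. The paper's exact definitions are not given in this excerpt, so the code below is a minimal illustration under assumptions of our own: Chebyshev-distance neighborhood granules, a simple extra-class impurity as the fused score, and greedy forward selection. The names `specific_class_granularity` and `gift_forward_select` are hypothetical, not the authors' API.

```python
import numpy as np

def specific_class_granularity(X, y, feats, radius=0.2):
    """Toy 'granularity over specific-class' score: for each sample, form a
    neighborhood granule over the chosen features, then fuse the same-class
    (intra-class) and different-class (extra-class) mass inside the granule.
    A smaller score means the granules are purer, i.e., the features
    discriminate the classes better. Illustrative only, not the paper's
    exact formulas."""
    Xs = X[:, feats]
    n = len(y)
    score = 0.0
    for i in range(n):
        d = np.abs(Xs - Xs[i]).max(axis=1)   # Chebyshev distance to sample i
        granule = d <= radius                # neighborhood granule of sample i
        intra = np.sum(granule & (y == y[i]))
        extra = np.sum(granule & (y != y[i]))
        score += extra / (intra + extra)     # extra-class impurity of the granule
    return score / n

def gift_forward_select(X, y, k, radius=0.2):
    """Greedy forward selection driven by the fused granularity score."""
    selected, remaining = [], list(range(X.shape[1]))
    for _ in range(k):
        best = min(remaining,
                   key=lambda f: specific_class_granularity(X, y, selected + [f], radius))
        selected.append(best)
        remaining.remove(best)
    return selected
```

On a small dataset where one feature separates the classes and another is noise, the greedy loop picks the discriminative feature first, since its granules contain no extra-class samples.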



Acknowledgements

This work was supported by the Natural Science Foundation of China (Nos. 62076111, 62006128, 62006099, 61906078).

Author information

Authors and Affiliations

Authors

Contributions

JB: Conceptualization, Methodology, Software, Investigation, Writing—Original draft. KL: Formal analysis, Data curation, Funding acquisition. XY: Supervision, Resources, Project administration, Validation, Funding acquisition, Writing—Review & Editing. YQ: Formal analysis, Data curation, Software.

Corresponding author

Correspondence to Xibei Yang.

Ethics declarations

Competing interests

The authors declare no competing interests.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Appendix A

See Table 8.

Table 8 A list of variables

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Ba, J., Liu, K., Yang, X. et al. Gift: granularity over specific-class for feature selection. Artif Intell Rev 56, 12201–12232 (2023). https://doi.org/10.1007/s10462-023-10499-z

