
Neighborhood attribute reduction: a multi-criterion approach

  • Original Article
International Journal of Machine Learning and Cybernetics

Abstract

Although attribute reduction based on the neighborhood decision error rate can improve the classification performance of a neighborhood classifier by deleting redundant attributes, it does not take variations in classification results into account. To fill this gap, a multi-criterion attribute reduction is proposed that considers both the neighborhood decision error rate and the neighborhood decision consistency. The neighborhood decision consistency measures how classification results vary as attributes change. Based on this notion of reduction, a heuristic algorithm is designed to derive reducts that achieve a lower error rate and higher consistency simultaneously. Experimental results on 10 UCI data sets show that the multi-criterion reduction not only improves decision consistency without significantly decreasing classification accuracy, but also yields more stable reducts. This study suggests new directions for criteria and constraints in attribute reduction.
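The kind of heuristic search described in the abstract can be pictured as a greedy forward selection driven by a neighborhood classifier. The following Python sketch is an illustration only, not the authors' algorithm: it implements a plain leave-one-out neighborhood classifier (majority vote within a delta-neighborhood) and greedily adds the attribute that most reduces the neighborhood decision error rate, stopping when no attribute improves it. The paper's second criterion, neighborhood decision consistency, would be evaluated alongside the error rate in the same loop. All function names and the `delta` parameter are assumptions.

```python
import numpy as np

def neighborhood_error_rate(X, y, attrs, delta=0.2):
    """Leave-one-out error of a neighborhood classifier restricted to `attrs`:
    each sample is labeled by the majority class of its delta-neighborhood."""
    Xs = X[:, attrs]
    errors = 0
    n = len(y)
    for i in range(n):
        dist = np.linalg.norm(Xs - Xs[i], axis=1)
        nbr = dist <= delta
        nbr[i] = False  # exclude the sample itself
        if not nbr.any():
            errors += 1  # no neighbors: count as misclassified
            continue
        labels, counts = np.unique(y[nbr], return_counts=True)
        if labels[np.argmax(counts)] != y[i]:
            errors += 1
    return errors / n

def greedy_reduct(X, y, delta=0.2):
    """Forward greedy search: repeatedly add the attribute whose inclusion
    most reduces the neighborhood decision error rate; stop when no
    remaining attribute improves it."""
    remaining = list(range(X.shape[1]))
    reduct, best = [], 1.0
    while remaining:
        scored = [(neighborhood_error_rate(X, y, reduct + [a], delta), a)
                  for a in remaining]
        err, a = min(scored)
        if err >= best:
            break
        best = err
        reduct.append(a)
        remaining.remove(a)
    return reduct, best
```

On a toy data set where only the first attribute separates the classes, the search returns a one-attribute reduct with zero error; a multi-criterion variant would additionally require the classification results under the candidate reduct to agree with those under the full attribute set.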



Acknowledgements

This work is supported by the Natural Science Foundation of China (Nos. 61572242, 61503160, 61772273, 61502211, 61471182), Qing Lan Project of Jiangsu Province of China.

Author information


Corresponding author

Correspondence to Xibei Yang.


About this article


Cite this article

Li, J., Yang, X., Song, X. et al. Neighborhood attribute reduction: a multi-criterion approach. Int. J. Mach. Learn. & Cyber. 10, 731–742 (2019). https://doi.org/10.1007/s13042-017-0758-5

