
Attribute reduction via local conditional entropy

  • Original Article
  • International Journal of Machine Learning and Cybernetics

Abstract

In rough set theory, the concept of conditional entropy has been widely accepted for studying the problem of attribute reduction. Given a search strategy for finding a reduct, the value of conditional entropy can also be used to evaluate the significance of candidate attributes during the search. However, traditional conditional entropy characterizes the relationship between the conditional attributes and the decision attribute in terms of all samples in the data; it does not consider that relationship with respect to specific samples, i.e., samples with the same label. To fill this gap, a new form of conditional entropy, termed local conditional entropy, is proposed. Several important properties of local conditional entropy are studied, and local conditional entropy based attribute reduction is then defined. An ensemble strategy is further introduced into the heuristic process of searching for a reduct, realized through the attribute significance derived from local conditional entropy. Finally, experimental results over 18 UCI data sets show that local conditional entropy based attribute reduction is superior to traditional conditional entropy based attribute reduction: the former may provide attributes with higher classification accuracies. In addition, if local conditional entropy is used as the measurement in online feature selection, it not only offers better classification performance but also requires less elapsed time to complete the online feature selection process. This study suggests new trends for considering attribute reduction and provides guidelines for designing new measurements and related algorithms.
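To make the idea more concrete, the following Python sketch contrasts the classical rough-set conditional entropy, computed over all samples, with a hypothetical label-specific ("local") variant in which only the samples carrying one decision label contribute, while their equivalence classes are still formed over the whole data set; the label-averaged local measure then drives a greedy reduct search. This is a minimal illustration under assumed definitions, not the authors' exact formulation; the toy decision table, the attribute names a and b, and the greedy selector are hypothetical.

```python
# Minimal sketch (assumed formulation, not the paper's exact definitions):
# classical conditional entropy vs. a label-specific "local" variant, plus a
# greedy significance-driven reduct search over a toy decision table.
from collections import defaultdict
from math import log2

def equivalence_classes(rows, attrs):
    """Group sample indices into equivalence classes under the chosen attributes."""
    blocks = defaultdict(list)
    for i, row in enumerate(rows):
        blocks[tuple(row[a] for a in attrs)].append(i)
    return list(blocks.values())

def conditional_entropy(rows, labels, attrs, target_label=None):
    """Classical H(D | attrs) when target_label is None; otherwise a label-specific
    (local) variant where only samples carrying that label contribute, although their
    equivalence classes are still taken over the whole data set."""
    h, n = 0.0, len(rows)
    for block in equivalence_classes(rows, attrs):
        for i in block:
            if target_label is not None and labels[i] != target_label:
                continue
            same = sum(1 for j in block if labels[j] == labels[i])
            h -= log2(same / len(block)) / n
    return h

def greedy_reduct(rows, labels, attrs):
    """Forward greedy search driven by attribute significance: at each step add the
    attribute that most reduces the label-averaged local conditional entropy."""
    classes = sorted(set(labels))
    measure = lambda B: sum(conditional_entropy(rows, labels, B, y) for y in classes)
    reduct, remaining = [], list(attrs)
    target = measure(remaining)  # value reached by the full attribute set
    while remaining and (not reduct or measure(reduct) > target + 1e-12):
        best = min(remaining, key=lambda a: measure(reduct + [a]))
        reduct.append(best)
        remaining.remove(best)
    return reduct

# Toy decision table: attribute "a" separates the two labels, "b" does not.
rows = [{"a": 0, "b": 1}, {"a": 0, "b": 0}, {"a": 1, "b": 1}, {"a": 1, "b": 0}]
labels = [0, 0, 1, 1]
print(greedy_reduct(rows, labels, ["a", "b"]))  # -> ['a']
```

In this toy table the local measure of attribute a is zero for both labels while that of b is positive, so the greedy search selects a alone; the ensemble voting over candidate attributes described in the abstract is omitted from this sketch.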



Acknowledgements

This work is supported by the Natural Science Foundation of China (Nos. 61502211, 61572242, 61503160). We would like to thank Eric Appiah Mantey and Selase Tawiah Kwawu for their help in improving the language quality of this paper.

Author information

Corresponding author

Correspondence to Kai Dong.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Wang, Y., Chen, X. & Dong, K. Attribute reduction via local conditional entropy. Int. J. Mach. Learn. & Cyber. 10, 3619–3634 (2019). https://doi.org/10.1007/s13042-019-00948-z

