
A Feature Selection Algorithm Based on Equal Interval Division and Conditional Mutual Information

Neural Processing Letters

Abstract

The performance of many feature selection algorithms suffers because they ignore three-dimensional mutual information among features. Three-dimensional mutual information comprises conditional mutual information, joint mutual information, and three-way interaction information. To address this limitation, this paper investigates feature selection based on three-dimensional mutual information. First, we propose an objective function based on conditional mutual information. We then propose a criterion for validating whether the objective function can guarantee the effectiveness of the selected features. When the objective function cannot provide this guarantee, we combine a method of equal interval division and ranking with the objective function to select features. Finally, we propose a feature selection algorithm named EID-CMI. To validate its performance, we compare EID-CMI with several feature selection algorithms. Experimental results demonstrate that EID-CMI achieves better feature selection performance than the compared methods.
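For readers unfamiliar with the three quantities named above, the standard information-theoretic definitions, in the usual notation with candidate feature $X$, already-selected feature $Y$, and class label $C$, are:

$$
I(X;C \mid Y) = \sum_{x,y,c} p(x,y,c)\,\log\frac{p(x,c \mid y)}{p(x \mid y)\,p(c \mid y)} \quad \text{(conditional mutual information)}
$$

$$
I(X,Y;C) = I(Y;C) + I(X;C \mid Y) \quad \text{(joint mutual information, by the chain rule)}
$$

$$
I(X;Y;C) = I(X;C \mid Y) - I(X;C) \quad \text{(three-way interaction information; sign conventions vary in the literature)}
$$

Since the full text is paywalled, the sketch below is only a minimal illustration of greedy, conditional-mutual-information-driven forward selection in the style the abstract describes, not the authors' EID-CMI algorithm. It assumes discretized features and substitutes a CMIM-like objective (maximize the minimum of $I(f;C \mid s)$ over already-selected features $s$) for the paper's objective function; the function names are hypothetical.

```python
import numpy as np
from sklearn.metrics import mutual_info_score

def conditional_mutual_info(x, y, z):
    """Estimate I(X; Y | Z) for discrete arrays by averaging the
    within-stratum mutual information over the values of Z."""
    n = len(z)
    cmi = 0.0
    for zv in np.unique(z):
        mask = (z == zv)
        cmi += (mask.sum() / n) * mutual_info_score(x[mask], y[mask])
    return cmi

def greedy_cmi_selection(X, c, k):
    """Pick k feature indices from the discrete matrix X
    (n_samples x n_features) for class labels c, greedily
    maximizing min_{s in selected} I(f; c | s) at each step."""
    remaining = list(range(X.shape[1]))
    # Seed with the individually most relevant feature.
    first = max(remaining, key=lambda f: mutual_info_score(X[:, f], c))
    selected = [first]
    remaining.remove(first)
    while remaining and len(selected) < k:
        best = max(remaining,
                   key=lambda f: min(conditional_mutual_info(X[:, f], c, X[:, s])
                                     for s in selected))
        selected.append(best)
        remaining.remove(best)
    return selected
```

Stratifying over a selected feature's values, as `conditional_mutual_info` does, is the direct plug-in estimator of $I(X;C \mid Y) = \sum_y p(y)\, I(X;C \mid Y=y)$; it is reliable only when features take few discrete values, which is one reason discretization is a standard preprocessing step in this line of work.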



Acknowledgements

This work was supported by the National Natural Science Foundation of China (61771334).

Author information

Correspondence to Jichang Guo.


About this article


Cite this article

Gu, X., Guo, J., Ming, T. et al. A Feature Selection Algorithm Based on Equal Interval Division and Conditional Mutual Information. Neural Process Lett 54, 2079–2105 (2022). https://doi.org/10.1007/s11063-021-10720-6
