
A feature selection algorithm based on redundancy analysis and interaction weight

Published in: Applied Intelligence

Abstract

The performance of three-dimensional mutual-information-based feature selection algorithms can suffer because they consider only relevance and interaction. To address this problem, this paper proposes a feature selection algorithm based on redundancy analysis and interaction weight. The proposed algorithm adopts three-way interaction information to measure the interaction among features and the class label, and analyzes features to assign interaction weights. It then employs symmetric uncertainty to measure both the relevance between features and the class label and the redundancy between features, selecting features with greater relevance and interaction and smaller redundancy. To validate its performance, the proposed algorithm is compared with several existing feature selection algorithms. Because relevance, redundancy, and interaction are all analyzed, the proposed algorithm achieves better feature selection performance.
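The measures named in the abstract can be sketched in code. The following is a minimal illustration on discrete data, not the authors' implementation: the entropy-based formulas for symmetric uncertainty and three-way interaction information are standard, while the additive greedy scoring in `greedy_select` is an assumed surrogate for the paper's interaction-weight criterion.

```python
import numpy as np
from collections import Counter

def entropy(values):
    """Shannon entropy (base 2) of a discrete variable."""
    counts = np.array(list(Counter(values).values()), dtype=float)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)))

def mutual_info(x, y):
    """I(X;Y) = H(X) + H(Y) - H(X,Y)."""
    return entropy(x) + entropy(y) - entropy(list(zip(x, y)))

def symmetric_uncertainty(x, y):
    """SU(X,Y) = 2*I(X;Y) / (H(X) + H(Y)), normalized to [0, 1]."""
    hx, hy = entropy(x), entropy(y)
    if hx + hy == 0.0:
        return 0.0
    return 2.0 * mutual_info(x, y) / (hx + hy)

def interaction_info(x, y, c):
    """Three-way interaction information I(X;Y;C) = I(X;Y|C) - I(X;Y).
    A positive value indicates synergy between X and Y about C."""
    return (entropy(list(zip(x, c))) + entropy(list(zip(y, c)))
            + entropy(list(zip(x, y)))
            - entropy(x) - entropy(y) - entropy(c)
            - entropy(list(zip(x, y, c))))

def greedy_select(features, c, k):
    """Greedily pick k features, rewarding relevance and interaction with
    already-selected features while penalizing redundancy. The additive
    combination rel + inter - red is illustrative only."""
    selected, candidates = [], set(features)
    while len(selected) < k and candidates:
        def score(f):
            rel = symmetric_uncertainty(features[f], c)
            if not selected:
                return rel
            red = np.mean([symmetric_uncertainty(features[f], features[s])
                           for s in selected])
            inter = np.mean([interaction_info(features[f], features[s], c)
                             for s in selected])
            return rel + inter - red
        best = max(candidates, key=score)
        selected.append(best)
        candidates.remove(best)
    return selected
```

The XOR relation illustrates why interaction matters: if the class is `c = x XOR y` with uniform independent `x` and `y`, each feature alone has zero symmetric uncertainty with the class, yet their three-way interaction information is 1 bit, so a purely relevance-based filter would discard both.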




Acknowledgements

This work was supported by the National Natural Science Foundation of China (61771334).

Author information

Correspondence to Jichang Guo.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


Cite this article

Gu, X., Guo, J., Li, C. et al. A feature selection algorithm based on redundancy analysis and interaction weight. Appl Intell 51, 2672–2686 (2021). https://doi.org/10.1007/s10489-020-01936-5

