CMEFS: chaotic mapping-based mayfly optimization with fuzzy entropy for feature selection

Published in Applied Intelligence

Abstract

Swarm intelligence algorithms can effectively address the feature selection problem for classification. The recently proposed Mayfly Optimization Algorithm (MOA) offers high precision, a concise structure, and easy implementation, but it remains prone to local optima and its convergence speed needs improvement. This work therefore studies an improved MOA and further develops a binary MOA for feature selection. First, the male and female populations of MOA are initialized with the Logistic-Tent and Cubic chaotic mapping mechanisms, respectively, to enhance population diversity in the exploration stage, and the Cubic chaotic map is also applied to dynamically perturb the global optimum, alleviating the tendency to fall into local optima and strengthening the local search ability of MOA. Second, a parameter fuzzy entropy is proposed to improve MOA, and an adaptive function based on this parameter fuzzy entropy is constructed from the historical optimal results of the mayflies in the population. The parameter fuzzy entropy serves as an impact factor that adaptively adjusts the inertia weight, balances the population's global exploration and local exploitation capabilities, and increases population diversity and the uniformity of its distribution. Furthermore, an improved contraction factor is introduced into MOA and the two learning factors are constrained by parameters, so that the velocities of mayfly individuals do not grow too large, which effectively improves the convergence performance of MOA. Finally, a binary MOA is constructed based on an S-shaped transfer function so that the continuous search positions can be converted into binary feature subsets, and the optimal feature subset is selected by means of a fitness function. Simulation results on 16 benchmark functions and 12 public datasets show that the binary MOA has strong optimization performance and verify the effectiveness of the designed feature selection algorithm.
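To make the main ingredients concrete, the following minimal Python sketch illustrates chaotic population initialization with Logistic-Tent and Cubic maps, an S-shaped transfer function for binarizing continuous velocities, and a typical wrapper-style fitness for feature selection. The specific map formulations, coefficient values (r, rho), and the alpha weighting in `fitness` are common choices assumed here for illustration only; they are not taken verbatim from the paper.

```python
import numpy as np

def logistic_tent_map(x, r=0.7):
    """Logistic-Tent compound chaotic map on (0, 1).
    One common formulation (assumed here): mix the logistic and tent maps
    and wrap the result back into (0, 1) with mod 1."""
    if x < 0.5:
        return (r * x * (1.0 - x) + (4.0 - r) * x / 2.0) % 1.0
    return (r * x * (1.0 - x) + (4.0 - r) * (1.0 - x) / 2.0) % 1.0

def cubic_map(x, rho=2.595):
    """Cubic chaotic map x_{k+1} = rho * x_k * (1 - x_k^2), chaotic on (0, 1)."""
    return rho * x * (1.0 - x * x)

def chaotic_init(n_agents, dim, lb, ub, chaos_fn, seed=0.37):
    """Spread an initial population over [lb, ub] using a chaotic sequence
    instead of uniform random numbers, to improve diversity."""
    pos = np.empty((n_agents, dim))
    x = seed
    for i in range(n_agents):
        for j in range(dim):
            x = chaos_fn(x)
            pos[i, j] = lb + x * (ub - lb)
    return pos

def s_transfer(v):
    """S-shaped (sigmoid) transfer function: maps a real-valued velocity to
    the probability of selecting the corresponding feature."""
    return 1.0 / (1.0 + np.exp(-v))

def binarize(velocity, rng):
    """Convert continuous velocities into a binary feature mask."""
    return (rng.random(velocity.shape) < s_transfer(velocity)).astype(int)

def fitness(error_rate, n_selected, n_total, alpha=0.99):
    """Typical wrapper fitness (assumed form): weighted sum of classification
    error and the fraction of selected features, both to be minimized."""
    return alpha * error_rate + (1.0 - alpha) * n_selected / n_total

if __name__ == "__main__":
    rng = np.random.default_rng(42)
    males = chaotic_init(20, 10, -1.0, 1.0, logistic_tent_map)    # male swarm
    females = chaotic_init(20, 10, -1.0, 1.0, cubic_map)          # female swarm
    mask = binarize(rng.normal(size=10), rng)                     # feature subset
    print(males.shape, females.shape, mask)
```

In this sketch the two swarms are seeded by different chaotic maps, mirroring the paper's idea of diversifying males and females separately, while the transfer function and fitness show how a continuous optimizer is adapted to the discrete feature selection task.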


Data availability

The eight low-dimensional datasets in Table 4 were downloaded from http://archive.ics.uci.edu/ml/index.PHP, and the four high-dimensional datasets in Table 7 were downloaded from http://portals.broadinstitute.org/cgibin/cancer/datasets.cgi.


Acknowledgments

The authors would like to express their sincere appreciation to the anonymous reviewers for their insightful comments, which greatly improved the quality of this paper. This research was funded by the National Natural Science Foundation of China under Grants 62076089, 61976082, and 61976120; and the Natural Science Key Foundation of Jiangsu Education Department under Grant 21KJA510004.

CRediT authorship contribution statement

Lin Sun: Central idea, Analyzed most of the data, Wrote and revised this paper. Hanbo Liang: Central idea, Analyzed most of the data, Wrote and revised this paper. Weiping Ding: Revising this paper. Jiucheng Xu: Revising this paper. Baofang Chang: Revising this paper.

Author information

Corresponding authors

Correspondence to Lin Sun or Weiping Ding.

Ethics declarations

Competing interests

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Sun, L., Liang, H., Ding, W. et al. CMEFS: chaotic mapping-based mayfly optimization with fuzzy entropy for feature selection. Appl Intell 54, 7397–7417 (2024). https://doi.org/10.1007/s10489-024-05555-2
