
Opposition-based binary competitive optimization algorithm using time-varying V-shape transfer function for feature selection

  • Original Article
  • Published in Neural Computing and Applications

Abstract

Feature selection is one of the most critical problems in classification, with numerous applications. Many features of a dataset carry little useful information and play no important role in classifying the data; eliminating such unnecessary features reduces the computational burden. This paper proposes an opposition-based binary competitive optimization algorithm, OBCOOA, for solving wrapper-based feature selection problems. The competitive optimization algorithm (COOA), which is inherently continuous, is modeled on the natural competition between species for survival. The present study makes two main contributions: first, a time-varying V-shape transfer function is used in the optimization process to produce a binary version of COOA that strikes the right balance between the exploration and exploitation phases; second, an opposition-based learning mechanism is employed to improve diversity within the population and to construct a suitable initial population. Feature selection typically optimizes two objectives, the classification error rate and the number of selected features, which turns it into a multi-objective optimization problem; this work therefore presents both single- and multi-objective versions of the proposed algorithm. The algorithm was applied to 27 benchmark datasets, and the results were compared with those of other well-known binary optimization algorithms. The experiments indicate that the proposed algorithm is better able to find an optimal subset of features, achieving a lower classification error rate with fewer selected features.
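The abstract's two key mechanisms, a time-varying V-shape transfer function for binarization and opposition-based initialization, can be sketched in code. This is a minimal illustration under stated assumptions, not the authors' implementation: the |tanh(x/w)| form of the transfer function, the linear decay schedule for its control parameter w, and the keep-the-fitter-half opposition scheme (where the opposite of a binary vector is its bitwise complement) are common choices in the binary metaheuristic literature, and all function names here are hypothetical.

```python
import numpy as np

def v_shape_tv(x, t, t_max, w_max=4.0, w_min=0.01):
    """Time-varying V-shaped transfer function |tanh(x / w)|.

    The control parameter w shrinks linearly from w_max to w_min over the
    run, so early iterations flip bits readily (exploration) while late
    iterations flip them rarely (exploitation).
    """
    w = w_max - (w_max - w_min) * t / t_max
    return np.abs(np.tanh(x / w))

def binarize(velocity, position, t, t_max, rng):
    """Map a continuous step onto a binary position.

    With a V-shaped function, the transfer value is the probability of
    complementing each bit of the current position.
    """
    p = v_shape_tv(velocity, t, t_max)
    flip = rng.random(position.shape) < p
    return np.where(flip, 1 - position, position)

def opposition_init(n_agents, n_features, fitness, rng):
    """Opposition-based initial population for a binary search space.

    Evaluate each random binary agent and its bitwise complement, then
    keep the fitter half of the union (lower fitness assumed better).
    """
    pop = (rng.random((n_agents, n_features)) < 0.5).astype(int)
    opp = 1 - pop                      # binary opposite = complement
    union = np.vstack([pop, opp])
    scores = np.array([fitness(ind) for ind in union])
    best = np.argsort(scores)[:n_agents]
    return union[best]
```

Because an agent and its opposite sit in maximally distant corners of the binary hypercube, evaluating both at initialization probes the search space more evenly than random sampling alone, which is the diversity benefit the abstract attributes to opposition-based learning.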



Author information

Corresponding author

Correspondence to Mohammad Teshnehlab.

Ethics declarations

Conflict of interest

The authors declare no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Sharafi, Y., Teshnehlab, M. Opposition-based binary competitive optimization algorithm using time-varying V-shape transfer function for feature selection. Neural Comput & Applic 33, 17497–17533 (2021). https://doi.org/10.1007/s00521-021-06340-9

