
Binary Artificial Electric Field Algorithm

  • Research Paper
  • Published in Evolutionary Intelligence

Abstract

The Artificial Electric Field Algorithm (AEFA) is a recent population-based optimization technique inspired by electrostatic force theory. This article designs a novel binary version of the AEFA to improve the performance of the original scheme in discrete search spaces. The popular S-shaped and V-shaped transfer functions are used to construct the binary versions of the AEFA. The efficiency and optimization ability of the proposed binary versions are studied both theoretically and experimentally. An extensive experimental study is performed to understand the performance of the proposed schemes: a set of 24 benchmark problems is solved using the binary versions of the AEFA, and the results are compared with nine state-of-the-art algorithms. Running-time complexity analysis and Wilcoxon's signed-rank statistical tests are also conducted to assess the proposed algorithms. In addition to the experimental studies, a theoretical analysis is carried out that addresses the convergence behaviour of the proposed schemes. These studies suggest that the designed binary versions of the AEFA are efficient and competitive in addressing difficult optimization problems.



Acknowledgements

The authors are thankful to Dr B. R. Ambedkar National Institute of Technology Jalandhar for the necessary support for this research. The first author is thankful to the Ministry of Education, Government of India, for providing financial support to carry out this work.

Author information

Corresponding author

Correspondence to Anupam Yadav.

Ethics declarations

Conflict of interest

All the authors declare that they have no conflict of interest.

Ethical approval

This article does not contain any studies with human participants or animals performed by any of the authors.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Below is the link to the electronic supplementary material.

Supplementary file 1 (xlsx 436 KB)

A Appendix

1.1 Illustration of binary versions of the AEFA

To illustrate the procedure of the binary versions of the AEFA, an example is presented for the sphere function. The mathematical expression of the sphere function is given in Eq. (41).

$$\begin{aligned} F(X)=\sum _{i=1}^{D}X_{i}^{2} \end{aligned}$$
(41)

For convenience and ease of understanding, a population size \(N=3\) and a dimension \(D=8\) are considered. \(vel_{i}^{d}\) and \(X_{i}^{d}\), \(i=1,2,3;\ d=1,2,\ldots ,8\), represent the velocity and position of the candidate solutions.
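
As a point of reference for the iterations that follow, the short Python sketch below (not taken from the paper; the variable names and the velocity initialization range are assumptions made purely for illustration) sets up the sphere fitness of Eq. (41) and the random initialization used in this example.

```python
import numpy as np

rng = np.random.default_rng()

N, D = 3, 8              # population size and dimension used in this illustration
LB, UB = -100.0, 100.0   # position initialization range stated in the text


def sphere(X):
    """Sphere function of Eq. (41): sum of squared components of each agent."""
    return np.sum(np.asarray(X) ** 2, axis=-1)


# Random initial positions in [-100, 100]; the velocity range is not stated in
# the text, so a small symmetric interval is assumed here for illustration.
X = rng.uniform(LB, UB, size=(N, D))
vel = rng.uniform(-1.0, 1.0, size=(N, D))

fitness = sphere(X)
best_agent = int(np.argmin(fitness))  # minimization: the lowest fitness is best
```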

  1. Iteration 1.

    Initialization of position and velocity: the initial positions of the candidate solutions are randomly initialized within \([-100,100]\):

    $$\begin{aligned}&X_{1}=[65.4169, -15.9417, 12.3398, -72.7920, 29.2953, \nonumber \\&86.7142, 98.6398, 18.4448],\quad F(X_{1})= 2.8432e+04\nonumber \\&X_{2}=[61.2647, 67.7288, -4.6856, -15.3621, -34.6034, \nonumber \\&6.1326, 38.8467, 1.3461],\quad F(X_{2})=1.1344e+04\nonumber \\&X_{3}=[ 76.2411, -82.2269, 48.7849, 9.1527, 20.5085,\nonumber \\&70.3099, 96.2467, 23.2951],\quad F(X_{3})= 3.0208e+04 \end{aligned}$$

    The initial velocities are chosen randomly:

    $$\begin{aligned} vel_{1}&=[ 0.0155, 0.2967, -0.0296, 0.6658, \nonumber \\&-0.0948, -0.0644, 0.4611, 0.3427]\nonumber \\ vel_{2}&=[ 0.2253, 0.6273, 0.0912, 0.0932, \nonumber \\&0.1454, 0.2951, 0.5124, 0.1867]\nonumber \\ vel_{3}&=[ 0.1771, 0.6126, -0.1322, -0.1151, \nonumber \\&-0.1293, -0.0313, 0.1487, -0.0340]\nonumber \end{aligned}$$

    Calculating the probability values by applying the transfer function to \(vel_{1}\), \(vel_{2}\) and \(vel_{3}\), we get

    $$\begin{aligned} S_{1}=[0.4961,0.4439,0.4558] \end{aligned}$$
    (42)

    The fitness values of the agents \(X_{1}\), \(X_{2}\) and \(X_{3}\) are 2.8432e+04, 1.1344e+04 and 3.0208e+04 in iteration one. The best fitness value for the first iteration is 1.1344e+04, attained by the second agent. In the next iteration, each position component switches to 0 or 1 according to a random variable drawn from [0, 1] and the probability value \(S_{1}\): if the random variable is less than or equal to the \(S_{1}\) value, the component is set to 1; otherwise it is set to 0. A code sketch of this switching rule is given after the illustration.

  2. Iteration 2.
    $$\begin{aligned} X_{1}&=[1,1,0,0,0,0,0,0],\quad F(X_{1})= 2\nonumber \\ X_{2}&=[1,0,0,0,0,0,0,0],\quad F(X_{2})=1\nonumber \\ X_{3}&=[0,1,0,0,0,0,0,0],\quad F(X_{3})= 1\nonumber \end{aligned}$$
    $$\begin{aligned} vel_{1}&=[ -0.0001, 0.1115, -0.0223, 0.2375, \nonumber \\&\quad -0.0588, -0.0230, 0.0397, 0.1105]\nonumber \\ vel_{2}&=[ 0.1610, 0.2564, 0.0356, 0.0784, \nonumber \\&\quad 0.0915, 0.0627, 0.0169, 0.1238]\nonumber \\ vel_{3}&=[0.1078, 0.2033, -0.0998, -0.0720, \nonumber \\&\quad -0.0254, -0.0011, 0.1399, -0.0047]\nonumber \\ S_{1}&=[0.5000,0.4598,0.4730] \end{aligned}$$
    (43)

    The fitness values of the agents \(X_{1}\), \(X_{2}\) and \(X_{3}\) are 2, 1 and 1 in iteration two. The best fitness value for the second iteration is 1, attained by the second and third agents.

  3. Iteration 3.
    $$\begin{aligned} X_{1}&=[1,0,0,1,0,0,0,0],\quad F(X_{1})= 2\nonumber \\ X_{2}&=[1,0,0,0,0,0,0,0],\quad F(X_{2})=1\nonumber \\ X_{3}&=[0,0,0,0,0,0,0,0],\quad F(X_{3})= 0\nonumber \\ vel_{1}&=[ -0.0001, 0.0510, -0.0140, 0.1306, \nonumber \\&\quad -0.0467, -0.0053, 0.0132, 0.0180]\nonumber \\ vel_{2}&=[ 0.1107, 0.1418, 0.0268, 0.0563, \nonumber \\&\quad 0.0560, 0.0300, 0.0101, 0.0913]\nonumber \\ vel_{3}&=[ 0.0229, 0.0500 , -0.0037 , -0.0227 ,\nonumber \\&\quad -0.0089 , -0.0002 , 0.0651 , -0.0030 ]\nonumber \\ S_{1}&=[0.5000,0.4723,0.4942] \end{aligned}$$
    (44)

    The fitness values of the agents \(X_{1}\), \(X_{2}\) and \(X_{3}\) are 2, 1 and 0 in iteration three. The best fitness value for the third iteration is 0, attained by the third agent.

  4. Iteration 4.
    $$\begin{aligned} X_{1}&=[0,1,0,0,0,0,0,0],\quad F(X_{1})= 1\nonumber \\ X_{2}&=[0,0,0,0,0,0,0,0],\quad F(X_{2})=0\nonumber \\ X_{3}&=[0,0,0,0,0,0,0,0],\quad F(X_{3})= 0\nonumber \\ vel_{1}&=[ -0.0001, 0.0478, -0.0112, 0.0557, \nonumber \\&\quad -0.0296, -0.0023, 0.0009, 0.0051]\nonumber \\ vel_{2}&=[0.0554, 0.0107, 0.0133, 0.0032, \nonumber \\&\quad 0.0393, 0.0087, 0.0023, 0.0565]\nonumber \\ vel_{3}&=[0.0192, 0.0204, -0.0011, -0.0208, \nonumber \\&\quad -0.0067, -0.0002, 0.0509, -0.0012]\nonumber \\ S_{1}&=[0.5000,0.4861,0.4952] \end{aligned}$$
    (45)

    The fitness values of the agents \(X_{1}\), \(X_{2}\) and \(X_{3}\) are 1, 0 and 0 in iteration four. The best fitness value for the fourth iteration is 0, attained by the second and third agents.

  5. Iteration 5.
    $$\begin{aligned} X_{1}&=[1,0,0,0,0,0,0,0],\quad F(X_{1})= 1\nonumber \\ X_{2}&=[0,0,0,0,0,0,0,0],\quad F(X_{2})=0\nonumber \\ X_{3}&=[0,0,0,0,0,0,0,0],\quad F(X_{3})= 0\nonumber \\ vel_{1}&=[ -0.0000, 0.0033, -0.0107, 0.0095,\nonumber \\&\quad -0.0294, -0.0015, 0.0005, 0.0018]\nonumber \\ vel_{2}&=[ 0.0433, 0.0069, 0.0123, 0.0008, \nonumber \\&\quad 0.0366, 0.0085, 0.0015, 0.0249]\nonumber \\ vel_{3}&=[0.0047, 0.0075, -0.0003, -0.0047, \nonumber \\&\quad -0.0006, -0.0001, 0.0390, -0.0000]\nonumber \\ S_{1}&=[0.5000,0.4891,0.9882] \end{aligned}$$
    (46)

    The fitness values of the agents \(X_{1}\), \(X_{2}\) and \(X_{3}\) are 1, 0 and 0 in iteration five. The best fitness value for the fifth iteration is 0, attained by the second and third agents.

  6. Iteration 6.
    $$\begin{aligned} X_{1}&=[0,0,0,0,0,0,0,0],\quad F(X_{1})= 0\nonumber \\ X_{2}&=[0,0,0,0,0,0,0,0],\quad F(X_{2})=0\nonumber \\ X_{3}&=[0,0,0,0,0,0,0,0],\quad F(X_{3})= 0\nonumber \end{aligned}$$

    The fitness values of the agents \(X_{1}\), \(X_{2}\) and \(X_{3}\) are 0, 0 and 0 in iteration six. The best fitness value for the sixth iteration is 0, attained by all three agents. Since every agent has reached a fitness value of 0, the global optimum of the sphere function has been found and no further improvement is needed.
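
The switching mechanism applied from iteration 2 onward can be summarized in code. The Python sketch below is an illustration under stated assumptions, not the authors' implementation: the sigmoid form 1/(1+e^v) is inferred from the probability values reported in Eqs. (42)–(45) rather than quoted from the paper, and only the binary position update is shown, since the velocities continue to evolve through the real-valued AEFA force and acceleration rules, which are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng()


def s_transfer(vel):
    """S-shaped transfer function mapping velocity components to probabilities.

    The form 1 / (1 + exp(v)) is assumed here; it reproduces the probability
    values reported for the first velocity components in Eqs. (42)-(45), but
    the paper's own definition of its S-shaped family is authoritative.
    """
    return 1.0 / (1.0 + np.exp(np.asarray(vel)))


def binary_position_update(vel):
    """Switch each bit as described in the illustration: draw r ~ U[0, 1] per
    component and set the bit to 1 if r <= S(v), otherwise set it to 0."""
    prob = s_transfer(vel)
    r = rng.uniform(size=prob.shape)
    return (r <= prob).astype(int)


# Iteration-1 velocity of the first agent, taken from the illustration above
vel_1 = np.array([0.0155, 0.2967, -0.0296, 0.6658,
                  -0.0948, -0.0644, 0.4611, 0.3427])
print(round(float(s_transfer(vel_1)[0]), 4))  # 0.4961, the first entry of Eq. (42)
print(binary_position_update(vel_1))          # a random bit string, e.g. [1 0 0 1 0 0 1 0]
```

Only the position update is discretized in this way, which is why the positions in the illustration become bit strings from iteration 2 onward while the velocities remain real-valued.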


About this article


Cite this article

Chauhan, D., Yadav, A. Binary Artificial Electric Field Algorithm. Evol. Intel. 16, 1155–1183 (2023). https://doi.org/10.1007/s12065-022-00726-x

