
ssFPA/DE: an efficient hybrid differential evolution–flower pollination algorithm based approach

Original Article

International Journal of System Assurance Engineering and Management

Abstract

Evolutionary algorithms are a field of great interest to researchers around the world. New algorithms are developed based on biological processes found in nature, and variants of existing algorithms are created in the search for ever better optimization methods. This paper first introduces Differential Evolution (DE) and the Flower Pollination Algorithm (FPA). It then describes a hybrid algorithm, ssFPA/DE, that combines the search strategies of FPA and DE, and reports its results.
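
As background for the hybrid, the minimal sketch below (Python with NumPy) illustrates the two standard search operators that ssFPA/DE builds on: the classical DE/rand/1/bin trial-vector generation of Storn and Price, and the global pollination (Lévy flight) move of Yang's FPA. This is an illustration of the standard operators only, not the authors' exact ssFPA/DE update rule; the function names and the default parameters (F, CR, and the Lévy exponent) are assumptions made for this example.

```python
import numpy as np
from math import gamma, pi, sin

def de_rand1_bin(pop, i, F=0.5, CR=0.9, rng=np.random.default_rng()):
    """DE/rand/1/bin: build a trial vector for population member i."""
    n, d = pop.shape
    # pick three distinct donor indices, all different from i
    r1, r2, r3 = rng.choice([k for k in range(n) if k != i], size=3, replace=False)
    mutant = pop[r1] + F * (pop[r2] - pop[r3])      # differential mutation
    cross = rng.random(d) < CR                      # binomial crossover mask
    cross[rng.integers(d)] = True                   # keep at least one mutant gene
    return np.where(cross, mutant, pop[i])

def levy_step(d, lam=1.5, rng=np.random.default_rng()):
    """Levy-distributed step of dimension d via Mantegna's algorithm, as used in FPA."""
    sigma = (gamma(1 + lam) * sin(pi * lam / 2) /
             (gamma((1 + lam) / 2) * lam * 2 ** ((lam - 1) / 2))) ** (1 / lam)
    u = rng.normal(0.0, sigma, d)
    v = rng.normal(0.0, 1.0, d)
    return u / np.abs(v) ** (1 / lam)

def fpa_global_pollination(x, best, rng=np.random.default_rng()):
    """FPA global pollination: move flower x towards the current best along a Levy flight."""
    return x + levy_step(x.size, rng=rng) * (best - x)
```

A full FPA iteration alternates between this global move and a local pollination step of the form x + ε(x_j − x_k), governed by a switch probability p (typically 0.8); how such moves are interleaved with DE's mutation and crossover is where the ssFPA/DE hybrid differs from either base algorithm.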



Author information

Correspondence to Meera Ramadas.

Appendix

Benchmark functions used: Global optimization approaches should be validated on benchmark functions or problems. A wide range of test functions has been designed to exercise different aspects of a global optimization algorithm, and many can be extended to arbitrary dimensionality to allow scaled testing. Both unimodal and multimodal functions were used in testing the algorithm. The various benchmark functions are discussed in detail below; a short Python sketch implementing them follows the list.

  1.

    Sphere function (f1)

The Sphere function is continuous, convex and unimodal, with a single global minimum. The search range is (−5.12, 5.12) in each dimension. The equation is given as:

$$f(x) = \mathop \sum \limits_{i = 1}^{d} x_{i}^{2}$$
  2.

    Beale function (f2)

The Beale function is multimodal, with sharp peaks at the corners of the input domain. The search range is (−4.5, 4.5) in each dimension.

$$f(x) = (1.5 - x_{1} + x_{1} x_{2} )^{2} + (2.25 - x_{1} + x_{1} x_{2}^{2} )^{2} + (2.625 - x_{1} + x_{1} x_{2}^{3} )^{2}$$
  3.

    Booth function (f3)

The function is usually evaluated on the square xi ∈ [−10, 10], for all i = 1, 2. It is a convex, plate-shaped function with a single global minimum at (1, 3), so it is unimodal.

$$f(x) = ( x_{1} + 2x_{2} - 7)^{2} + ( 2x_{1} + x_{2} - 5)^{2}$$
  4.

    Schwefel function (f4)

The Schwefel function is complex, with many local minima. The function is usually evaluated on the hypercube xi ∈ [−500, 500], for all \(i = 1, \ldots ,d\).

$$f(x) = 418.9829\,d - \mathop \sum \limits_{i = 1}^{d} x_{i} \sin \left( {\sqrt {\left| {x_{i} } \right|} } \right)$$
  5.

    Michalewicz function (f5)

The Michalewicz function is multimodal, with d! local minima. The parameter m defines the steepness of the valleys and ridges; a larger m leads to a more difficult search. The recommended value is m = 10. The search range is (0, π) in each dimension.

$$f(x) = - \mathop \sum \limits_{i = 1}^{d} \sin (x_{i} )\sin^{2m} \left( {\frac{{i x_{i}^{2} }}{\pi }} \right)$$
  6.

    Schaffer function N.2 (f6)

The second Schaffer function is usually evaluated on the square xi ∈ [−100, 100], for all i = 1, 2.

$$f(x,y) = 0.5 + \frac{{\sin^{2} (x^{2} - y^{2} ) - 0.5}}{{(1 + 0.001(x^{2} + y^{2} ))^{2} }}$$
  7.

    Schaffer function N.4 (f7)

The fourth Schaffer function is usually evaluated on the square xi ∈ [−100, 100], for all i = 1, 2.

$$f(x,y) = 0.5 + \frac{{\cos^{2} (\sin (\left| {x^{2} - y^{2} } \right|)) - 0.5}}{{(1 + 0.001(x^{2} + y^{2} ))^{2} }}$$
  8.

    Himmelblau function (f8)

It is a multimodal function with four identical global minima, commonly used to test optimization algorithms. The function is evaluated on xi ∈ [−5, 5], for all i = 1, 2.

$$f(x,y) = (x^{2} + y - 11)^{2} + (y^{2} + x - 7)^{2}$$
  9.

    Bird function (f9)

This is a bi-modal function with global minimum f(x*) = −106.764537. The function is evaluated on the search domain xi ∈ [−2π, 2π], for all i = 1, 2.

$$f(x,y) = \sin (x)e^{{(1 - \cos (y))^{2} }} + \cos (y)e^{{(1 - \sin (x))^{2} }} + (x - y)^{2}$$
  10.

    Extended cube function (f10)

This is a multimodal minimization problem for global optimization. Here, n represents the number of dimensions and the search range is (−100, 100) in each dimension.

$$f(x) = \sum\limits_{i = 1}^{n - 1} {\left[ {100(x_{i + 1} - x_{i}^{3} )^{2} + (1 - x_{i} )^{2} } \right]}$$
  11.

    Ackley function (f11)

This function is used mainly to test optimisation algorithms. The recommended parameter values are a = 20, b = 0.2 and c = 2π. The equation for this function is given below:

$$f(x) = - a{\kern 1pt} \exp {\kern 1pt} \left( { - b\sqrt {\frac{1}{d}\sum\limits_{i = 1}^{d} {x_{i}^{2} } } } \right) - {\kern 1pt} \exp \left( {\frac{1}{d}\sum\limits_{i = 1}^{d} {\cos (cx_{i} )} } \right) + a + \exp (1){\kern 1pt}$$

The function poses a risk for many optimisation algorithms of becoming trapped in one of its many local minima. The search range is (−32, 32) in each dimension.

  12.

    Goldstein-Price function (f12)

The Goldstein-Price function has several local minima. The function is usually evaluated on the square xi ∈ [−2, 2], for all i = 1, 2.

$$f(x) = (1 + (x + y + 1)^{2} (19 - 14x + 3x^{2} - 14y + 6xy + 3y^{2} ))(30 + (2x - 3y)^{2} (18 - 32x + 12x^{2} + 48y - 36xy + 27y^{2} ))$$
  13.

    Griewank function (f13)

The Griewank function has many widespread, regularly distributed local minima. The search range is (−600, 600) in each dimension.

$$f(x) = \mathop \sum \limits_{i = 1}^{d} \frac{{x_{i}^{2} }}{4000} - \mathop \prod \limits_{i = 1}^{d} \cos \left( {\frac{{x_{i} }}{\sqrt i }} \right) + 1$$
  14.

    Rastrigin function (f14)

The Rastrigin function is highly multimodal, with many local minima whose locations are regularly distributed. The search range is (−15, 15) in each dimension.

$$f(x) = 10d + \mathop \sum \limits_{i = 1}^{d} [x_{i}^{2} - 10 \cos ( 2 \pi x_{i} )]$$
  15.

    Rosenbrock function (f15)

The Rosenbrock function, also referred to as the Valley or Banana function, is a popular test problem for gradient-based optimization algorithms. The search range is (−15, 15) in each dimension.

$$f(x) = \mathop \sum \limits_{i = 1}^{d - 1} \left[ { 100(x_{i + 1} - x_{i}^{2} )^{2} + ( x_{i} - 1) ^{2} } \right]$$

The function is unimodal, and the global minimum lies in a narrow, parabolic valley. However, even though this valley is easy to find, convergence to the minimum is difficult.
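
As promised in the introduction to this appendix, the short Python sketch below transcribes the fifteen benchmark functions f1 to f15 described above. The function names, the use of NumPy, and the default parameter values (m = 10 for the Michalewicz function; a = 20, b = 0.2, c = 2π for the Ackley function) are illustrative choices made here, not part of the original paper; each routine is a direct transcription of the corresponding equation.

```python
import numpy as np

# Benchmark functions f1-f15; each takes a 1-D NumPy array x.

def sphere(x):                                         # f1
    return np.sum(x ** 2)

def beale(x):                                          # f2
    x1, x2 = x
    return ((1.5 - x1 + x1 * x2) ** 2 + (2.25 - x1 + x1 * x2 ** 2) ** 2
            + (2.625 - x1 + x1 * x2 ** 3) ** 2)

def booth(x):                                          # f3
    x1, x2 = x
    return (x1 + 2 * x2 - 7) ** 2 + (2 * x1 + x2 - 5) ** 2

def schwefel(x):                                       # f4
    return 418.9829 * x.size - np.sum(x * np.sin(np.sqrt(np.abs(x))))

def michalewicz(x, m=10):                              # f5
    i = np.arange(1, x.size + 1)
    return -np.sum(np.sin(x) * np.sin(i * x ** 2 / np.pi) ** (2 * m))

def schaffer_n2(x):                                    # f6
    x1, x2 = x
    return 0.5 + (np.sin(x1 ** 2 - x2 ** 2) ** 2 - 0.5) / (1 + 0.001 * (x1 ** 2 + x2 ** 2)) ** 2

def schaffer_n4(x):                                    # f7
    x1, x2 = x
    return 0.5 + (np.cos(np.sin(abs(x1 ** 2 - x2 ** 2))) ** 2 - 0.5) / (1 + 0.001 * (x1 ** 2 + x2 ** 2)) ** 2

def himmelblau(x):                                     # f8
    x1, x2 = x
    return (x1 ** 2 + x2 - 11) ** 2 + (x2 ** 2 + x1 - 7) ** 2

def bird(x):                                           # f9
    x1, x2 = x
    return (np.sin(x1) * np.exp((1 - np.cos(x2)) ** 2)
            + np.cos(x2) * np.exp((1 - np.sin(x1)) ** 2) + (x1 - x2) ** 2)

def extended_cube(x):                                  # f10
    return np.sum(100 * (x[1:] - x[:-1] ** 3) ** 2 + (1 - x[:-1]) ** 2)

def ackley(x, a=20, b=0.2, c=2 * np.pi):               # f11
    d = x.size
    return (-a * np.exp(-b * np.sqrt(np.sum(x ** 2) / d))
            - np.exp(np.sum(np.cos(c * x)) / d) + a + np.e)

def goldstein_price(x):                                # f12
    x1, x2 = x
    return ((1 + (x1 + x2 + 1) ** 2 * (19 - 14 * x1 + 3 * x1 ** 2 - 14 * x2 + 6 * x1 * x2 + 3 * x2 ** 2))
            * (30 + (2 * x1 - 3 * x2) ** 2 * (18 - 32 * x1 + 12 * x1 ** 2 + 48 * x2 - 36 * x1 * x2 + 27 * x2 ** 2)))

def griewank(x):                                       # f13
    i = np.arange(1, x.size + 1)
    return np.sum(x ** 2) / 4000 - np.prod(np.cos(x / np.sqrt(i))) + 1

def rastrigin(x):                                      # f14
    return 10 * x.size + np.sum(x ** 2 - 10 * np.cos(2 * np.pi * x))

def rosenbrock(x):                                     # f15
    return np.sum(100 * (x[1:] - x[:-1] ** 2) ** 2 + (x[:-1] - 1) ** 2)
```

Each function accepts a one-dimensional NumPy array; the two-variable functions (f2, f3, f6 to f9, f12) expect exactly two components. As quick checks against the known optima, sphere(np.zeros(10)), rosenbrock(np.ones(10)) and himmelblau(np.array([3.0, 2.0])) all return 0.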

Cite this article

Ramadas, M., Pant, M., Abraham, A. et al. ssFPA/DE: an efficient hybrid differential evolution–flower pollination algorithm based approach. Int J Syst Assur Eng Manag 9, 216–229 (2018). https://doi.org/10.1007/s13198-016-0534-z
