
Heterogeneous ensemble algorithms for function optimization

Published in Applied Intelligence

A Correction to this article was published on 23 June 2022

Abstract

Ensemble optimization algorithms are widely used to solve complex optimization problems because they can exploit the complementary advantages of different algorithms or strategies. However, integrating multiple optimization methods with different structures into a single algorithmic framework is difficult because evolutionary computation (EC) algorithms differ in their evolutionary mechanisms and encoding schemes. To exploit the complementary strengths of different algorithms without noticeably increasing computational complexity, a framework of heterogeneous ensemble algorithms (EHA) with multiple populations is developed. First, to maintain the diversity of the subpopulations, a new grouping strategy based on convergence accuracy and convergence speed is designed to divide the population into three subpopulations; this reduces the probability of the algorithm converging to a local optimum. Second, three algorithms matched to the characteristics of the individuals are assigned to the subpopulations to balance the exploitation and exploration abilities of the framework. Third, a chromosome-modification method is designed so that individuals can evolve smoothly across different subpopulations. Finally, the individuals are periodically regrouped to exchange information among the three subpopulations. The performance of EHA is evaluated on two benchmark suites (IEEE CEC2005 and IEEE CEC2014), and the results are compared with those of several other algorithms. The results indicate that EHA has excellent optimization performance.
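The four steps described above can be sketched in code. The following is a minimal illustrative sketch, not the authors' implementation: the specific operators (local Gaussian refinement, a DE/rand/1-style mutation, and attraction toward the current best), the step-size constants, and the per-generation regrouping schedule are all assumptions chosen only to show how three heterogeneous operators can act on fitness-ranked subpopulations of one shared population.

```python
import numpy as np

def sphere(x):
    """Benchmark objective: sphere function (minimum 0 at the origin)."""
    return np.sum(x ** 2)

def eha_sketch(obj, dim=10, pop_size=30, generations=200, seed=0):
    """Illustrative multi-population heterogeneous ensemble.

    Each generation, the population is regrouped by fitness into three
    subpopulations, each evolved by a different operator, then merged
    back via greedy (parent-vs-offspring) selection.
    """
    rng = np.random.default_rng(seed)
    pop = rng.uniform(-5.0, 5.0, (pop_size, dim))
    fit = np.array([obj(x) for x in pop])

    for _ in range(generations):
        order = np.argsort(fit)              # regroup by fitness
        thirds = np.array_split(order, 3)    # three subpopulations
        new_pop = pop.copy()

        # Subpopulation 1 (best third): small Gaussian perturbation (exploitation).
        for i in thirds[0]:
            new_pop[i] = pop[i] + rng.normal(0.0, 0.1, dim)

        # Subpopulation 2 (middle third): DE/rand/1-style mutation.
        for i in thirds[1]:
            a, b, c = pop[rng.choice(pop_size, 3, replace=False)]
            new_pop[i] = a + 0.5 * (b - c)

        # Subpopulation 3 (worst third): move toward the current best (guided exploration).
        best = pop[order[0]]
        for i in thirds[2]:
            new_pop[i] = pop[i] + rng.random(dim) * (best - pop[i])

        # Greedy selection keeps the better of parent and offspring.
        new_fit = np.array([obj(x) for x in new_pop])
        improved = new_fit < fit
        pop[improved] = new_pop[improved]
        fit[improved] = new_fit[improved]

    return float(fit.min())
```

Because all three subpopulations write into one shared population and are re-ranked every generation, information flows between them automatically, which is the regrouping-based information exchange the abstract describes; the paper's actual grouping criteria and per-subpopulation algorithms differ from these placeholders.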



Funding

This work is partially supported by the National Natural Science Foundation of China (Grant Nos. 61976101, 62006091, and 62006092), the funding plan for scientific research activities of academic and technical leaders and reserve candidates in Anhui Province (Grant No. 2021H264), and the University Natural Science Research Project of Anhui Province (Grant Nos. KJ2019A0593 and KJ2019A0952).

Author information

Authors and Affiliations

Authors

Contributions

Conceptualization: Debao Chen; Methodology: Yu Deng; Software: Yu Deng, Yuan Chen; Writing - original draft preparation: Yu Deng; Writing - review and editing: Debao Chen, Feng Zou; Funding acquisition: Debao Chen, Ying Zheng; Visualization: Feng Zou; Validation: Ying Zheng, Minglan Fu; Investigation: Chun Wang.

Corresponding author

Correspondence to Debao Chen.

Ethics declarations

Conflict of Interest

All authors certify that they have no affiliations with or involvement in any organization or entity with any financial interest or non-financial interest in the subject matter or materials discussed in this manuscript.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

The original online version of this article was revised due to incorrect Funding information.


About this article


Cite this article

Deng, Y., Chen, D., Zou, F. et al. Heterogeneous ensemble algorithms for function optimization. Appl Intell 52, 13310–13338 (2022). https://doi.org/10.1007/s10489-022-03197-w

