Abstract
This paper describes an experimental investigation into four nature-inspired population-based continuous optimisation methods: the Bees Algorithm, Evolutionary Algorithms, Particle Swarm Optimisation, and the Artificial Bee Colony algorithm. The aim of the study is to understand and compare the specific capabilities of each optimisation algorithm. For each algorithm, thirty-two configurations covering different combinations of operators and learning parameters were examined. To evaluate the optimisation procedures, twenty-five function minimisation benchmarks were designed by the authors. The benchmark set covers many diverse fitness landscapes and constitutes a contribution to the systematic study of optimisation techniques and operators. The experimental results highlight the strengths and weaknesses of the algorithms and configurations tested, and reveal the existence and extent of origin and alignment search biases associated with different recombination operators. The analysis of the results reveals regularities that help to identify some of the crucial issues in the choice and configuration of search algorithms.
Additional information
Communicated by R. John.
Appendices
Appendix A: Optimisation benchmarks
1.1 General features
All functions are defined within the interval:
\(I =\{x\in \mathbb {R}^{n}; -100<x_{i}<100,\ i=1,2,\ldots ,n\}.\)
All but functions 23, 24, and 25 are two-dimensional.
For all functions, \(f(x^{\mu })=0\) where \(x^{\mu }=(x^{\mu }_{1},\ldots ,x^{\mu }_{n})\) is the global minimum point.
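The conventions above can be sketched in code. The snippet below is an illustrative harness only: the sphere function is a hypothetical stand-in, not one of the twenty-five benchmarks, and serves merely to show the shared domain and the \(f(x^{\mu })=0\) convention.

```python
# Illustrative sketch of the benchmark conventions described above:
# every function is defined on the open box I = (-100, 100)^n and its
# global minimum value is 0. The sphere function is a stand-in for
# illustration, not one of the paper's benchmarks.
LOWER, UPPER = -100.0, 100.0

def in_domain(x):
    """Check that a candidate solution lies inside the open box I."""
    return all(LOWER < xi < UPPER for xi in x)

def sphere(x):
    """Hypothetical benchmark: minimum value 0 at the origin."""
    return sum(xi * xi for xi in x)

def error(f, x, f_min=0.0):
    """Optimisation error; well defined because f(x_mu) = 0."""
    return f(x) - f_min
```

Because every benchmark attains exactly zero at its global minimum, the raw function value doubles as the optimisation error, which simplifies the comparison of algorithms across landscapes.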
1.2 Multimodal functions
1.2.1 Function 1
This function maps two “holes”. The global optimum is located at the origin of the search space.
1.2.2 Function 2
This function maps two “holes”. The global optimum is located far from the origin of the search space.
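A landscape of this two-"hole" kind can be sketched as two inverted Gaussian wells, one deeper than the other. The Gaussian form, all constants, and the hole positions below are assumptions for illustration; they are not the paper's actual definitions of functions 1 and 2. With the default `centre` the global optimum lies far from the origin (as in function 2); `centre=(0.0, 0.0)` gives the origin-centred variant (as in function 1).

```python
import math

# Illustrative two-"hole" minimisation landscape: a shallow local
# basin and a deeper global basin. All constants are assumptions.
def two_holes(x, centre=(60.0, 60.0)):
    """Two inverted Gaussian holes; the one at `centre` is deeper.

    Shifted so the global minimum value is (approximately) 0,
    matching the convention f(x_mu) = 0.
    """
    def hole(px, py, cx, cy, depth, width):
        d2 = (px - cx) ** 2 + (py - cy) ** 2
        return depth * math.exp(-d2 / (2.0 * width ** 2))

    px, py = x
    shallow = hole(px, py, -60.0, -60.0, 0.8, 15.0)   # local minimum
    deep = hole(px, py, centre[0], centre[1], 1.0, 15.0)  # global minimum
    return 1.0 - shallow - deep
```

Pairs of landscapes that differ only in the position of the optimum, like functions 1 and 2, are what make origin-seeking bias observable: an origin-biased algorithm scores well on the first member of the pair and poorly on the second.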
1.2.3 Function 3
This function maps nine basins of equal size arranged on a grid. The global minimum corresponds to one of the lateral basins of the grid.
1.2.4 Function 4
This function describes the fitness landscape of function 3 rotated by \(4\pi /9\).
1.2.5 Function 5
This function maps eleven large secondary basins, and a narrow and steep hole near the top-right corner of the search space where the global minimum is located. The secondary minima are arranged on a grid.
1.2.6 Function 6
This function maps eleven large secondary basins, and a narrow and steep hole at the origin of the search space where the global minimum is located. The secondary minima are arranged on a grid.
1.2.7 Function 7
This function maps eleven large secondary basins, and a narrow and steep hole near the top-right corner of the search space where the global minimum is located. The secondary minima are not arranged on a grid.
1.2.8 Function 8
This function maps eleven large secondary basins, and a narrow and steep hole at the origin of the search space where the global minimum is located. The secondary minima are not arranged on a grid.
1.3 Minimum surrounded by flat surface
1.3.1 Function 9
This function maps a multimodal search surface. The global minimum lies at the origin of the search space, and its basin is characterised by a large flat step and a steep and narrow ending. Four secondary minima are arranged on a grid.
1.3.2 Function 10
This function maps a multimodal search surface. The global minimum lies far from the origin of the search space, and its basin is characterised by a large flat step and a steep and narrow ending. Four secondary minima are arranged on a grid.
1.3.3 Function 11
This function maps a multimodal search surface. The global minimum lies at the origin of the search space, and its basin is characterised by a large flat step and a steep and narrow ending. Four secondary minima are not arranged on a grid.
1.3.4 Function 12
This function maps a multimodal search surface. The global minimum lies far from the origin of the search space, and its basin is characterised by a large flat step and a steep and narrow ending. Four secondary minima are not arranged on a grid.
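The distinctive feature of functions 9–12, a basin with a large flat step and a steep, narrow ending, can be sketched as follows. The radii and values below are assumptions chosen only to reproduce the qualitative shape, not the paper's definitions.

```python
import math

# Illustrative basin "with a large flat step and a steep and narrow
# ending": a plateau of value 0.5 within radius 20 of the minimum,
# pierced by a narrow cone of radius 2 descending to 0. All radii
# and values are assumptions for illustration.
def flat_step_basin(x, centre=(0.0, 0.0)):
    r = math.hypot(x[0] - centre[0], x[1] - centre[1])
    if r < 2.0:                        # steep, narrow ending
        return 0.5 * (r / 2.0)
    if r < 20.0:                       # large flat step: no local slope
        return 0.5
    return 0.5 + 0.05 * (r - 20.0)     # outer slope into the basin
```

On the plateau the function value gives no directional information, so purely slope-following search stalls there; only algorithms that keep sampling broadly within the basin can locate the narrow ending.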
1.4 Narrow valleys
1.4.1 Function 13
This function maps two valleys on opposite sides of the search space.
1.4.2 Function 14
This function maps two pairs of valleys of opposite slopes that represent competing basins of attraction. The minimum is located near the borders of the search surface at the end of the narrowest valley. The four valleys join at the origin, where a further basin is located.
1.4.3 Function 15
This function maps a narrow parabolic valley surrounded by a large flat surface. The valley is located in the half plane of positive \(x_{1}\) values (\(x_{1}>0\)). The other half of the fitness landscape is covered by a slanted plane.
1.5 Wavelike
1.5.1 Function 16
This function combines two cosinusoidal functions. Each function depends on one of the two input variables, and its amplitude increases linearly with the associated variable. The global minimum is in a narrow hole that is added to one of the “pockets” of the search surface. Function 16 has an overall unimodal characteristic corresponding to a plane slanted toward the positive values of the two input variables. The optimum is far from the origin and does not correspond to the minimum of the unimodal characteristic (i.e. the slanted plane).
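The construction described for function 16 can be sketched as a slanted plane plus two cosine terms whose amplitude grows linearly with their own variable. All coefficients and periods below are assumptions, and the narrow hole added at the optimum is omitted; the sketch shows only the wavelike-over-unimodal structure.

```python
import math

# Illustrative wavelike-over-unimodal landscape: a plane slanted
# toward positive x1, x2 plus two cosine terms with linearly growing
# amplitude. All constants are assumptions; the added hole at the
# global optimum is omitted.
def wavelike(x):
    x1, x2 = x
    plane = -0.01 * (x1 + x2)                    # unimodal characteristic
    wave1 = 0.01 * abs(x1) * math.cos(0.5 * x1)  # amplitude grows with |x1|
    wave2 = 0.01 * abs(x2) * math.cos(0.5 * x2)  # amplitude grows with |x2|
    return plane + wave1 + wave2
```

Because the "pockets" deepen toward the positive corner while none of them need coincide with the minimum of the slanted plane itself, an algorithm that only tracks the coarse unimodal trend is drawn toward the wrong region.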
1.5.2 Function 17
This function combines two sinusoidal functions. The two functions have constant amplitude and variable period. The global minimum is in a narrow hole that is added to one of the “pockets” of the search surface.
1.6 “Noisy” unimodal
1.6.1 Function 18
This function has an overall unimodal behaviour with a cosinusoidal noise component. The magnitude of the noise component corresponds to 10 % of that of the unimodal curve. The minimum lies far from the origin.
1.6.2 Function 19
This function has an overall unimodal behaviour with a cosinusoidal noise component. The magnitude of the noise component corresponds to 25 % of that of the unimodal curve. The minimum lies far from the origin.
1.6.3 Function 20
This function has an overall unimodal behaviour with a cosinusoidal noise component. The magnitude of the noise component corresponds to 40 % of that of the unimodal curve. The minimum lies far from the origin.
1.6.4 Function 21
This function is similar to function 19 but the period of the cosinusoidal noise component is multiplied by a factor 4.
1.6.5 Function 22
This function is similar to function 19 but the period of the cosinusoidal noise component is divided by a factor 4.
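The family of "noisy" unimodal landscapes in functions 18–22 can be sketched as a smooth bowl plus a cosinusoidal ripple whose magnitude is a fixed fraction of the bowl's. The quadratic bowl, the off-origin `centre`, and the ripple `period` below are assumptions for illustration: raising `ratio` (0.10, 0.25, 0.40) mimics the progression from function 18 to 20, and rescaling `period` mimics functions 21 and 22.

```python
import math

# Illustrative "noisy" unimodal landscape: a quadratic bowl plus a
# non-negative cosinusoidal ripple, so the minimum value stays 0 at
# `centre`. All constants are assumptions for illustration.
def noisy_unimodal(x, ratio=0.10, period=10.0, centre=(50.0, 50.0)):
    d2 = (x[0] - centre[0]) ** 2 + (x[1] - centre[1]) ** 2
    bowl = d2 / 1e4                                   # smooth unimodal part
    r = math.sqrt(d2)
    ripple = ratio * (1.0 - math.cos(2.0 * math.pi * r / period))
    return bowl + ripple                              # minimum 0 at `centre`
```

The ripple creates rings of local minima around the true optimum; as `ratio` grows, the local basins deepen relative to the global trend and the landscape becomes harder for slope-following search.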
1.7 Different dimensionality
1.7.1 Function 23
This function maps seven secondary basins, and the global minimum is located far from the origin. The secondary minima are not arranged on a grid. The function is ten-dimensional.
1.7.2 Function 24
This function maps seven secondary basins, and the global minimum is located far from the origin. The secondary minima are not arranged on a grid. The function is fifteen-dimensional.
1.7.3 Function 25
This function maps seven secondary basins, and the global minimum is located far from the origin. The secondary minima are not arranged on a grid. The function is twenty-dimensional.
Appendix B: Algorithm configurations
Pham, D.T., Castellani, M. Benchmarking and comparison of nature-inspired population-based continuous optimisation algorithms. Soft Comput 18, 871–903 (2014). https://doi.org/10.1007/s00500-013-1104-9