Abstract
Among the many recently proposed optimization methods, there is a growing concern that algorithms with different names are in fact quite similar. This raises a crucial question: does a new source of inspiration justify assigning a new name to an optimization algorithm when its functionality closely mirrors, or merely simplifies, an existing, well-known method? This paper takes a close look at the Grasshopper Optimization Algorithm (GOA), examining its underlying concepts and comparing them to several variants of Particle Swarm Optimization (PSO). Our findings lead to a noteworthy conclusion: despite its branding as a novel algorithm, GOA is not new, but can be viewed as a derivative of PSO.
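To make the comparison concrete, the update rules of the two algorithms can be sketched side by side. The following Python sketch follows the canonical PSO update with an inertia weight (Kennedy and Eberhart, 1995; Shi and Eberhart, 1998) and the GOA position update of Saremi et al. (2017). The parameter values (`w`, `c1`, `c2`, `f = 0.5`, `l = 1.5`) and function names are illustrative defaults chosen here, not the exact configuration analyzed in the paper.

```python
import numpy as np

def pso_step(x, v, pbest, gbest, w=0.7, c1=1.5, c2=1.5, rng=None):
    """One canonical PSO update with inertia weight w.
    x, v, pbest: (n, d) arrays; gbest: (d,) array."""
    if rng is None:
        rng = np.random.default_rng()
    r1, r2 = rng.random(x.shape), rng.random(x.shape)
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    return x + v, v

def s(r, f=0.5, l=1.5):
    """GOA social-forces function s(r) = f * exp(-r/l) - exp(-r)."""
    return f * np.exp(-r / l) - np.exp(-r)

def goa_step(X, target, c, lb, ub):
    """One GOA position update for all agents X (n, d).
    `target` is the best position found so far; `c` is the
    linearly decreasing coefficient; lb, ub bound each dimension."""
    n, d = X.shape
    X_new = np.empty_like(X)
    half_range = (ub - lb) / 2.0
    for i in range(n):
        social = np.zeros(d)
        for j in range(n):
            if i == j:
                continue
            dist = np.linalg.norm(X[j] - X[i])
            unit = (X[j] - X[i]) / (dist + 1e-12)  # unit vector toward agent j
            social += c * half_range * s(dist) * unit
        # Both algorithms pull each agent toward the best-known position;
        # GOA's social term plays a role analogous to PSO's velocity terms.
        X_new[i] = c * social + target
    return X_new
```

The structural resemblance is visible directly: in both update rules, each agent's new position is the best-known position plus a perturbation term built from attraction toward other solutions, which is the core of the argument developed in the paper.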
Acknowledgements
The work of J.V. was supported by a grant from the Special Research Fund (BOF) of Ghent University (BOF/STA/202109/039).
Copyright information
© 2024 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this paper
Harandi, N., Van Messem, A., De Neve, W., Vankerschaver, J. (2024). Grasshopper Optimization Algorithm (GOA): A Novel Algorithm or A Variant of PSO?. In: Hamann, H., et al. Swarm Intelligence. ANTS 2024. Lecture Notes in Computer Science, vol 14987. Springer, Cham. https://doi.org/10.1007/978-3-031-70932-6_7
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-70931-9
Online ISBN: 978-3-031-70932-6