
Grasshopper Optimization Algorithm (GOA): A Novel Algorithm or A Variant of PSO?

  • Conference paper
  • First Online:
Swarm Intelligence (ANTS 2024)

Abstract

Among newly proposed optimization methods, there is a growing concern that many algorithms, despite carrying different names, are in fact highly similar. This raises a crucial question: does a new source of inspiration justify assigning a new name to an optimization algorithm when its functionality closely mirrors, or merely simplifies, an existing, well-established method? This paper takes a close look at the Grasshopper Optimization Algorithm (GOA), investigating its underlying concepts and comparing them to several variants of Particle Swarm Optimization (PSO). Our findings lead to a noteworthy conclusion: despite its branding as a novel algorithm, GOA is not new, but can be viewed as a derivative of PSO.
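For readers unfamiliar with the PSO baseline against which GOA is compared, the canonical inertia-weight PSO update (a velocity term plus cognitive and social attraction terms) can be sketched as follows. This is a minimal illustration, not the paper's experimental implementation; the parameter values (`w=0.7`, `c1=c2=1.5`) are typical defaults, not the settings used in the study.

```python
import numpy as np

def pso_minimize(f, bounds, n_particles=30, n_iter=200,
                 w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimise f over a box using the canonical inertia-weight PSO update."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    dim = len(lo)
    x = rng.uniform(lo, hi, size=(n_particles, dim))   # particle positions
    v = np.zeros_like(x)                               # particle velocities
    pbest = x.copy()                                   # personal best positions
    pbest_val = np.apply_along_axis(f, 1, x)           # personal best values
    gbest = pbest[pbest_val.argmin()].copy()           # global best position
    for _ in range(n_iter):
        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        # velocity update: inertia + cognitive (pbest) + social (gbest) pulls
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)                     # keep particles in bounds
        vals = np.apply_along_axis(f, 1, x)
        improved = vals < pbest_val                    # update personal bests
        pbest[improved] = x[improved]
        pbest_val[improved] = vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, pbest_val.min()
```

GOA replaces the velocity term with a sum of pairwise "social force" interactions between grasshoppers plus an attraction toward the best-known position; the paper's argument is that, structurally, this reduces to the same attraction-toward-best dynamic as PSO variants.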



Acknowledgements

The work of J.V. was supported by a grant from the Special Research Fund (BOF) of Ghent University (BOF/STA/202109/039).

Author information


Corresponding author

Correspondence to Negin Harandi .


Copyright information

© 2024 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Harandi, N., Van Messem, A., De Neve, W., Vankerschaver, J. (2024). Grasshopper Optimization Algorithm (GOA): A Novel Algorithm or A Variant of PSO?. In: Hamann, H., et al. Swarm Intelligence. ANTS 2024. Lecture Notes in Computer Science, vol 14987. Springer, Cham. https://doi.org/10.1007/978-3-031-70932-6_7


  • DOI: https://doi.org/10.1007/978-3-031-70932-6_7

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-70931-9

  • Online ISBN: 978-3-031-70932-6

  • eBook Packages: Computer Science (R0)
