
Buggy Pinball: A Novel Single-point Meta-heuristic for Global Continuous Optimization

  • Conference paper
  • Artificial Intelligence and Soft Computing (ICAISC 2022)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 13589)


Abstract

In this work, we propose a fundamentally novel single-point meta-heuristic designed for continuous optimization. Our algorithm continuously improves on a solution via a trajectory-based search inspired by the pinball arcade game, in an anytime optimization manner. We evaluate our algorithm against widely employed meta-heuristics on several standard test-bed functions of various dimensionalities. Our algorithm exhibits high precision and superior accuracy compared to the benchmarks, especially when complex configuration spaces are considered.
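The full algorithm appears only in the paper's body; for orientation, the anytime, single-point search pattern the abstract describes can be sketched generically as follows. This is a plain greedy perturbation loop, not the Buggy Pinball trajectory mechanics; the function name and parameters are illustrative assumptions.

```python
import random

def anytime_single_point_search(f, x0, step=0.5, iters=2000, seed=0):
    """Generic single-point anytime search sketch (NOT the paper's
    Buggy Pinball algorithm): perturb the current point, keep
    improvements, and always retain a best-so-far solution that can
    be returned whenever the search is interrupted."""
    rng = random.Random(seed)
    x, fx = list(x0), f(x0)
    best_x, best_f = list(x0), fx
    for _ in range(iters):
        cand = [xi + rng.uniform(-step, step) for xi in x]
        fc = f(cand)
        if fc < fx:                    # move the single search point
            x, fx = cand, fc
        if fc < best_f:                # anytime property: best-so-far
            best_x, best_f = cand, fc  # is always available
    return best_x, best_f

# Example on the sphere function, a standard test-bed objective:
sphere = lambda v: sum(vi * vi for vi in v)
x, fx = anytime_single_point_search(sphere, [3.0, -2.0])
```

The "anytime" property the abstract refers to means the loop maintains a valid incumbent at every iteration, so stopping early still yields a usable solution.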


Notes

  1. The sign of the elevation angle is irrelevant, as the square of its sine value is considered.

  2. We experimented with trajectories alternating from one direction to another; however, we found this took a toll on exploration. Thus, the algorithm does not behave exactly like a pinball, although the final trajectories do resemble pinball movement.

  3. A crossing of the objective function means that a common point of the objective function and the current trajectory segment has been detected.
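The crossing detection described in note 3 can be illustrated with a small sketch: parametrize the straight trajectory segment between two endpoints, and look for a sign change of the signed height above the objective surface. This is an illustrative reconstruction, not the paper's implementation; the endpoint representation `(x, z)` and the bisection refinement are assumptions.

```python
def find_crossing(f, p_a, p_b, tol=1e-8):
    """Locate a common point of the objective curve f and the straight
    trajectory segment from p_a to p_b (each an (x, z) pair), i.e. a
    'crossing' in the sense of note 3. Works by bisection on the signed
    height g(t) = z(t) - f(x(t)) along the segment, t in [0, 1]."""
    def g(t):
        x = p_a[0] + t * (p_b[0] - p_a[0])
        z = p_a[1] + t * (p_b[1] - p_a[1])
        return z - f(x)
    if g(0.0) * g(1.0) > 0:
        return None                    # segment stays on one side: no crossing
    lo, hi = 0.0, 1.0
    while hi - lo > tol:               # shrink the bracket around the sign change
        mid = 0.5 * (lo + hi)
        if g(lo) * g(mid) <= 0:
            hi = mid
        else:
            lo = mid
    t = 0.5 * (lo + hi)
    return p_a[0] + t * (p_b[0] - p_a[0])

# Segment from (x=-1, z=2) down to (x=1, z=0) over f(x) = x**2
# crosses the parabola at x = (sqrt(5) - 1) / 2:
x_cross = find_crossing(lambda x: x * x, (-1.0, 2.0), (1.0, 0.0))
```

A segment lying entirely above (or below) the surface yields no sign change, and the sketch reports no crossing for it.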


Author information

Correspondence to Vasileios Lymperakis.


Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Lymperakis, V., Panagopoulos, A.A. (2023). Buggy Pinball: A Novel Single-point Meta-heuristic for Global Continuous Optimization. In: Rutkowski, L., Scherer, R., Korytkowski, M., Pedrycz, W., Tadeusiewicz, R., Zurada, J.M. (eds) Artificial Intelligence and Soft Computing. ICAISC 2022. Lecture Notes in Computer Science, vol 13589. Springer, Cham. https://doi.org/10.1007/978-3-031-23480-4_22


  • DOI: https://doi.org/10.1007/978-3-031-23480-4_22

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-23479-8

  • Online ISBN: 978-3-031-23480-4

  • eBook Packages: Computer Science, Computer Science (R0)
