Abstract
Almost all optimization algorithms have algorithm-dependent parameters, and the settings of such parameters can significantly influence the behavior of the algorithm under consideration. Proper parameter tuning should therefore be carried out to ensure that the algorithm performs well and is sufficiently robust for solving different types of optimization problems. In this study, the Firefly Algorithm (FA) is used to evaluate the influence of its parameter values on its efficiency. Parameter values are initialized using both the standard Monte Carlo method and the quasi-Monte Carlo method, and the resulting values are then used to tune the FA. Two benchmark functions and a spring design problem are used to test the robustness of the tuned FA. The preliminary findings indicate that the Monte Carlo and quasi-Monte Carlo methods produce similar results in terms of optimal fitness values. Numerical experiments with both methods on the benchmark functions and the spring design problem show no major variations in the final fitness values, irrespective of the sample values selected during the simulations. This insensitivity indicates the robustness of the FA.
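As a rough illustration of how such sampling-based tuning could be set up, the sketch below draws candidate FA parameter settings with a pseudo-random generator (standard Monte Carlo) and with a scrambled Sobol low-discrepancy sequence as one possible quasi-Monte Carlo generator. The parameter names (alpha, beta0, gamma), their ranges, the sample size, and the use of SciPy's `qmc` module are illustrative assumptions and are not taken from the paper.

```python
# Minimal sketch (assumptions noted above): sampling candidate FA parameter
# settings by standard Monte Carlo versus quasi-Monte Carlo (Sobol).
import numpy as np
from scipy.stats import qmc

# Assumed tuning ranges for the three main FA parameters:
#   alpha - randomization strength, beta0 - attractiveness, gamma - light absorption
lower = np.array([0.0, 0.1, 0.01])   # lower bounds for [alpha, beta0, gamma] (assumed)
upper = np.array([1.0, 1.0, 10.0])   # upper bounds for [alpha, beta0, gamma] (assumed)

n_samples = 16  # number of candidate parameter settings per method

# Standard Monte Carlo: independent pseudo-random uniform draws in the box.
rng = np.random.default_rng(seed=1)
mc_samples = lower + (upper - lower) * rng.random((n_samples, 3))

# Quasi-Monte Carlo: scrambled Sobol points, scaled to the same box.
sobol = qmc.Sobol(d=3, scramble=True, seed=1)
qmc_samples = qmc.scale(sobol.random(n_samples), lower, upper)

def run_fa(alpha, beta0, gamma):
    """Hypothetical hook: run the FA with these parameters on a benchmark
    (e.g. Rosenbrock) and return the best fitness found."""
    raise NotImplementedError  # plug in an FA implementation here

# Each candidate setting would then be scored by the fitness the tuned FA achieves:
# scores_mc  = [run_fa(*p) for p in mc_samples]
# scores_qmc = [run_fa(*p) for p in qmc_samples]
```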
Copyright information
© 2024 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this paper
Joy, G., Huyck, C., Yang, X.-S. (2024). Parameter Tuning of the Firefly Algorithm by Standard Monte Carlo and Quasi-Monte Carlo Methods. In: Franco, L., de Mulatier, C., Paszynski, M., Krzhizhanovskaya, V.V., Dongarra, J.J., Sloot, P.M.A. (eds.) Computational Science – ICCS 2024. Lecture Notes in Computer Science, vol. 14836. Springer, Cham. https://doi.org/10.1007/978-3-031-63775-9_17
Print ISBN: 978-3-031-63774-2
Online ISBN: 978-3-031-63775-9