Abstract
We consider the problem of maximizing a OneMax-like function defined over an alphabet of size r. In previous work [GECCO 2016] we investigated how three different mutation operators influence the performance of Randomized Local Search (RLS) and the (1+1) Evolutionary Algorithm. This work revealed that none of these natural mutation operators is superior to the other two for any choice of r. We also gave in [GECCO 2016] some indication that the best achievable runtime for large r is \(\varTheta (n \log r (\log n + \log r))\), regardless of how the mutation operator is chosen, as long as the choice is static (i.e., the distribution used to vary the current individual does not change over time).
In this work we show that we can achieve better performance if we allow for adaptive mutation operators. More precisely, we analyze the performance of RLS using a self-adjusting mutation strength. In this algorithm the size of the steps taken in each iteration depends on the success of previous iterations: the mutation strength is increased after a successful iteration and decreased otherwise. We show that this idea yields an expected optimization time of \(\varTheta (n (\log n + \log r))\), which is optimal among all comparison-based search heuristics. This is the first time that self-adjusting parameter choices are shown to outperform static choices on a discrete multi-valued optimization problem.
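To make the self-adjusting mechanism concrete, the following is a minimal sketch of an RLS variant with a multiplicative step-size update on a multi-valued OneMax-like objective (distance to a hidden target). The function name, the global (rather than per-position) step size, and the particular update factors a and b are illustrative assumptions; the paper's \({\text {RLS}} _{a,b}\) is defined more carefully.

```python
import random

def rls_self_adjusting(r, n, target, budget=100_000, a=2.0, b=0.5):
    """Minimize sum_i |x_i - target_i| over {0, ..., r-1}^n with RLS.

    The step size is multiplied by a (> 1) after a strict improvement
    and by b (< 1) otherwise, clamped to the range [1, r].
    Hypothetical sketch, not the exact algorithm from the paper.
    """
    x = [random.randrange(r) for _ in range(n)]
    step = 1.0
    fit = sum(abs(xi - ti) for xi, ti in zip(x, target))
    for _ in range(budget):
        if fit == 0:
            break
        i = random.randrange(n)                       # pick a random position
        delta = max(1, int(step)) * random.choice((-1, 1))
        y = min(r - 1, max(0, x[i] + delta))          # stay inside the alphabet
        new_fit = fit - abs(x[i] - target[i]) + abs(y - target[i])
        if new_fit < fit:                             # success: accept, grow step
            x[i], fit = y, new_fit
            step = min(step * a, r)
        else:                                         # failure: reject, shrink step
            step = max(step * b, 1.0)
    return x, fit
```

The intuition matching the analysis: while the current value is far from the target, successes are frequent and the step size grows geometrically; near the target, failures dominate and the step size shrinks back toward 1, so each coordinate is located in roughly logarithmically many steps.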
Notes
1. Following the terminology introduced in [5, Sect. 3.1] we distinguish between functionally-dependent and self-adjusting parameter choices. Functionally-dependent parameter choices depend only on the current state of the algorithm and may explicitly use absolute fitness values; fitness-dependent mutation rates are a typical example. Self-adjusting parameter choices, in contrast, do not depend on absolute fitness information but rather on the success of previous iterations. This is the case for the parameter updates of the \({\text {RLS}} _{a,b}\) considered in this work.
References
Auger, A., Doerr, B.: Theory of Randomized Search Heuristics. World Scientific, Singapore (2011)
Auger, A., Hansen, N.: Linear convergence on positively homogeneous functions of a comparison based step-size adaptive randomized search: the (1+1) ES with generalized one-fifth success rule. CoRR, abs/1310.8397 (2013). http://arxiv.org/abs/1310.8397
Böttcher, S., Doerr, B., Neumann, F.: Optimal fixed and adaptive mutation rates for the LeadingOnes problem. In: Schaefer, R., Cotta, C., Kołodziej, J., Rudolph, G. (eds.) PPSN XI. LNCS, vol. 6238, pp. 1–10. Springer, Heidelberg (2010)
Dietzfelbinger, M., Rowe, J.E., Wegener, I., Woelfel, P.: Tight bounds for blind search on the integers and the reals. Comb. Probab. Comput. 19, 711–728 (2010)
Doerr, B., Doerr, C.: Optimal parameter choices through self-adjustment: applying the 1/5-th rule in discrete settings. In: Proceedings of the ACM Genetic and Evolutionary Computation Conference (GECCO 2015), pp. 1335–1342. ACM (2015)
Doerr, B., Doerr, C., Ebel, F.: From black-box complexity to designing new genetic algorithms. Theor. Comput. Sci. 567, 87–104 (2015)
Doerr, B., Doerr, C., Kötzing, T.: The right mutation strength for multi-valued decision variables. In: Proceedings of the ACM Genetic and Evolutionary Computation Conference (GECCO 2016). ACM (2016, to appear). http://arxiv.org/abs/1604.03277
Doerr, B., Doerr, C., Yang, J.: Optimal parameter choices via precise black-box analysis. In: Proceedings of the ACM Genetic and Evolutionary Computation Conference (GECCO 2016). ACM (2016, to appear)
Doerr, B., Goldberg, L.A.: Adaptive drift analysis. Algorithmica 65, 224–250 (2013)
Doerr, B., Johannsen, D., Winzen, C.: Multiplicative drift analysis. Algorithmica 64, 673–697 (2012)
Droste, S., Jansen, T., Wegener, I.: Upper and lower bounds for randomized search heuristics in black-box optimization. Theor. Comput. Syst. 39, 525–544 (2006)
Eiben, A.E., Hinterding, R., Michalewicz, Z.: Parameter control in evolutionary algorithms. IEEE Trans. Evol. Comput. 3, 124–141 (1999)
Eiben, A.E., Smith, J.E.: Introduction to Evolutionary Computing. Springer, Heidelberg (2003)
Hansen, N., Gawelczyk, A., Ostermeier, A.: Sizing the population with respect to the local progress in (1,\( \lambda \))-evolution strategies - a theoretical analysis. In: Proceedings of the IEEE Congress on Evolutionary Computation (CEC 1995), pp. 80–85. IEEE (1995)
Jägersküpper, J.: Rigorous runtime analysis of the (1+1) ES: 1/5-rule and ellipsoidal fitness landscapes. In: Wright, A.H., Vose, M.D., De Jong, K.A., Schmitt, L.M. (eds.) FOGA 2005. LNCS, vol. 3469, pp. 260–281. Springer, Heidelberg (2005)
Jägersküpper, J.: Oblivious randomized direct search for real-parameter optimization. In: Halperin, D., Mehlhorn, K. (eds.) ESA 2008. LNCS, vol. 5193, pp. 553–564. Springer, Heidelberg (2008)
Karafotias, G., Hoogendoorn, M., Eiben, A.: Parameter control in evolutionary algorithms: trends and challenges. IEEE Trans. Evol. Comput. 19, 167–187 (2015)
Kötzing, T.: Concentration of first hitting times under additive drift. Algorithmica 75, 490–506 (2016)
Lässig, J., Sudholt, D.: Adaptive population models for offspring populations and parallel evolutionary algorithms. In: Proceedings of the ACM Workshop on Foundations of Genetic Algorithms (FOGA 2011), pp. 181–192. ACM (2011)
Rudolph, G.: An evolutionary algorithm for integer programming. In: Davidor, Y., Schwefel, H.-P., Männer, R. (eds.) PPSN III. LNCS, vol. 866, pp. 139–148. Springer, Heidelberg (1994)
Acknowledgments
This research benefited from the support of the “FMJH Program Gaspard Monge in optimization and operation research”, and from the support to this program from EDF (Électricité de France).
Copyright information
© 2016 Springer International Publishing AG
Cite this paper
Doerr, B., Doerr, C., Kötzing, T. (2016). Provably Optimal Self-adjusting Step Sizes for Multi-valued Decision Variables. In: Handl, J., Hart, E., Lewis, P., López-Ibáñez, M., Ochoa, G., Paechter, B. (eds) Parallel Problem Solving from Nature – PPSN XIV. PPSN 2016. Lecture Notes in Computer Science(), vol 9921. Springer, Cham. https://doi.org/10.1007/978-3-319-45823-6_73
DOI: https://doi.org/10.1007/978-3-319-45823-6_73
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-45822-9
Online ISBN: 978-3-319-45823-6