Abstract
Surrogate-model-based evolution strategies (like the doubly trained surrogate model CMA-ES, DTS-CMA-ES) use a model of the objective function to reduce the number of function evaluations during optimization. This work investigates using the expected selection weights, averaged over the GP posterior distribution, as a replacement for the fitness, and guiding the selection of points for true evaluation via the variance of those weights. Results obtained on BBOB show that the proposed technique performs on par with current strategies and allows the use of surrogate models that are invariant to strictly increasing transformations of the function values. However, initial experiments showed that simply modeling ranks in the GP leads to worse results than current GP models of the function values.
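To make the idea concrete, below is a minimal Monte Carlo sketch of the two quantities the abstract refers to: the expected recombination weight of each candidate under the GP posterior (used in place of the fitness) and the variance of that weight (used to pick points for true evaluation). It assumes a multivariate normal GP posterior over the \(\lambda\) candidate points and the usual descending CMA-ES recombination weights; the function name expected_weights and the sampling-based estimator are illustrative assumptions, not necessarily the paper's computation, which may be analytic.

```python
import numpy as np

def expected_weights(post_mean, post_cov, weights, n_samples=1000, seed=None):
    """Monte Carlo estimate of E[w_i] and Var[w_i] under a GP posterior.

    post_mean : (lam,) GP posterior mean of f at the lam candidate points.
    post_cov  : (lam, lam) GP posterior covariance at those points.
    weights   : (lam,) recombination weights for ranks 1..lam, descending
                (w_1 >= ... >= w_lam), as in CMA-ES.
    Returns the per-candidate expectation and variance of the weight each
    candidate would receive, averaged over posterior samples of f.
    """
    rng = np.random.default_rng(seed)
    # Joint posterior samples of the unknown function values (minimization).
    f_samples = rng.multivariate_normal(post_mean, post_cov, size=n_samples)
    # Rank candidates within each sample: rank 0 = smallest sampled value.
    ranks = np.argsort(np.argsort(f_samples, axis=1), axis=1)
    # Weight each candidate receives under each sampled ranking.
    w = np.asarray(weights)[ranks]
    return w.mean(axis=0), w.var(axis=0)
```

In a DTS-style double evaluation step, one plausible use of these quantities is to evaluate the candidates with the largest weight variance on the true objective, retrain the GP, and recombine the remaining candidates via their expected weights. Because ranks, and hence the assigned weights, are unchanged by any strictly increasing transformation of the sampled function values, the criterion inherits the invariance property mentioned above.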
Notes
1. The description of the \(h_\sigma \in \{0,1\}\) mechanism is omitted for brevity.
Copyright information
© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG
About this paper
Cite this paper
Krause, O. (2022). Recombination Weight Based Selection in the DTS-CMA-ES. In: Rudolph, G., Kononova, A.V., Aguirre, H., Kerschke, P., Ochoa, G., Tušar, T. (eds.) Parallel Problem Solving from Nature – PPSN XVII. PPSN 2022. Lecture Notes in Computer Science, vol. 13399. Springer, Cham. https://doi.org/10.1007/978-3-031-14721-0_21
DOI: https://doi.org/10.1007/978-3-031-14721-0_21
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-14720-3
Online ISBN: 978-3-031-14721-0
eBook Packages: Computer Science, Computer Science (R0)