Abstract
Surrogate modelling techniques have the potential to reduce the number of objective function evaluations needed to solve black-box optimization problems. However, most surrogate modelling techniques used with evolutionary algorithms today do not preserve the underlying algorithms' desirable invariance to order-preserving transformations of objective function values. We propose adaptive function value warping as a tool for reducing the sensitivity of algorithm behaviour to such transformations.
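To illustrate the invariance at stake, the sketch below shows a rank-based warping of objective values, the simplest monotone warping. This is not the adaptive warping proposed in the paper; the function `rank_warp` is a hypothetical helper, included only to show why a surrogate fitted to warped values behaves identically under any order-preserving transformation of the objective.

```python
import numpy as np

def rank_warp(values):
    """Map objective values to their ranks (a monotone warping).

    Any strictly increasing transformation of `values` leaves the
    ranks unchanged, so a surrogate model fitted to the warped
    values is invariant to order-preserving transformations of the
    objective function.
    """
    values = np.asarray(values, dtype=float)
    order = np.argsort(values)          # indices that sort the values
    ranks = np.empty(len(values), dtype=float)
    ranks[order] = np.arange(len(values))  # position in sorted order
    return ranks

# The warped targets are identical for f and any monotone g(f):
f_vals = np.array([3.0, 1.0, 2.0])
g_vals = np.exp(f_vals)  # exp is order-preserving
assert np.array_equal(rank_warp(f_vals), rank_warp(g_vals))
```

A purely rank-based warping discards all magnitude information; an adaptive warping instead adjusts a parameterized monotone transformation during the run, trading strict invariance for reduced sensitivity.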
Acknowledgements
This research was supported by the Natural Sciences and Engineering Research Council of Canada (NSERC).
Copyright information
© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG
About this paper
Cite this paper
Abbasnejad, A., Arnold, D.V. (2022). Adaptive Function Value Warping for Surrogate Model Assisted Evolutionary Optimization. In: Rudolph, G., Kononova, A.V., Aguirre, H., Kerschke, P., Ochoa, G., Tušar, T. (eds) Parallel Problem Solving from Nature – PPSN XVII. PPSN 2022. Lecture Notes in Computer Science, vol 13398. Springer, Cham. https://doi.org/10.1007/978-3-031-14714-2_6
DOI: https://doi.org/10.1007/978-3-031-14714-2_6
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-14713-5
Online ISBN: 978-3-031-14714-2
eBook Packages: Computer Science (R0)