ABSTRACT
In the last two decades, significant effort has been devoted to solving computationally expensive optimization problems using surrogate models. Whether surrogates are the primary drivers of an algorithm or merely accelerate the convergence of an existing method, most proposed concepts are rather specific and generalize poorly. Important design considerations include the choice of a baseline optimization algorithm, a suitable surrogate methodology, and the extent of the surrogate's involvement in the overall algorithm. This paper proposes a probabilistic surrogate-assisted framework (PSAF) and demonstrates its applicability to a broad category of single-objective optimization methods. The framework injects knowledge from a surrogate into an existing algorithm through a tournament-based procedure and by continuing the optimization run on the surrogate's predictions. The surrogate's involvement is governed by a replacement probability that is updated based on the surrogate's accuracy in past iterations. A study of four well-known population-based optimization algorithms, with and without the proposed probabilistic surrogate assistance, indicates its usefulness in achieving better convergence. The proposed framework enables surrogates to be incorporated into an existing optimization algorithm and thus paves the way for new surrogate-assisted algorithms that address less frequently studied challenges in computationally expensive optimization, such as different variable types, high-dimensional problems, multiple objectives, and constraints.
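The mechanism described above can be sketched in a few lines of code. The following is a minimal illustration only: the helper names, the pairwise accuracy measure, and the exponential update rule for the replacement probability are illustrative assumptions, not the paper's exact formulation.

```python
import random

# Hypothetical sketch of probabilistic surrogate assistance (PSAF-style).
# The update rule and accuracy proxy below are assumptions for illustration.

def pairwise_accuracy(surrogate, true_f, samples):
    """Fraction of sample pairs that the surrogate orders the same way
    as the expensive function -- a simple proxy for surrogate accuracy."""
    pairs = [(a, b) for i, a in enumerate(samples) for b in samples[i + 1:]]
    agree = sum((surrogate(a) <= surrogate(b)) == (true_f(a) <= true_f(b))
                for a, b in pairs)
    return agree / len(pairs)

def update_replacement_probability(beta, accuracy, rate=0.5):
    """Nudge the probability of consulting the surrogate toward its
    observed accuracy from past iterations (assumed update rule)."""
    return (1 - rate) * beta + rate * accuracy

def tournament(a, b, surrogate, true_f, beta, rng=random):
    """Binary tournament for minimization: with probability beta the
    comparison uses the surrogate's prediction instead of an expensive
    evaluation of true_f."""
    f = surrogate if rng.random() < beta else true_f
    return a if f(a) <= f(b) else b
```

With `beta = 0` the tournament degenerates to the baseline algorithm's ordinary selection on the true objective; a surrogate that has ranked solutions accurately in past iterations drives `beta`, and hence its involvement, upward over time.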
Supplemental Material
p652-blank_suppl.tgz
Index Terms
- PSAF: a probabilistic surrogate-assisted framework for single-objective optimization