Abstract
In most engineering problems, experiments for evaluating the performance of different setups are time-consuming, expensive, or both. Sequential experimental designs have therefore become an indispensable technique for optimizing the objective functions of such problems. In this context, most problems must be treated as black boxes: no function properties are known a priori that would allow choosing the best-suited class of surrogate models. We therefore propose a new ensemble-based approach that identifies the best surrogate model during the optimization process by means of reinforcement learning techniques. The procedure is general and can be applied to arbitrary ensembles of surrogate models. Results on 24 well-known black-box functions show that the progressive procedure selects suitable models from the ensemble and competes with state-of-the-art methods for sequential optimization.
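The sketch below is not the authors' PROGRESS algorithm; it is a minimal illustration of the core idea the abstract describes: treating the members of a surrogate ensemble as arms of a multi-armed bandit inside a sequential design loop. All concrete choices here are assumptions for illustration only: a 1-d test function, two base-R surrogates (a quadratic lm and a loess fit), an epsilon-greedy selection policy, grid minimization of the surrogate in place of a model-based infill criterion, and improvement over the incumbent best value as the reward signal.

```r
## Minimal sketch of bandit-style surrogate selection in a sequential design.
## NOT the paper's exact method; all modeling choices are illustrative.
set.seed(1)
f <- function(x) sin(3 * x) + 0.5 * x^2        # stand-in for an expensive black box

X <- runif(10, -2, 2); y <- f(X)               # small initial design
fit_models <- list(                            # the surrogate ensemble
  quad  = function(X, y) lm(y ~ poly(X, 2)),
  loess = function(X, y) loess(y ~ X, span = 0.75,
                               control = loess.control(surface = "direct")))
reward <- c(quad = 0, loess = 0); pulls <- c(quad = 0, loess = 0)
eps  <- 0.2                                    # exploration rate of the policy
grid <- seq(-2, 2, length.out = 200)           # candidate points

for (iter in 1:30) {
  ## epsilon-greedy selection of one ensemble member ("arm")
  arm <- if (runif(1) < eps || any(pulls == 0)) sample(names(fit_models), 1)
         else names(which.max(reward / pulls))
  model <- fit_models[[arm]](X, y)
  ## propose the next point by grid minimization of the surrogate prediction
  pred  <- predict(model, newdata = data.frame(X = grid))
  x_new <- grid[which.min(pred)] + rnorm(1, 0, 0.01)  # jitter avoids duplicates
  x_new <- min(max(x_new, -2), 2)                     # keep within the bounds
  y_new <- f(x_new)
  ## credit the chosen model with the achieved improvement (the reward)
  reward[arm] <- reward[arm] + max(0, min(y) - y_new)
  pulls[arm]  <- pulls[arm] + 1
  X <- c(X, x_new); y <- c(y, y_new)
}
cat("best value found:", min(y), "| pulls per model:", pulls, "\n")
```

The design choice worth noting is the credit-assignment step: the achieved improvement is attributed to whichever model proposed the point, so models that repeatedly suggest good points accumulate reward and are selected more often, while the epsilon term keeps every ensemble member occasionally re-evaluated.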
Notes
1. The weak global structure of this function is rooted in the existence of two basins of almost the same size.
Acknowledgements
This paper is based on investigations within project D5 of the Collaborative Research Center SFB/TR TRR 30 and project C2 of the Collaborative Research Center SFB 823, both kindly supported by the Deutsche Forschungsgemeinschaft (DFG).
Copyright information
© 2013 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Hess, S., Wagner, T., Bischl, B. (2013). PROGRESS: Progressive Reinforcement-Learning-Based Surrogate Selection. In: Nicosia, G., Pardalos, P. (eds.) Learning and Intelligent Optimization (LION 2013). Lecture Notes in Computer Science, vol. 7997. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-44973-4_13
DOI: https://doi.org/10.1007/978-3-642-44973-4_13
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-44972-7
Online ISBN: 978-3-642-44973-4