An experimental methodology for response surface optimization methods

Journal of Global Optimization

Abstract

Response surface methods, and global optimization techniques in general, are typically evaluated using a small number of standard synthetic test problems, in the hope that these are a good surrogate for real-world problems. We introduce a new, more rigorous methodology for evaluating global optimization techniques that is based on generating thousands of test functions and then evaluating algorithm performance on each one. The test functions are generated by sampling from a Gaussian process, which allows us to create a set of test functions that are interesting and diverse. They will have different numbers of modes, different maxima, etc., and yet they will be similar to each other in overall structure and level of difficulty. This approach allows for a much richer empirical evaluation of methods that is capable of revealing insights that would not be gained using a small set of test functions. To facilitate the development of large empirical studies for evaluating response surface methods, we introduce a dimension-independent measure of average test problem difficulty, and we introduce acquisition criteria that are invariant to vertical shifting and scaling of the objective function. We also use our experimental methodology to conduct a large empirical study of response surface methods. We investigate the influence of three properties—parameter estimation, exploration level, and gradient information—on the performance of response surface methods.
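To make the generation step concrete, the following is a minimal sketch of the idea, assuming a one-dimensional domain discretized on a grid, a squared-exponential kernel, and random search as a stand-in optimizer under evaluation. These choices are illustrative assumptions, not the paper's exact experimental configuration.

```python
# Sketch: draw many random test functions from a Gaussian process prior and
# score an optimizer on each one. Kernel, grid, and hyperparameters are
# illustrative assumptions, not the paper's exact settings.
import numpy as np

def sample_gp_test_function(n_points=200, length_scale=0.1, signal_var=1.0, seed=0):
    """Sample one 1-D test function on a grid from a GP prior with zero mean."""
    rng = np.random.default_rng(seed)
    x = np.linspace(0.0, 1.0, n_points)
    d = x[:, None] - x[None, :]                       # pairwise differences
    K = signal_var * np.exp(-0.5 * (d / length_scale) ** 2)
    K += 1e-8 * np.eye(n_points)                      # jitter for numerical stability
    f = rng.multivariate_normal(np.zeros(n_points), K)
    return x, f

# Evaluate an optimizer on many independent draws; here "performance" is the
# gap between the true maximum and the best value found by random search.
gaps = []
for seed in range(1000):
    x, f = sample_gp_test_function(seed=seed)
    rng = np.random.default_rng(seed + 10_000)
    probes = rng.choice(len(x), size=10, replace=False)  # a 10-evaluation budget
    gaps.append(f.max() - f[probes].max())
print("mean optimality gap over 1000 test functions:", np.mean(gaps))
```

Because every draw comes from the same prior, the resulting functions share an overall structure and difficulty level while differing in their modes and maxima, which is what makes averaging performance across thousands of them meaningful.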

Abbreviations

f: The function to be optimized, \({f:\mathcal{X}\subseteq \mathbb{R}^d \to \mathbb{R}}\)

\({\mathcal{X}}\): The domain of f

x, z, w: Points in the domain of f

\({x_j}\): The jth coordinate of the point x

||x||: The Euclidean norm of x

\({\mathbf{x}}\): A list of points \({x_1, x_2, \ldots, x_n}\)

\({\mathbf{x}_{1:k}}\): A sublist of points \({x_1, x_2, \ldots, x_k}\)

\({f(\mathbf{x})}\), or \({\mathbf{f}}\): A list of function values \({f(x_1), f(x_2), \ldots, f(x_n)}\)

k(x, z): The kernel function between the points x and z

\({k(\mathbf{x}, z)}\): A column vector whose ith element is \({k(x_i, z)}\)

\({k(x, \mathbf{z})}\): A row vector whose ith element is \({k(x, z_i)}\)

\({k(\mathbf{x}, \mathbf{z}) = K}\): A kernel matrix whose (i, j)th element is \({k(x_i, z_j)}\)

μ(x): The prior mean function

\({\sigma^2_f}\): The “signal” or “process” variance of a Gaussian process

\({\sigma^2_n}\): The noise variance of a Gaussian process

\({\ell_i}\): The characteristic length-scale along the ith dimension

L: A diagonal length-scale matrix

X: A random variable

p(X = x): The probability (or density) of the event X = x

E[X]: The expectation of X

Cov(X, Y): The covariance between X and Y: \({\mathrm{Cov}(X, Y) = E[(X - E[X]) \cdot (Y - E[Y])]}\)

(a)+, [a]+: max(a, 0)

I: The identity matrix

\({\mathbb{R}}\): The set of real numbers

Ψ(x): The standard Gaussian right-tail probability \({p(X > x),\; X \sim \mathcal{N}(0,1)}\)

EEC: Expected Euler Characteristic

MEI: Maximum Expected Improvement

MPI: Maximum Probability of Improvement

ILN: Independent Log Normal

BFGS: Broyden-Fletcher-Goldfarb-Shanno (a hill-climbing method)
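The following sketch ties this notation together: an anisotropic squared-exponential kernel with diagonal length-scale matrix L, the GP posterior under a zero prior mean μ(x) = 0, and the probability-of-improvement and expected-improvement quantities behind MPI and MEI, expressed through the right-tail probability Ψ. The kernel form, hyperparameter values, and zero-mean assumption are illustrative; the shift- and scale-invariant acquisition variants from the paper are not reproduced here.

```python
# Sketch of the notation above: SE kernel with diagonal length-scale matrix L,
# GP posterior mean/variance (zero prior mean assumed), and the PI/EI values
# underlying MPI and MEI. Hyperparameters are illustrative assumptions.
import numpy as np
from scipy.stats import norm

def k(x, z, L, sigma_f2=1.0):
    """k(x, z) = sigma_f^2 * exp(-0.5 (x-z)^T L^-2 (x-z)), with L diagonal."""
    r = (x - z) / np.diag(L)            # elementwise scaling by length-scales
    return sigma_f2 * np.exp(-0.5 * np.dot(r, r))

def posterior(x_new, X, f, L, sigma_f2=1.0, sigma_n2=1e-6):
    """GP posterior mean and variance at x_new given observations (X, f)."""
    K = np.array([[k(a, b, L, sigma_f2) for b in X] for a in X])
    K += sigma_n2 * np.eye(len(X))      # noise variance on the diagonal
    kx = np.array([k(a, x_new, L, sigma_f2) for a in X])   # k(x, x_new)
    mu = kx @ np.linalg.solve(K, f)     # prior mean taken to be zero
    var = sigma_f2 - kx @ np.linalg.solve(K, kx)
    return mu, max(var, 1e-12)          # guard against tiny negative variances

Psi = norm.sf                           # standard Gaussian right-tail probability

def pi_and_ei(mu, var, f_best):
    """Probability of improvement and expected improvement over f_best."""
    sigma = np.sqrt(var)
    s = (f_best - mu) / sigma
    pi = Psi(s)                                        # p(f(x) > f_best)
    ei = (mu - f_best) * Psi(s) + sigma * norm.pdf(s)  # E[(f(x) - f_best)+]
    return pi, ei
```

MPI and MEI then select the next evaluation point by maximizing pi or ei over \({\mathcal{X}}\); the invariant criteria introduced in the paper normalize these quantities so that vertically shifting or scaling f leaves the chosen point unchanged.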

Author information

Correspondence to Daniel J. Lizotte.

About this article

Cite this article

Lizotte, D.J., Greiner, R. & Schuurmans, D. An experimental methodology for response surface optimization methods. J Glob Optim 53, 699–736 (2012). https://doi.org/10.1007/s10898-011-9732-z
