Surrogate-guided differential evolution algorithm for high dimensional expensive problems

https://doi.org/10.1016/j.swevo.2019.04.009

Abstract

Engineering optimization problems usually involve computationally expensive simulations and massive numbers of design variables, and solving them efficiently remains a major challenge. Recently, surrogate-assisted metaheuristic algorithms have been widely studied and are considered promising for such engineering optimization problems. In this paper, a surrogate-guided differential evolution algorithm is proposed to further improve the optimization efficiency for these problems. Unlike other surrogate-assisted metaheuristic algorithms, it fuses the differential evolution algorithm with surrogates, which are therefore not just taken as an additional tool to accelerate the convergence of the metaheuristic. Specifically, the proposed algorithm combines the optima predicted by the global and local surrogates with the mutation operator and uses them to guide the mutation direction of the differential evolution algorithm, which makes the differential evolution algorithm converge quickly. A simple surrogate prescreening strategy is also proposed to further improve the optimization efficiency. To validate the proposed algorithm, it is tested on a set of high dimensional numerical benchmark problems whose dimensions vary from 20 to 200 and is applied to the optimal design of a stepped cantilever beam and of the bearings in an all-direction propeller. An overall comparison between the proposed algorithm and other optimization algorithms has been made. The results show that the proposed algorithm is promising for optimizing high dimensional expensive problems, especially those whose dimensions exceed 30.

Introduction

Metaheuristic algorithms such as the genetic algorithm (GA) [1], particle swarm optimization (PSO) [2], and differential evolution (DE) [3] are popular and widely applied in engineering optimization. Previous studies [[4], [5], [6], [7], [8], [9], [10]] show that these algorithms can handle high dimensional optimization problems well. In the optimization process, however, the number of function calls is usually very high and can reach hundreds of thousands. This is acceptable for cheap problems, where the time for a single function call is negligible. However, if these metaheuristic algorithms are applied to engineering optimization problems that involve computationally expensive simulations, the computational cost becomes tremendous and even prohibitive. According to Simpson et al. [11], Ford Motor Company spends about 36–160 h running one car crash simulation, which corresponds to a single function call in a numerical optimization problem. One promising approach to reduce the computation time of highly time-consuming problems is to employ computationally cheap approximation models (surrogates) to partly replace the computationally expensive exact function evaluations. However, because of the “curse of dimensionality”, common surrogates such as kriging [12], the radial basis function (RBF) [13], support vector regression (SVR) [14] and the polynomial response surface (PRS) [15] usually cannot achieve high accuracy for problems whose dimensions exceed ten, according to previous studies [[16], [17], [18]]. The optimization methods [[19], [20], [21]] based on them are thus not suitable for handling high dimensional problems. Besides, the long metamodeling time of kriging is another reason that kriging-based optimization methods are unsuitable for high dimensional problems [22,23].

Recently, surrogate-assisted metaheuristic algorithms [24,25] have received increasing attention for addressing such high dimensional expensive optimization problems whose dimensions are usually more than 20. These algorithms usually take the metaheuristic algorithms as the main optimizing framework and take the surrogates as an additional tool to accelerate the convergence of the basic metaheuristic algorithms. They usually do not have a high accuracy requirement for surrogates. According to the ways of accelerating convergence, the surrogate-assisted metaheuristic algorithms can be mainly classified into three types: metaheuristic algorithms assisted by surrogate prescreening, metaheuristic algorithms assisted by the optimum of the global surrogate and metaheuristic algorithms assisted by surrogate-based local search.

Metaheuristic algorithms assisted by surrogate prescreening are widely used for optimizing expensive high dimensional problems. These algorithms are mainly based on the kriging surrogate (also known as the Gaussian process model). They usually utilize kriging-based sample infilling criteria such as expected improvement (EI) [26,27], probability of improvement (PoI) [28], and the lower confidence bound (LCB) [[29], [30], [31], [32]] to prescreen promising candidate offspring points produced by the basic metaheuristic algorithms. Among these algorithms, the recently proposed LCB-assisted DE algorithm (known as GPEME) [29] shows high optimization efficiency for medium-scale expensive problems. Besides, some algorithms are based on the predicted responses of surrogates. They usually use the surrogate's response function to sort the candidate offspring points produced by the basic metaheuristic algorithms and then select the promising points as offspring. For example, Fonseca et al. [33] used the surrogate as fitness inheritance to assist a genetic algorithm (GA) in solving optimization problems with a limited computational budget. Regis [34] utilized an RBF surrogate to identify the most promising trial position for each particle in the swarm. Mallipeddi and Lee [35] used surrogates to generate competitive offspring points among the trial offspring points. Gong et al. [36] used a cheap density function model to select the most promising candidate offspring point among a set of candidates produced by multiple offspring reproduction operators. Vincenzi and Gambarelli [37] used a scoring function, which considers both the surrogate's predicted responses and the distance of the candidate points, to select the potential offspring points of the DE algorithm.
Moreover, some approaches compare a candidate point's predicted response provided by surrogates with its parent point's exact response and then decide whether the candidate point should be evaluated by the exact response function. For example, Praveen and Duvigneau [38] proposed an RBF-assisted PSO algorithm for aerodynamic shape design. It uses the surrogate to screen the more promising particles among the candidate offspring swarm particles in each optimization iteration. A similar algorithm was also proposed by Sun et al. [39]; the difference is that they use an accurate two-layer surrogate construction scheme to improve the optimization efficiency. Jin et al. [40] used a local ensemble surrogate to generate competitive trial vectors in the DE optimization process. Elsayed et al. [41] used the kriging surrogate to suitably select the parameters of the DE algorithm in order to accelerate its convergence. Furthermore, Sun et al. [42] proposed a surrogate-assisted cooperative swarm optimization algorithm (SA-COSO). It uses the global RBF surrogate to screen the best candidate point of the social learning particle swarm optimization. The SA-COSO algorithm shows high optimization efficiency for high dimensional problems whose dimensions can reach 200. It can be seen from the metaheuristic algorithms assisted by surrogate prescreening that their optimization efficiency is closely related to that of the basic metaheuristic algorithms. This is because the surrogate-prescreening strategy does not change the search mechanism of the basic metaheuristic algorithms but improves the quality of the offspring population points. Surrogates are thus taken only as an additional tool to accelerate the convergence of metaheuristic algorithms.
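The prescreening idea shared by these approaches can be reduced to a small sketch (a minimal illustration only, not any specific published algorithm; the function name `prescreen` and the quadratic stand-in surrogate are hypothetical): several candidate offspring are generated, ranked by a cheap surrogate, and only the most promising one is sent to the expensive exact evaluation.

```python
import numpy as np

def prescreen(candidates, surrogate):
    """Rank candidate offspring with a cheap surrogate and return the
    most promising one (minimization assumed)."""
    preds = np.array([surrogate(c) for c in candidates])
    return candidates[int(np.argmin(preds))]

# Toy illustration: the "surrogate" here is just a cheap quadratic stand-in.
rng = np.random.default_rng(0)
cands = rng.uniform(-5, 5, size=(10, 3))      # 10 candidate points in 3-D
cheap_model = lambda x: float(np.sum(x**2))   # placeholder surrogate
best = prescreen(cands, cheap_model)
# Only `best` would then be evaluated by the expensive exact function.
```

In a real surrogate-assisted loop the surrogate would be an RBF or kriging model that is refit as new exact evaluations accumulate.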

The metaheuristic algorithms assisted by the optimum of the global surrogate usually use the predicted optimum provided by the global surrogate to replace the current best population point if the predicted optimum is better. For example, in the research of Ong et al. [43], the global RBF surrogate can help to accelerate the convergence of the GA to a certain degree if it is accurately constructed. Parno et al. [44] used a kriging surrogate to improve the efficiency of PSO. Tang et al. [45] used a hybrid global surrogate model consisting of a quadratic polynomial and an RBF model to develop a surrogate-based PSO algorithm. Safari et al. [46] used the optimum of a high dimensional model representation to guide PSO to search quickly. Tsoukalas et al. [47] used the optimum of an acquisition function, which accounts for the surrogate's response as well as the spread of all population points, to accelerate the convergence of the evolutionary annealing-simplex algorithm. Because it can be very difficult to find the accurate optimum of a surrogate in a high dimensional space, some researchers, including Regis [34] and Yu et al. [48], built a local surrogate around the current best point and used the predicted optimum of the surrogate found in that local region to accelerate the convergence of PSO. The metaheuristic algorithms assisted by surrogate-based local search usually perform the surrogate-based local search first and then use the search mechanism of the basic metaheuristic algorithms for global optimization. For example, Ong et al. [43] employed a trust-region method for an interleaved use of exact models for the objective and constraint functions with computationally cheap RBF surrogates during the local search; the GA operators were then run for global optimization. Local search based on the trust-region method and surrogates is very common in surrogate-assisted metaheuristic algorithms.
Examples can be found elsewhere [[49], [50], [51], [52], [53], [54], [55]]. The recently proposed GA assisted by surrogate-based trust-region local search (GS-SOMA) can be seen in previous research [54]. It uses an ensemble of multiple surrogates to obtain a robust and accurate approximation in the optimization process. The research also shows that the GA assisted by surrogate-based trust-region local search can be extended to solve multi-objective optimization problems. The optimization efficiency of metaheuristic algorithms assisted by the optima of surrogates is closely related to the accuracy of the built surrogates: an accurate surrogate tends to provide useful optimum information and thus directs the metaheuristic algorithms to search efficiently. It should be noted that only a brief summary of surrogate-assisted metaheuristic algorithms is given here; more comprehensive overviews of surrogate-assisted algorithms can be found elsewhere [24,[56], [57], [58]].

Though current surrogate-assisted metaheuristic algorithms can handle high dimensional expensive problems well, most of them still need a large number of function evaluations, often in the thousands, to obtain a good optimization result. Besides, they are usually developed for problems whose dimensions are lower than 30. For example, the GS-SOMA algorithm proposed by Lim et al. [54] needs 8000 exact function evaluations for 30-dimensional problems, and the surrogate-assisted differential evolution algorithm (ESMDE) proposed by Mallipeddi and Lee [35] needs more than 10,000 function evaluations for 30-dimensional problems. How to optimize high dimensional problems, especially those whose dimensions exceed 50, in an efficient way is therefore the main motivation of this study. To further improve the optimization efficiency for high dimensional expensive problems, an efficient surrogate-guided differential evolution algorithm is proposed. Unlike other surrogate-assisted metaheuristic algorithms, it fuses surrogates with the differential evolution algorithm; surrogates are not just taken as an additional tool to accelerate the convergence of the basic metaheuristic algorithm. Specifically, the optima predicted by the global and local surrogates are combined with the mutation operator, and they guide the mutation direction of the differential evolution algorithm. Besides, a simple surrogate prescreening strategy is employed to further improve the optimization efficiency. To validate the performance of the proposed algorithm, it is tested on problems whose dimensions vary from 20 to 200. Results show the proposed algorithm is promising for optimizing high dimensional expensive problems, especially those with dimensions of more than 30.

The remainder of this work is organized as follows. In Section 2, the background theories involved in the proposed algorithm are presented. In Section 3, the proposed surrogate-guided DE algorithm is presented. The experimental results and discussions are presented in Section 4 and Section 5 concludes the paper.

Section snippets

Radial basis function

In this study, the RBF surrogate [20] is used in the optimization process of the proposed algorithm. Some studies [16,59,60] show that RBF can usually obtain a more accurate approximation for high dimensional problems than other common surrogates, including PRS, kriging and SVR. Another merit of RBF is its fast metamodeling speed compared with the kriging (Gaussian process) model. The RBF surrogate is defined as follows:

Given n distinct points x₁, x₂, …, xₙ ∈ ℝᴰ and the function values y₁, y₂, …, yₙ at these points, the RBF surrogate takes the form f̂(x) = Σᵢ₌₁ⁿ λᵢ φ(‖x − xᵢ‖), where φ(·) is the basis function and the weights λᵢ are determined by the interpolation conditions f̂(xᵢ) = yᵢ.
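A minimal interpolating RBF surrogate along these lines can be sketched as follows (a Gaussian basis is used purely for illustration; the basis and any polynomial tail used in the paper may differ, and `rbf_fit` is a hypothetical helper name):

```python
import numpy as np

def rbf_fit(X, y, basis=lambda r: np.exp(-r**2)):
    """Fit an interpolating RBF surrogate: f_hat(x) = sum_i lam_i * phi(||x - x_i||).
    A Gaussian basis keeps the interpolation matrix positive definite."""
    r = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)   # pairwise distances
    lam = np.linalg.solve(basis(r), y)                           # weights lambda_i
    return lambda x: float(basis(np.linalg.norm(X - x, axis=1)) @ lam)

# Toy check: an RBF surrogate interpolates its training data exactly.
rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(30, 5))   # 30 sample points in 5-D
y = np.sum(X**2, axis=1)               # stand-in for an expensive function
f_hat = rbf_fit(X, y)                  # cheap surrogate of the expensive function
```

Fitting only requires solving one n×n linear system, which is why RBF metamodeling is fast relative to kriging's hyperparameter tuning.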

Surrogate-guided differential evolution optimization

In this section, the proposed surrogate-guided differential evolution optimization algorithm for high dimensional expensive problems is presented. The framework is depicted in Fig. 1 and is based on the JADE algorithm without the optional external archive. The proposed algorithm is named S-JADE. The main differences between S-JADE and JADE are that S-JADE uses surrogate-guided mutation and surrogate-guided selection mechanisms in the optimization process.
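As a schematic illustration of what surrogate-guided mutation can look like (a hypothetical "DE/current-to-best"-style stand-in, not the exact S-JADE operator; `x_sur_opt` denotes an optimum predicted by a global or local surrogate, and the function name is invented for this sketch):

```python
import numpy as np

def surrogate_guided_mutant(pop, i, x_sur_opt, F=0.5, rng=None):
    """Build one mutant vector in which the surrogate's predicted optimum
    x_sur_opt plays the role of the best point, so the mutation is pulled
    toward the surrogate's optimum rather than the current population best."""
    rng = rng or np.random.default_rng()
    # two distinct random population indices different from i
    r1, r2 = rng.choice([j for j in range(len(pop)) if j != i], 2, replace=False)
    return pop[i] + F * (x_sur_opt - pop[i]) + F * (pop[r1] - pop[r2])
```

The pull toward `x_sur_opt` replaces the pull toward the best population point in a JADE-style mutation, which is one way the predicted optima can steer the mutation direction as described above.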

Experimental study and discussion

In order to evaluate the performance of the proposed algorithm, several widely used unimodal and multimodal benchmark problems are adopted. The dimension (D) of these problems varies from 20 to 200, and their characteristics are listed in Table 1. They are taken from “CEC 2005” [66], part A of “CEC 2014” [67], part B of “CEC 2014” [68] and the research papers of Lim et al. [54], Liu et al. [29], Liu et al. [30], and Sun et al. [42].

Conclusions

This paper introduces a new efficient surrogate-guided differential evolution algorithm for the optimization of computationally expensive high dimensional problems. The proposed algorithm is based on the basic JADE algorithm. Compared with JADE, it additionally exploits the optimum information provided by both the global surrogate built in the whole design space and the local surrogates built in the neighboring region around each current population point. In addition, a simple surrogate-guided selection mechanism is employed.

Acknowledgment

Financial support from the National Natural Science Foundation of China under Grants No. 51805180, No. 51435009 and No. 51721092, the Natural Science Foundation of Hubei Province under Grant No. 2018CFA078, and the Program for HUST Academic Frontier Youth Team is gratefully acknowledged.

References (76)

  • H. Yu et al., Surrogate-assisted hierarchical particle swarm optimization, Inf. Sci. (2018).
  • S. Das et al., Recent advances in differential evolution – an updated survey, Swarm Evol. Comput. (2016).
  • Y. Wang et al., Differential evolution based on covariance matrix learning and bimodal distribution parameter setting, Appl. Soft Comput. (2014).
  • G. Wu et al., Differential evolution with multi-population based ensemble of mutation strategies, Inf. Sci. (2016).
  • R.V. Rao et al., Teaching–learning-based optimization: a novel method for constrained mechanical design optimization problems, Comput. Aided Des. (2011).
  • S. Mirjalili, Moth-flame optimization algorithm: a novel nature-inspired heuristic paradigm, Knowl. Based Syst. (2015).
  • Y.P. Xiong et al., A general linear mathematical model of power flow analysis and control for integrated structure–control systems, J. Sound Vib. (2003).
  • M. Mitchell, An Introduction to Genetic Algorithms (1998).
  • R. Eberhart et al., A new optimizer using particle swarm theory.
  • R. Storn et al., Differential evolution – a simple and efficient metaheuristic for global optimization over continuous spaces, J. Glob. Optim. (1997).
  • Y.W. Leung et al., An orthogonal genetic algorithm with quantization for global numerical optimization, IEEE Trans. Evol. Comput. (2001).
  • J.T. Tsai et al., Hybrid Taguchi-genetic algorithm for global numerical optimization, IEEE Trans. Evol. Comput. (2004).
  • J.J. Jamian et al., Global particle swarm optimization for high dimension numerical functions analysis, J. Appl. Math. (2014).
  • A.K. Qin et al., Differential evolution algorithm with strategy adaptation for global numerical optimization, IEEE Trans. Evol. Comput. (2009).
  • W. Gong et al., Adaptive ranking mutation operator based differential evolution for constrained optimization, IEEE Trans. Cybern. (2015).
  • T.W. Simpson et al., Approximation methods in multidisciplinary analysis and optimization: a panel discussion, Struct. Multidiscip. Optim. (2004).
  • X. Cai et al., Metamodeling for high dimensional design problems by multi-fidelity simulations, Struct. Multidiscip. Optim. (2017).
  • A.J. Smola et al., A tutorial on support vector regression, Stat. Comput. (2004).
  • G.G. Wang, Adaptive response surface method using inherited Latin hypercube design points, J. Mech. Des. (2003).
  • R. Jin et al., Comparative studies of metamodeling techniques under multiple modeling criteria, Struct. Multidiscip. Optim. (2001).
  • S. Shan et al., Development of adaptive RBF-HDMR model for approximating high dimensional problems.
  • S. Shan et al., Metamodeling for high dimensional simulation-based design problems, J. Mech. Des. (2010).
  • D.R. Jones et al., Efficient global optimization of expensive black-box functions, J. Glob. Optim. (1998).
  • H.M. Gutmann, A radial basis function method for global optimization, J. Glob. Optim. (2001).
  • L. Wang et al., Mode-pursuing sampling method for global optimization on expensive black-box functions, Eng. Optim. (2004).
  • G.H. Cheng et al., Trust region based mode pursuing sampling method for global optimization of high dimensional design problems, J. Mech. Des. (2015).
  • H. Wang et al., Committee-based active learning for surrogate-assisted particle swarm optimization of expensive problems, IEEE Trans. Cybern. (2017).
  • M.T. Emmerich et al., Single- and multiobjective evolutionary optimization assisted by Gaussian random field metamodels, IEEE Trans. Evol. Comput. (2006).