
A surrogate-assisted evolutionary algorithm with clustering-based sampling for high-dimensional expensive blackbox optimization

Published in: Journal of Global Optimization

Abstract

Many practical problems involve the optimization of computationally expensive blackbox functions. The cost of each function evaluation severely limits the number of true objective evaluations that can be spent in search of a good solution. In this paper, we propose a clustering-based surrogate-assisted evolutionary algorithm, in which a clustering-based local search technique is embedded into a radial basis function surrogate-assisted evolutionary algorithm framework to obtain sample points that are likely to lie close to local solutions of the actual optimization problem. The algorithm generates sample points cyclically via the clustering-based local search: in each cycle, differential evolution is applied to the surrogate model, the final population is clustered, and the cluster centers are taken as new sample points; these new sample points are then added to the initial population for the differential evolution iterations of the next cycle. In this way, exploration and exploitation are better balanced during the search process. To verify its effectiveness, the proposed algorithm is compared with four state-of-the-art surrogate-assisted evolutionary algorithms on 24 synthetic test problems and one application problem. Experimental results show that the proposed algorithm outperforms the other algorithms on most of the synthetic test problems and on the application problem.
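The cyclic sampling loop described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's algorithm: the cubic RBF kernel without a polynomial tail, the simple Lloyd-style k-means, the DE/rand/1/bin settings, the random (rather than Latin hypercube) initial design, and all function names (`csaea`, `rbf_fit`, `de_on_surrogate`, `kmeans_centers`) are assumptions made for this sketch.

```python
import numpy as np

def rbf_fit(X, y):
    """Fit an RBF interpolant to sample points X with values y.
    (Cubic kernel with no polynomial tail -- a simplifying assumption;
    lstsq is used so near-singular distance matrices do not fail.)"""
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2) ** 3
    w = np.linalg.lstsq(D, y, rcond=None)[0]
    return lambda Z: (np.linalg.norm(Z[:, None, :] - X[None, :, :], axis=2) ** 3) @ w

def kmeans_centers(P, k, iters=20, rng=None):
    """A few Lloyd iterations; returns k cluster centers of the points P."""
    rng = rng or np.random.default_rng(0)
    C = P[rng.choice(len(P), k, replace=False)]
    for _ in range(iters):
        lab = np.argmin(np.linalg.norm(P[:, None] - C[None], axis=2), axis=1)
        C = np.array([P[lab == j].mean(axis=0) if np.any(lab == j) else C[j]
                      for j in range(k)])
    return C

def de_on_surrogate(f_hat, pop, lb, ub, gens=40, F=0.5, CR=0.9, rng=None):
    """Standard DE/rand/1/bin evolving pop against the cheap surrogate f_hat."""
    rng = rng or np.random.default_rng(0)
    fit = f_hat(pop)
    n, d = pop.shape
    for _ in range(gens):
        for i in range(n):
            a, b, c = pop[rng.choice(n, 3, replace=False)]
            trial = np.where(rng.random(d) < CR,
                             np.clip(a + F * (b - c), lb, ub), pop[i])
            ft = f_hat(trial[None])[0]
            if ft < fit[i]:
                pop[i], fit[i] = trial, ft
    return pop

def csaea(f, lb, ub, n_init=10, k=2, cycles=3, pop_size=20, seed=0):
    rng = np.random.default_rng(seed)
    d = len(lb)
    X = rng.uniform(lb, ub, (n_init, d))   # initial design (the paper uses a Latin hypercube)
    y = np.array([f(x) for x in X])        # expensive true evaluations
    carry = None                           # cluster centers carried into the next cycle
    for _ in range(cycles):
        f_hat = rbf_fit(X, y)              # rebuild the surrogate on all data so far
        pop = rng.uniform(lb, ub, (pop_size, d))
        if carry is not None:
            pop[:len(carry)] = carry       # seed the next cycle's initial population
        pop = de_on_surrogate(f_hat, pop, lb, ub, rng=rng)
        carry = kmeans_centers(pop, k, rng=rng)   # centers of the final population
        yc = np.array([f(c) for c in carry])      # evaluate them on the true function
        X, y = np.vstack([X, carry]), np.concatenate([y, yc])
    i = int(np.argmin(y))
    return X[i], y[i]
```

Only the cluster centers (k per cycle) are evaluated on the expensive function, so the true-evaluation budget is n_init + k * cycles regardless of how many surrogate evaluations the inner DE loop consumes.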


Data availability

The data associated with this paper are available from the corresponding author upon request.


Acknowledgements

We would like to thank the two anonymous referees for their very insightful comments and suggestions that have helped to improve the presentation of this paper greatly.

Author information


Corresponding author

Correspondence to Fusheng Bai.

Additional information


This work was supported by the National Natural Science Foundation of China (11991024), the Key Project of the Chongqing Technological Innovation and Applications Development Special Program (cstc2021jscx-jbgsX0001), the Key Project of the Chongqing Municipality Education Commission Scientific and Technological Research Program (KJZD-K202114801), and the Innovation and Development Joint Project of the Chongqing Natural Science Foundation (2022NSCQ-LZX0301).

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

Reprints and permissions

About this article


Cite this article

Bai, F., Zou, D. & Wei, Y. A surrogate-assisted evolutionary algorithm with clustering-based sampling for high-dimensional expensive blackbox optimization. J Glob Optim 89, 93–115 (2024). https://doi.org/10.1007/s10898-023-01343-3

