Two-layer adaptive surrogate-assisted evolutionary algorithm for high-dimensional computationally expensive problems

Published in: Journal of Global Optimization

Abstract

Surrogate-assisted evolutionary algorithms (SAEAs) have recently shown excellent ability in solving computationally expensive optimization problems. However, as problem dimensionality grows, the effectiveness of SAEAs on high-dimensional problems still needs further improvement. In this paper, a two-layer adaptive surrogate-assisted evolutionary algorithm is proposed, in which three different search strategies are executed adaptively during the iterations according to a proposed feedback indicator that measures how the algorithm is progressing toward the optimum. In the proposed method, a global GP model is used to pre-screen the offspring produced by the DE/current-to-best/1 strategy so as to accelerate convergence, and a DE/current-to-randbest/1 strategy is proposed to guide the global GP model toward promising regions once the feedback indicator reaches a preset threshold. Moreover, a local search strategy (DE/best/1) is used to drive a local GP model, built from the individuals closest to the current best individual, to intensively exploit these promising regions. Furthermore, a dimension reduction technique is applied so that a reasonably accurate GP model can be constructed for high-dimensional expensive problems. Empirical studies on benchmark problems with 50 and 100 variables demonstrate that the proposed algorithm is able to find high-quality solutions for high-dimensional problems under a limited computational budget.
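
The adaptive two-layer workflow described above can be summarized with a minimal Python sketch. This is not the authors' implementation: scikit-learn's GaussianProcessRegressor stands in for the GP (kriging) surrogates, a simple stagnation counter plays the role of the feedback indicator, a cheap sphere function replaces the expensive objective, the dimension reduction step is omitted, and the population size, threshold, and DE parameters are illustrative assumptions rather than the paper's settings.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor


def expensive_objective(x):
    # Cheap sphere function standing in for a computationally expensive simulation.
    return float(np.sum(x ** 2))


rng = np.random.default_rng(0)
dim, pop_size, eval_budget = 50, 40, 400
stall_threshold, n_local = 5, 15          # illustrative settings, not the paper's

lb, ub = -5.0, 5.0
X = rng.uniform(lb, ub, size=(pop_size, dim))              # initial design
y = np.array([expensive_objective(x) for x in X])
evals, stall = pop_size, 0                                 # stall = feedback indicator

while evals < eval_budget:
    prev_best_val = y.min()
    best = X[np.argmin(y)]

    # Layer 1: global GP model trained on all currently archived points.
    global_gp = GaussianProcessRegressor(normalize_y=True).fit(X, y)

    if stall < stall_threshold:
        # DE/current-to-best/1 offspring, pre-screened by the global GP
        # so that only one candidate is sent to the expensive objective.
        r1, r2 = rng.integers(pop_size, size=(2, pop_size))
        trials = X + 0.5 * (best - X) + 0.5 * (X[r1] - X[r2])
    else:
        # Exploration move (stand-in for DE/current-to-randbest/1), triggered
        # when the feedback indicator says the search has stagnated.
        trials = X + 0.5 * (rng.uniform(lb, ub, size=X.shape) - X)
    trials = np.clip(trials, lb, ub)
    global_candidate = trials[np.argmin(global_gp.predict(trials))]

    # Layer 2: local GP built from the points closest to the current best,
    # exploited around the incumbent (stand-in for the DE/best/1 local search).
    idx = np.argsort(np.linalg.norm(X - best, axis=1))[:n_local]
    local_gp = GaussianProcessRegressor(normalize_y=True).fit(X[idx], y[idx])
    local_trials = np.clip(best + rng.normal(0.0, 0.1, size=(20, dim)), lb, ub)
    local_candidate = local_trials[np.argmin(local_gp.predict(local_trials))]

    # Spend expensive evaluations only on the two pre-screened candidates.
    for x_new in (global_candidate, local_candidate):
        f_new = expensive_objective(x_new)
        evals += 1
        worst = np.argmax(y)
        if f_new < y[worst]:
            X[worst], y[worst] = x_new, f_new

    stall = 0 if y.min() < prev_best_val else stall + 1

print(f"best value after {evals} expensive evaluations: {y.min():.4e}")
```

Run as an ordinary script, the sketch prints the best objective value found within the assumed evaluation budget; its only purpose is to show the adaptive switch between the global pre-screening layer and the local exploitation layer, not to reproduce the reported results.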

Acknowledgements

This research is supported by the National Natural Science Foundation of China under Grant Nos. 51675198, 51721092, the National Natural Science Foundation for Distinguished Young Scholars of China under Grant No. 51825502, and the Program for HUST Academic Frontier Youth Team.

Author information

Corresponding author

Correspondence to Haobo Qiu.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Yang, Z., Qiu, H., Gao, L. et al. Two-layer adaptive surrogate-assisted evolutionary algorithm for high-dimensional computationally expensive problems. J Glob Optim 74, 327–359 (2019). https://doi.org/10.1007/s10898-019-00759-0

