Dimension Dropout for Evolutionary High-Dimensional Expensive Multiobjective Optimization

  • Conference paper
  • In: Evolutionary Multi-Criterion Optimization (EMO 2021)

Abstract

Over the past decades, a number of surrogate-assisted evolutionary algorithms (SAEAs) have been developed to solve expensive multiobjective optimization problems (EMOPs). However, most existing SAEAs focus on low-dimensional problems, since building an accurate surrogate model for a high-dimensional problem requires a large number of training samples, which is unrealistic for EMOPs. In this paper, an SAEA with Dimension Dropout is proposed to solve high-dimensional EMOPs. At each iteration, the proposed algorithm randomly selects a subset of the decision variables by Dimension Dropout and then optimizes the selected decision variables with the assistance of surrogate models. To balance convergence and diversity, candidate solutions with good diversity are modified by replacing their selected decision variables with the optimized ones (i.e., decision variables from better-converged candidate solutions). Eventually, the new candidate solutions are evaluated using the expensive objective functions to update the archive. Empirical studies on ten benchmark problems with up to 200 decision variables demonstrate the competitiveness of the proposed algorithm.
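The per-iteration workflow described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names, the `keep_ratio` parameter, and the use of plain lists are assumptions, and the surrogate-assisted optimization of the selected dimensions is omitted.

```python
import random

def select_dimensions(n_dims, keep_ratio, rng=random):
    """Dimension Dropout: randomly keep a fraction of the decision
    variables for surrogate-assisted optimization this iteration."""
    k = max(1, round(n_dims * keep_ratio))
    return sorted(rng.sample(range(n_dims), k))

def patch_solution(diverse, converged, selected):
    """Replace the selected decision variables of a diverse candidate
    with those from a better-converged candidate, so the child keeps
    its diversity on the dropped-out dimensions."""
    child = list(diverse)
    for i in selected:
        child[i] = converged[i]
    return child

# Example: keep 3 of 10 dimensions, then transplant the values on
# those dimensions into a diverse candidate.
selected = select_dimensions(10, 0.3)
child = patch_solution([0.0] * 10, [1.0] * 10, selected)
```

In the full algorithm, the transplanted values would come from candidates optimized on the selected dimensions with the surrogate models, and the patched solutions would then be evaluated with the expensive objective functions to update the archive.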



Acknowledgment

This work was supported in part by the National Natural Science Foundation of China under Grant 61903178 and Grant 61906081; in part by the Program for Guangdong Introducing Innovative and Entrepreneurial Teams under Grant 2017ZT07X386; in part by the Shenzhen Peacock Plan under Grant KQTD2016112514355531; and in part by the Program for University Key Laboratory of Guangdong Province under Grant 2017KSYS008.


Copyright information

© 2021 Springer Nature Switzerland AG

About this paper


Cite this paper

Lin, J., He, C., Cheng, R. (2021). Dimension Dropout for Evolutionary High-Dimensional Expensive Multiobjective Optimization. In: Ishibuchi, H., et al. Evolutionary Multi-Criterion Optimization. EMO 2021. Lecture Notes in Computer Science(), vol 12654. Springer, Cham. https://doi.org/10.1007/978-3-030-72062-9_45

  • DOI: https://doi.org/10.1007/978-3-030-72062-9_45

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-72061-2

  • Online ISBN: 978-3-030-72062-9

  • eBook Packages: Computer Science (R0)
