A Systematic Approach to Analyze the Computational Cost of Robustness in Model-Assisted Robust Optimization

  • Conference paper

Part of the book series: Lecture Notes in Computer Science ((LNCS,volume 13398))

Abstract

Real-world optimization scenarios under uncertainty and noise are typically handled with robust optimization techniques, which reformulate the original optimization problem into a robust counterpart, e.g., by taking an average of the function values over different perturbations to a specific input. Solving the robust counterpart instead of the original problem can significantly increase the associated computational cost, which, to the best of our knowledge, is often overlooked in the literature. Such an extra cost brought by robust optimization might depend on the problem landscape, the dimensionality, the severity of the uncertainty, and the formulation of the robust counterpart. This paper targets an empirical approach that evaluates and compares the computational cost brought by different robustness formulations in Kriging-based optimization on a wide combination (300 test cases) of problems, uncertainty levels, and dimensions. We mainly focus on the CPU time taken to find the robust solutions, and choose five commonly-applied robustness formulations: “mini-max robustness”, “mini-max regret robustness”, “expectation-based robustness”, “dispersion-based robustness”, and “composite robustness”. We assess the empirical performance of these robustness formulations in terms of a fixed budget and a fixed target analysis, from which we find that “mini-max robustness” is the most practical formulation w.r.t. the associated computational cost.
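The robustness formulations named in the abstract can be illustrated with a minimal sketch. This is not the paper's implementation (the precise definitions live in Sect. 2 of the full text): it assumes a Monte Carlo approximation over sampled input perturbations, and omits "mini-max regret robustness", since regret additionally requires the per-perturbation optimum. The function `robust_counterparts` and all names in it are hypothetical illustrations.

```python
import numpy as np

def robust_counterparts(f, x, deltas):
    """Illustrative evaluation of four common robustness formulations for a
    design point x, approximated over sampled input perturbations `deltas`.
    (Sketch only; the paper's exact definitions may differ in detail.)"""
    vals = np.array([f(x + d) for d in deltas])
    return {
        "mini-max": vals.max(),                 # worst case over perturbations
        "expectation": vals.mean(),             # average effective response
        "dispersion": vals.std(),               # variability of the response
        "composite": vals.mean() + vals.std(),  # mean plus dispersion penalty
    }

# Example: a 1-D quadratic under uniform input perturbations of magnitude 0.1.
rng = np.random.default_rng(0)
deltas = rng.uniform(-0.1, 0.1, size=(100, 1))
f = lambda x: float(np.sum(x ** 2))
print(robust_counterparts(f, np.array([1.0]), deltas))
```

Under a minimization perspective, each formulation defines a different scalar objective over the same perturbation sample, which is one source of the differing computational cost the paper measures: "mini-max" needs only a maximum over the sample, while composite criteria combine several statistics.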

This research has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement number 766186.


Notes

  1.

    An underlying assumption in this study is the non-existence of hard constraints on the choice of RF. In some practical scenarios of RO, this assumption is not valid. Note, however, that there are plenty of situations where the assumption is valid, and the lack of information on the computational aspects of the RFs makes it harder for practitioners to choose a suitable robustness criterion.

  2.

    The robust or effective function response has already been defined in Sect. 2 for five of the most common RFs.

  3.

    The source code to reproduce the experimental setup and results is available at: https://github.com/SibghatUllah13/UllahPPSN2022.


Author information

Corresponding author

Correspondence to Sibghat Ullah.


Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Ullah, S., Wang, H., Menzel, S., Sendhoff, B., Bäck, T. (2022). A Systematic Approach to Analyze the Computational Cost of Robustness in Model-Assisted Robust Optimization. In: Rudolph, G., Kononova, A.V., Aguirre, H., Kerschke, P., Ochoa, G., Tušar, T. (eds) Parallel Problem Solving from Nature – PPSN XVII. PPSN 2022. Lecture Notes in Computer Science, vol 13398. Springer, Cham. https://doi.org/10.1007/978-3-031-14714-2_5

  • DOI: https://doi.org/10.1007/978-3-031-14714-2_5

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-14713-5

  • Online ISBN: 978-3-031-14714-2

  • eBook Packages: Computer Science (R0)
