Abstract
We consider the application of machine learning techniques to gain insights into the effect of problem features on algorithm performance, and to automate the task of algorithm selection for distance-based multi- and many-objective optimisation problems. This is the most extensive benchmark study of such problems to date. The problem features can be set directly by the problem generator and include, for example, the number of variables, objectives, local fronts, and disconnected Pareto sets. Using 945 problem configurations (leading to \(28\,350\) instances) of varying complexity, we find (i) that the problem features and the available optimisation budget affect the considered algorithms (NSGA-II, IBEA, MOEA/D, and random search) in different ways, and (ii) that it is possible to recommend a relevant algorithm based on the problem features.
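As a concrete illustration of the algorithm-selection step, the sketch below (not the authors' pipeline) trains a random forest classifier in R, in line with the R tooling cited in the references, to recommend an algorithm from problem features. The data frame `runs`, its column names, and the budget values are hypothetical placeholders filled with random data only so that the sketch runs end to end.

```r
## Minimal sketch (not the authors' pipeline): recommending an algorithm
## from problem features with a random forest classifier.
library(randomForest)

## Placeholder data: one row per problem configuration, feature columns
## plus a label `best_algorithm` (filled with random values purely so
## that the sketch is self-contained and runnable).
set.seed(1)
runs <- data.frame(
  n_variables         = sample(2:10, 200, replace = TRUE),
  n_objectives        = sample(2:10, 200, replace = TRUE),
  n_local_fronts      = sample(0:5, 200, replace = TRUE),
  n_disconnected_sets = sample(0:3, 200, replace = TRUE),
  budget              = sample(c(2000, 10000, 50000), 200, replace = TRUE),
  best_algorithm      = factor(sample(c("NSGA-II", "IBEA", "MOEA/D", "Random search"),
                                      200, replace = TRUE))
)

## Fit the classifier and recommend an algorithm for a new configuration.
model <- randomForest(best_algorithm ~ ., data = runs, ntree = 500)
new_config <- data.frame(n_variables = 2, n_objectives = 5, n_local_fronts = 1,
                         n_disconnected_sets = 0, budget = 10000)
predict(model, newdata = new_config)
```

In practice, the label would be the best-performing algorithm observed for each configuration and budget, and the recommendations would be validated on held-out configurations.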
Notes
1. Available in Matlab (https://github.com/fieldsend/DBMOPP_generator) and in Python (https://github.com/industrial-optimization-group/desdeo-problem/tree/master/desdeo_problem/testproblems/DBMOPP).
2. \(1\,000\) members drawn from the Pareto front, plus all non-dominated points found in the union of the algorithms’ approximation sets for each instance. The reference point for the hypervolume was set to \(1.1 \times\) the maximum of the objective values on the Pareto front, and for problems with four or more objectives the hypervolume was estimated via Monte Carlo sampling [7] with \(50\,000\) samples (a minimal sketch of such an estimate follows this list).
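The sketch below illustrates such a Monte Carlo hypervolume estimate, assuming minimisation and an approximation set stored as a matrix `A` with one row per point; the function name, the toy front, and the sample size shown are illustrative and not the authors' implementation.

```r
## Minimal sketch (not the authors' code): Monte Carlo estimate of the
## hypervolume dominated by an approximation set `A` (one row per point,
## minimisation assumed), relative to a reference point `ref`.
mc_hypervolume <- function(A, ref, n_samples = 50000) {
  d <- ncol(A)
  lower <- apply(A, 2, min)          # lower corner of the sampling box
  ## Draw uniform samples in the box [lower, ref].
  S <- sapply(seq_len(d), function(j) runif(n_samples, lower[j], ref[j]))
  ## A sample is covered if some member of A is <= it in every objective.
  covered <- apply(S, 1, function(s) any(apply(A, 1, function(a) all(a <= s))))
  prod(ref - lower) * mean(covered)  # box volume times covered fraction
}

## Toy usage with the reference point rule from note 2: 1.1 times the
## per-objective maximum over the (here two-objective, three-point) front.
A   <- matrix(c(0.1, 0.9, 0.5, 0.5, 0.9, 0.1), ncol = 2, byrow = TRUE)
ref <- 1.1 * apply(A, 2, max)
mc_hypervolume(A, ref, n_samples = 10000)
```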
References
Breiman, L.: Random forests. Mach. Learn. 45(1), 5–32 (2001). https://doi.org/10.1023/A:1010933404324
Breiman, L., Friedman, J., Stone, C.J., Olshen, R.A.: Classification and Regression Trees. Taylor & Francis, Andover (1984)
Brockhoff, D., Tušar, T., Auger, A., Hansen, N.: Using well-understood single-objective functions in multiobjective black-box optimization test suites. arXiv CoRR (2016). https://doi.org/10.48550/arxiv.1604.00359
Carnell, R.: lhs: Latin Hypercube Samples (2022). R package version 1.1.5
Deb, K., Pratap, A., Agarwal, S., Meyarivan, T.: A fast and elitist multiobjective genetic algorithm: NSGA-II. IEEE Trans. Evol. Comput. 6(2), 182–197 (2002)
Deb, K., Thiele, L., Laumanns, M., Zitzler, E.: Scalable test problems for evolutionary multiobjective optimization. In: Abraham, A., Jain, L., Goldberg, R. (eds.) Evolutionary Multiobjective Optimization: Theoretical Advances and Applications, pp. 105–145. Springer, London (2005). https://doi.org/10.1007/1-84628-137-7_6
Everson, R.M., Fieldsend, J.E., Singh, S.: Full elite sets for multi-objective optimisation. In: Parmee, I.C. (ed.) Adaptive Computing in Design and Manufacture V, pp. 343–354. Springer, London (2002). https://doi.org/10.1007/978-0-85729-345-9_29
Fieldsend, J.E., Alyahya, K.: Visualising the landscape of multi-objective problems using local optima networks. In: Proceedings of the Genetic and Evolutionary Computation Conference Companion, pp. 1421–1429 (2019)
Fieldsend, J.E., Chugh, T., Allmendinger, R., Miettinen, K.: A feature rich distance-based many-objective visualisable test problem generator. In: Proceedings of the Genetic and Evolutionary Computation Conference, pp. 541–549 (2019)
Fieldsend, J.E., Chugh, T., Allmendinger, R., Miettinen, K.: A visualizable test problem generator for many-objective optimization. IEEE Trans. Evol. Comput. 26(1), 1–11 (2022)
Huband, S., Hingston, P., Barone, L., While, L.: A review of multiobjective test problems and a scalable test problem toolkit. IEEE Trans. Evol. Comput. 10(5), 477–506 (2006)
Kerschke, P., Hoos, H.H., Neumann, F., Trautmann, H.: Automated algorithm selection: survey and perspectives. Evol. Comput. 27(1), 3–45 (2019)
Köppen, M., Vicente-Garcia, R., Nickolay, B.: Fuzzy-Pareto-dominance and its application in evolutionary multi-objective optimization. In: Coello Coello, C.A., Hernández Aguirre, A., Zitzler, E. (eds.) EMO 2005. LNCS, vol. 3410, pp. 399–412. Springer, Heidelberg (2005). https://doi.org/10.1007/978-3-540-31880-4_28
Köppen, M., Yoshida, K.: Substitute distance assignments in NSGA-II for handling many-objective optimization problems. In: Obayashi, S., Deb, K., Poloni, C., Hiroyasu, T., Murata, T. (eds.) EMO 2007. LNCS, vol. 4403, pp. 727–741. Springer, Heidelberg (2007). https://doi.org/10.1007/978-3-540-70928-2_55
Kuhn, M.: Building predictive models in R using the caret package. J. Stat. Softw. Art. 28(5), 1–26 (2008)
Liaw, A., Wiener, M.: Classification and regression by randomForest. R News 2(3), 18–22 (2002)
Liefooghe, A., Daolio, F., Verel, S., Derbel, B., Aguirre, H., Tanaka, K.: Landscape-aware performance prediction for evolutionary multiobjective optimization. IEEE Trans. Evol. Comput. 24(6), 1063–1077 (2019)
Liefooghe, A., Derbel, B., Verel, S., López-Ibáñez, M., Aguirre, H., Tanaka, K.: On Pareto local optimal solutions networks. In: Auger, A., Fonseca, C.M., Lourenço, N., Machado, P., Paquete, L., Whitley, D. (eds.) PPSN 2018. LNCS, vol. 11102, pp. 232–244. Springer, Cham (2018). https://doi.org/10.1007/978-3-319-99259-4_19
Liefooghe, A., Verel, S., Lacroix, B., Zăvoianu, A.C., McCall, J.: Landscape features and automated algorithm selection for multi-objective interpolated continuous optimisation problems. In: Proceedings of the Genetic and Evolutionary Computation Conference, pp. 421–429 (2021)
Malan, K.M.: A survey of advances in landscape analysis for optimisation. Algorithms 14(2), 40 (2021)
Mersmann, O., Bischl, B., Trautmann, H., Preuss, M., Weihs, C., Rudolph, G.: Exploratory landscape analysis. In: Proceedings of the Genetic and Evolutionary Computation Conference, pp. 829–836 (2011)
Muñoz, M.A., Villanova, L., Baatar, D., Smith-Miles, K.: Instance spaces for machine learning classification. Mach. Learn. 107(1), 109–147 (2018). https://doi.org/10.1007/s10994-017-5629-5
R Core Team: R: A language and environment for statistical computing. R Foundation for Statistical Computing, Vienna, Austria (2020)
Rice, J.R.: The algorithm selection problem. In: Advances in Computers, vol. 15, pp. 65–118. Elsevier (1976)
Shand, C., Allmendinger, R., Handl, J., Webb, A., Keane, J.: HAWKS: Evolving challenging benchmark sets for cluster analysis. IEEE Trans. Evol. Comput. 26(6), 1206–1220 (2022)
Smith-Miles, K., Lopes, L.: Measuring instance difficulty for combinatorial optimization problems. Comput. Oper. Res. 39(5), 875–889 (2012)
Smith-Miles, K.A.: Cross-disciplinary perspectives on meta-learning for algorithm selection. ACM Comput. Surv. 41(1), 1–25 (2009)
Therneau, T., Atkinson, B.: rpart: Recursive partitioning and regression trees (2022). R package version 4.1.16
Wickham, H.: ggplot2: Elegant Graphics for Data Analysis. Springer, New York (2016)
Zăvoianu, A.-C., Lacroix, B., McCall, J.: Comparative run-time performance of evolutionary algorithms on multi-objective interpolated continuous optimisation problems. In: Bäck, T., et al. (eds.) PPSN 2020. LNCS, vol. 12269, pp. 287–300. Springer, Cham (2020). https://doi.org/10.1007/978-3-030-58112-1_20
Zhang, Q., Li, H.: MOEA/D: a multiobjective evolutionary algorithm based on decomposition. IEEE Trans. Evol. Comput. 11(6), 712–731 (2007)
Zitzler, E., Künzli, S.: Indicator-based selection in multiobjective search. In: Yao, X., et al. (eds.) PPSN 2004. LNCS, vol. 3242, pp. 832–842. Springer, Heidelberg (2004). https://doi.org/10.1007/978-3-540-30217-9_84
Acknowledgements
This research is part of the thematic research area DEMO (Decision Analytics utilising Causal Models and Multiobjective Optimisation, jyu.fi/demo) at the University of Jyväskylä.