Empirical study on meta-feature characterization for multi-objective optimization problems

  • S.I.: NCAA 2021
  • Published in Neural Computing and Applications

Abstract

Algorithm recommendation based on meta-learning has been studied previously, but research on meta-feature extraction, a key factor in the success of recommendation, is lacking for multi-objective optimization problems (MOPs). This paper proposes four sets of meta-features to characterize MOPs. In addition, the meta-learning-based algorithm recommendation model is extended to the field of multi-objective optimization. To evaluate the efficiency and effectiveness of the extracted meta-features, 29 MOP benchmark functions with different dimensions and two real-world MOPs are employed for comprehensive comparison. Experimental results show that the proposed meta-features fully characterize MOPs and are empirically efficient for algorithm recommendation.
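
As a rough illustration of the kind of pipeline the abstract describes, the sketch below shows how a meta-learning recommender for MOPs could be assembled: extract a meta-feature vector from a problem, then map it to a promising algorithm via a meta-learner trained on previously solved problems. The sample-based meta-features and the k-nearest-neighbour meta-learner used here are illustrative assumptions only and do not reproduce the four meta-feature sets or the recommendation model proposed in the paper.

```python
# Minimal, hypothetical sketch of meta-learning-based algorithm recommendation
# for multi-objective optimization problems (MOPs). All meta-features and model
# choices below are illustrative assumptions, not the paper's actual design.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier


def sample_meta_features(objective_fn, n_var, n_samples=200, seed=0):
    """Characterize a MOP by simple statistics of objectives at random points."""
    rng = np.random.default_rng(seed)
    X = rng.random((n_samples, n_var))          # decision vectors in [0, 1]^n_var
    F = np.array([objective_fn(x) for x in X])  # objectives, shape (n_samples, n_obj)
    corr = np.corrcoef(F, rowvar=False)         # inter-objective correlation matrix
    return np.array([F.mean(), F.std(), F.min(), F.max(), corr.mean()])


def recommend_algorithm(meta_X, meta_y, new_features, k=3):
    """Recommend an MOEA for a new MOP via k-NN over the meta-feature space.

    meta_X: meta-feature vectors of previously solved MOPs (one row per problem).
    meta_y: label of the best-performing algorithm on each problem,
            e.g. "NSGA-II", "MOEA/D", "SPEA2".
    """
    model = KNeighborsClassifier(n_neighbors=k).fit(meta_X, meta_y)
    return model.predict(new_features.reshape(1, -1))[0]


# Example: characterize a toy bi-objective problem; a recommendation would then
# require a meta-dataset (meta_X, meta_y) built offline from benchmark runs.
f = lambda x: np.array([np.sum(x**2), np.sum((x - 1.0)**2)])
new_feats = sample_meta_features(f, n_var=10)
# recommend_algorithm(meta_X, meta_y, new_feats)
```

In such a setup, the meta-dataset would be built offline from benchmark experiments, for instance runs over a suite of benchmark MOPs with each problem labelled by its best-performing MOEA.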


Data availability

The MOEAs and MOPs used to support the findings are available from PlatEMO.


Acknowledgements

This work was partially supported by the National Natural Science Foundation of China (Grant No. 71971142), the Natural Science Foundation of Guangdong Province (Nos. 2022A1515010278, 2021A1515110595 and 2016A030310067), the Major Research Plan of the National Natural Science Foundation of China (No. 91846301), and the Major Project of the National Natural Science Foundation of China (No. 71790615).

Author information

Corresponding authors

Correspondence to Shuxiang Li or Yujuan Chai.

Ethics declarations

Conflict of interest

The authors declare that there are no conflicts of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article

Cite this article

Chu, X., Wang, J., Li, S. et al. Empirical study on meta-feature characterization for multi-objective optimization problems. Neural Comput & Applic 34, 16255–16273 (2022). https://doi.org/10.1007/s00521-022-07302-5
