Meta-feature Extraction for Multi-objective Optimization Problems

  • Conference paper

Part of the book series: Communications in Computer and Information Science (CCIS, volume 1449)

Abstract

Selecting appropriate meta-features to represent optimization problems has been studied previously; however, research on meta-feature extraction for multi-objective problems is lacking. In this paper, a set of meta-features, including a novel meta-feature based on the shape of the Pareto front, together with combinations of meta-features, is proposed for multi-objective optimization problems (MOPs). Twenty-five multi-objective benchmark functions and the K-NN algorithm are adopted to realize algorithm recommendation for MOPs. Experimental results show that the Pareto-front-based meta-features properly represent multi-objective problems and yield better recommendation performance, and that recommendation accuracy improves further when combinations of meta-features are considered.
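The recommendation pipeline summarized above (characterize each MOP by a meta-feature vector, then use K-NN over those vectors to recommend a solver) can be sketched as follows. This is a minimal illustration, not the paper's actual method: the single meta-feature shown (the fraction of nondominated points in a sample of objective vectors, a crude proxy for Pareto front shape) and the algorithm labels in the training set are assumptions made for the example.

```python
import math

def nondominated_ratio(points):
    """Illustrative front-shape meta-feature: the fraction of sampled
    objective vectors that are Pareto-nondominated (minimization)."""
    def dominates(a, b):
        # a dominates b if a is no worse in every objective and
        # strictly better in at least one
        return all(x <= y for x, y in zip(a, b)) and any(
            x < y for x, y in zip(a, b))
    n = sum(1 for p in points
            if not any(dominates(q, p) for q in points if q != p))
    return n / len(points)

def knn_recommend(query, training, k=3):
    """Recommend an algorithm by majority vote among the k training
    problems nearest to `query` in meta-feature space (Euclidean)."""
    nearest = sorted((math.dist(f, query), algo) for f, algo in training)[:k]
    votes = {}
    for _, algo in nearest:
        votes[algo] = votes.get(algo, 0) + 1
    return max(votes, key=votes.get)

# Hypothetical training set: (meta-feature vector, best-performing algorithm)
training = [
    ((0.10, 0.90), "NSGA-II"),
    ((0.20, 0.80), "NSGA-II"),
    ((0.90, 0.20), "MOEA/D"),
    ((0.80, 0.10), "MOEA/D"),
]
print(knn_recommend((0.15, 0.85), training))  # prints NSGA-II
```

In the paper's setting, the two-dimensional vectors would be replaced by the full set of proposed meta-features computed on each of the 25 benchmark functions, with labels given by the empirically best solver on each.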



Acknowledgements

This work was partially supported by the National Natural Science Foundation of China (Grant Nos. 71971142 and 71701079).



Copyright information

© 2021 Springer Nature Singapore Pte Ltd.

About this paper

Cite this paper

Chu, X. et al. (2021). Meta-feature Extraction for Multi-objective Optimization Problems. In: Zhang, H., Yang, Z., Zhang, Z., Wu, Z., Hao, T. (eds) Neural Computing for Advanced Applications. NCAA 2021. Communications in Computer and Information Science, vol 1449. Springer, Singapore. https://doi.org/10.1007/978-981-16-5188-5_31

  • DOI: https://doi.org/10.1007/978-981-16-5188-5_31

  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-16-5187-8

  • Online ISBN: 978-981-16-5188-5

  • eBook Packages: Computer Science (R0)
