The Utilities of Evolutionary Multiobjective Optimization for Neural Architecture Search – An Empirical Perspective

  • Conference paper
Bio-Inspired Computing: Theories and Applications (BIC-TA 2022)

Abstract

Evolutionary algorithms have been widely used in neural architecture search (NAS) in recent years, owing to their flexible frameworks and promising performance. However, we observed that prior work pays little attention to algorithm selection and tends to favor single-objective algorithms despite the inherently multiobjective nature of NAS. To explore the reasons behind this preference, we evaluated mainstream evolutionary algorithms on several standard NAS benchmarks, comparing single-objective and multiobjective algorithms. We also examined whether the latest evolutionary multiobjective optimization (EMO) algorithms improve on classical EMO algorithms for NAS problems. Our experimental results provide empirical answers to these questions and offer guidance for the future development of evolutionary NAS algorithms.
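To make the experimental setup the abstract describes concrete, the sketch below shows how NAS can be cast as a bi-objective minimization (prediction error versus model cost) and handed to NSGA-II, with hypervolume used to score the resulting front. It uses the open-source pymoo library; MockNASProblem, its random lookup tables, and all parameter values are hypothetical stand-ins rather than the authors' code, and a real study would instead query a tabular benchmark such as NAS-Bench-101 or NAS-Bench-201.

```python
# Illustrative sketch only: a mock bi-objective NAS problem solved with
# NSGA-II via pymoo. All tables and parameters are hypothetical.
import numpy as np
from pymoo.core.problem import ElementwiseProblem
from pymoo.algorithms.moo.nsga2 import NSGA2
from pymoo.optimize import minimize
from pymoo.indicators.hv import HV

rng = np.random.default_rng(0)


class MockNASProblem(ElementwiseProblem):
    """Minimize (error, cost) over a discrete cell-based search space.

    A real experiment would look both objectives up in a tabular
    benchmark (e.g., NAS-Bench-101/201); random tables stand in here.
    """

    def __init__(self, n_edges=6, n_ops=5):
        self.err = rng.random((n_edges, n_ops))   # hypothetical error table
        self.cost = rng.random((n_edges, n_ops))  # hypothetical cost table
        super().__init__(n_var=n_edges, n_obj=2, xl=0.0, xu=n_ops - 1.0)

    def _evaluate(self, x, out, *args, **kwargs):
        ops = np.round(x).astype(int)  # decode real values to operator choices
        idx = np.arange(len(ops))
        out["F"] = [self.err[idx, ops].mean(), self.cost[idx, ops].mean()]


res = minimize(MockNASProblem(), NSGA2(pop_size=40), ("n_gen", 50),
               seed=1, verbose=False)

# Score the final nondominated front by hypervolume w.r.t. a fixed
# reference point, as is standard when comparing EMO algorithms.
hv = HV(ref_point=np.array([1.1, 1.1]))
print(f"front size: {len(res.F)}, hypervolume: {hv(res.F):.4f}")
```

Because the benchmark query and the search algorithm are decoupled, swapping NSGA2 for another pymoo algorithm is a one-line change, which is what makes this style of algorithm comparison cheap to run at scale.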

Author information

Correspondence to Xukun Liu.

Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.

About this paper

Cite this paper

Liu, X. (2023). The Utilities of Evolutionary Multiobjective Optimization for Neural Architecture Search – An Empirical Perspective. In: Pan, L., Zhao, D., Li, L., Lin, J. (eds) Bio-Inspired Computing: Theories and Applications. BIC-TA 2022. Communications in Computer and Information Science, vol 1801. Springer, Singapore. https://doi.org/10.1007/978-981-99-1549-1_15

  • DOI: https://doi.org/10.1007/978-981-99-1549-1_15

  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-99-1548-4

  • Online ISBN: 978-981-99-1549-1

  • eBook Packages: Computer Science, Computer Science (R0)
