Enhancing Multi-objective Evolutionary Neural Architecture Search with Surrogate Models and Potential Point-Guided Local Searches

  • Conference paper
  • In: Advances and Trends in Artificial Intelligence. Artificial Intelligence Practices (IEA/AIE 2021)

Abstract

In this paper, we investigate two methods to enhance the efficiency of multi-objective evolutionary algorithms (MOEAs) when solving Neural Architecture Search (NAS) problems. The first method uses a surrogate model to predict the accuracy of candidate architectures: only promising architectures with high predicted accuracy are then truly trained and evaluated, while those with low predicted accuracy are discarded. The second method performs local search on potential solutions on the non-dominated front after each MOEA generation. To demonstrate the effectiveness of the proposed methods, we conduct experiments on benchmark datasets of both macro-level (MacroNAS) and micro-level (NAS-Bench-101, NAS-Bench-201) NAS problems. Experimental results show that the proposed methods improve the convergence speed of MOEAs toward Pareto-optimal fronts, especially for macro-level NAS problems.
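The two enhancements described in the abstract can be illustrated with a minimal sketch. Everything below is hypothetical: the architecture encoding, the objectives (`true_eval`), and the noisy `surrogate_acc` predictor are toy stand-ins, not the paper's actual search space, surrogate model, or local-search operator. The sketch only shows the shape of the two ideas: (1) pre-screening offspring with a cheap surrogate so that only the most promising candidates are truly evaluated, and (2) refining non-dominated solutions with a single-position local search after a generation.

```python
import random

random.seed(0)

# Toy NAS setting (hypothetical): an architecture is a fixed-length
# vector of operation ids, loosely mimicking a macro-level search space.
OPS = (0, 1, 2)
ARCH_LEN = 6

def true_eval(arch):
    """Stand-in for expensive training: returns (accuracy, complexity)."""
    acc = sum(arch) / (ARCH_LEN * max(OPS))             # toy accuracy, maximise
    cplx = sum(1 for op in arch if op != 0) / ARCH_LEN  # toy cost, minimise
    return acc, cplx

def surrogate_acc(arch):
    """Cheap, noisy accuracy predictor standing in for a trained surrogate."""
    return true_eval(arch)[0] + random.uniform(-0.05, 0.05)

def dominates(a, b):
    """Pareto dominance on (accuracy, complexity): max accuracy, min complexity."""
    return a[0] >= b[0] and a[1] <= b[1] and a != b

def surrogate_filter(candidates, keep):
    """Method 1: truly evaluate only the top-`keep` candidates by predicted accuracy."""
    return sorted(candidates, key=surrogate_acc, reverse=True)[:keep]

def local_search(arch):
    """Method 2: try single-position perturbations; keep any dominating neighbour."""
    best, best_f = arch, true_eval(arch)
    for i in range(ARCH_LEN):
        for op in OPS:
            if op == arch[i]:
                continue
            neigh = arch[:i] + (op,) + arch[i + 1:]
            f = true_eval(neigh)
            if dominates(f, best_f):
                best, best_f = neigh, f
    return best

def nondominated(archs):
    """Extract the non-dominated front of a set of evaluated architectures."""
    evals = {a: true_eval(a) for a in archs}
    return [a for a in archs
            if not any(dominates(evals[b], evals[a]) for b in archs)]

# One "generation": sample offspring, surrogate-filter, evaluate, refine the front.
population = [tuple(random.choice(OPS) for _ in range(ARCH_LEN)) for _ in range(20)]
promising = surrogate_filter(population, keep=5)   # cheap pre-screening
front = nondominated(promising)                    # truly evaluated survivors
refined = [local_search(a) for a in front]         # potential-point local search
```

Note the division of labour: the surrogate spends the true-evaluation budget only on predicted high-accuracy candidates, while the local search spends extra true evaluations on solutions already on the non-dominated front, where improvements directly advance the front.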


References

  1. Bosman, P.A.N., Thierens, D.: The balance between proximity and diversity in multiobjective evolutionary algorithms. IEEE Trans. Evol. Comput. 7(2), 174–188 (2003)

  2. Branke, J., Deb, K., Dierolf, H., Osswald, M.: Finding knees in multi-objective optimization. In: Yao, X., et al. (eds.) PPSN 2004. LNCS, vol. 3242, pp. 722–731. Springer, Heidelberg (2004). https://doi.org/10.1007/978-3-540-30217-9_73

  3. Cai, H., Gan, C., Wang, T., Zhang, Z., Han, S.: Once-for-all: train one network and specialize it for efficient deployment. In: Proceedings of the International Conference on Learning Representations (ICLR) (2020)

  4. Dai, X., et al.: ChamNet: towards efficient network design through platform-aware model adaptation. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR) (2019)

  5. Deb, K.: Multi-objective Optimization Using Evolutionary Algorithms. Wiley-Interscience Series in Systems and Optimization. Wiley, Hoboken (2001)

  6. Deb, K., Agrawal, S., Pratap, A., Meyarivan, T.: A fast and elitist multiobjective genetic algorithm: NSGA-II. IEEE Trans. Evol. Comput. 6(2), 182–197 (2002)

  7. Dong, X., Yang, Y.: NAS-Bench-201: extending the scope of reproducible neural architecture search. In: Proceedings of the International Conference on Learning Representations (ICLR) (2020)

  8. Elsken, T., Metzen, J.H., Hutter, F.: Efficient multi-objective neural architecture search via Lamarckian evolution. In: Proceedings of the International Conference on Learning Representations (ICLR) (2019)

  9. Howard, A.G., et al.: MobileNets: efficient convolutional neural networks for mobile vision applications. arXiv preprint arXiv:1704.04861 (2017)

  10. Kim, Y.H., Reddy, B., Yun, S., Seo, C.: NEMO: neuro-evolution with multiobjective optimization of deep neural network for speed and accuracy. In: ICML 2017 AutoML Workshop (2017)

  11. Liu, C., et al.: Progressive neural architecture search. In: Ferrari, V., Hebert, M., Sminchisescu, C., Weiss, Y. (eds.) ECCV 2018. LNCS, vol. 11205, pp. 19–35. Springer, Cham (2018). https://doi.org/10.1007/978-3-030-01246-5_2

  12. Lu, Z., Deb, K., Goodman, E., Banzhaf, W., Boddeti, V.N.: NSGANetV2: evolutionary multi-objective surrogate-assisted neural architecture search. In: Vedaldi, A., Bischof, H., Brox, T., Frahm, J.-M. (eds.) ECCV 2020. LNCS, vol. 12346, pp. 35–51. Springer, Cham (2020). https://doi.org/10.1007/978-3-030-58452-8_3

  13. Lu, Z., et al.: NSGA-Net: neural architecture search using multi-objective genetic algorithm. In: Proceedings of the Genetic and Evolutionary Computation Conference (GECCO) (2019)

  14. Luong, H.N., Bosman, P.A.N.: Elitist archiving for multi-objective evolutionary algorithms: to adapt or not to adapt. In: Coello, C.A.C., Cutello, V., Deb, K., Forrest, S., Nicosia, G., Pavone, M. (eds.) PPSN 2012. LNCS, vol. 7492, pp. 72–81. Springer, Heidelberg (2012). https://doi.org/10.1007/978-3-642-32964-7_8

  15. Ottelander, T.D., Dushatskiy, A., Virgolin, M., Bosman, P.A.N.: Local search is a remarkably strong baseline for neural architecture search. In: Proceedings of the International Conference on Evolutionary Multi-Criterion Optimization (EMO) (2021)

  16. Ying, C., Klein, A., Christiansen, E., Real, E., Murphy, K., Hutter, F.: NAS-Bench-101: towards reproducible neural architecture search. In: Proceedings of the International Conference on Machine Learning (ICML), pp. 7105–7114 (2019)

Acknowledgements

This research is funded by the University of Information Technology - Vietnam National University Ho Chi Minh City under grant number D1-2021-09.

Author information

Corresponding author

Correspondence to Ngoc Hoang Luong.

Copyright information

© 2021 Springer Nature Switzerland AG

About this paper

Cite this paper

Phan, Q.M., Luong, N.H. (2021). Enhancing Multi-objective Evolutionary Neural Architecture Search with Surrogate Models and Potential Point-Guided Local Searches. In: Fujita, H., Selamat, A., Lin, J.C.-W., Ali, M. (eds) Advances and Trends in Artificial Intelligence. Artificial Intelligence Practices. IEA/AIE 2021. Lecture Notes in Computer Science, vol. 12798. Springer, Cham. https://doi.org/10.1007/978-3-030-79457-6_39

  • DOI: https://doi.org/10.1007/978-3-030-79457-6_39

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-79456-9

  • Online ISBN: 978-3-030-79457-6

  • eBook Packages: Computer Science, Computer Science (R0)
