
Efficient Evolutionary Neural Architecture Search (NAS) by Modular Inheritable Crossover

  • Conference paper

Part of the book series: Communications in Computer and Information Science (CCIS, volume 1159)

Abstract

Convolutional neural networks are prominent in image processing, and a large number of excellent deep neural networks have been proposed in recent years. However, the manual design of a neural network is time-consuming, laborious, and challenging. Thus, many neural architecture search (NAS) methods have been proposed, among which evolutionary NAS methods have achieved encouraging results thanks to the global search capability of evolutionary algorithms. Nevertheless, most evolutionary NAS methods use only mutation operators for offspring generation, so the generated offspring networks can be quite different from their parent networks. To address this deficiency, we propose an efficient evolutionary NAS method using a tailored crossover operator. Unlike existing mutation operators, the proposed crossover operator enables offspring networks to inherit promising modules from their parent networks. Experimental results indicate that the proposed evolutionary NAS method achieves competitive results in comparison with state-of-the-art NAS methods. Moreover, the effectiveness of the proposed modular inheritable crossover operator for offspring generation is validated.
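The abstract describes a crossover operator that lets offspring networks inherit whole modules from their parents, but this page gives no implementation details. The following is a minimal illustrative sketch only, assuming a hypothetical list-of-modules architecture encoding and a simple one-point module exchange; the paper's actual encoding and operator may differ.

```python
import random

def modular_crossover(parent_a, parent_b, rng=random):
    """One-point crossover over module lists: each offspring keeps a
    prefix of modules from one parent and inherits the remaining
    modules from the other parent."""
    cut = rng.randint(1, min(len(parent_a), len(parent_b)) - 1)
    child_a = parent_a[:cut] + parent_b[cut:]
    child_b = parent_b[:cut] + parent_a[cut:]
    return child_a, child_b

# Toy encodings: each string stands for one module (e.g. a conv block).
p1 = ["conv3x3", "conv3x3", "pool", "conv5x5"]
p2 = ["conv1x1", "sep3x3", "pool", "dense"]
c1, c2 = modular_crossover(p1, p2, rng=random.Random(0))
```

Because every offspring module is copied intact from one of the two parents, the children stay structurally close to their parents, in contrast to mutation-only offspring generation.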



Acknowledgment

This work was supported in part by the National Natural Science Foundation of China (No. 61903178 and 61906081), in part by the Program for Guangdong Introducing Innovative and Entrepreneurial Teams grant (No. 2017ZT07X386), and in part by the Shenzhen Peacock Plan grant (No. KQTD2016112514355531).

Author information

Correspondence to Cheng He or Ran Cheng.


Copyright information

© 2020 Springer Nature Singapore Pte Ltd.

About this paper


Cite this paper

Tan, H., He, C., Tang, D., Cheng, R. (2020). Efficient Evolutionary Neural Architecture Search (NAS) by Modular Inheritable Crossover. In: Pan, L., Liang, J., Qu, B. (eds) Bio-inspired Computing: Theories and Applications. BIC-TA 2019. Communications in Computer and Information Science, vol 1159. Springer, Singapore. https://doi.org/10.1007/978-981-15-3425-6_61

  • DOI: https://doi.org/10.1007/978-981-15-3425-6_61

  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-15-3424-9

  • Online ISBN: 978-981-15-3425-6

  • eBook Packages: Computer Science (R0)
