
Efficient Neural Network Space with Genetic Search

  • Conference paper

Part of the book series: Communications in Computer and Information Science ((CCIS,volume 1160))

Abstract

We present a novel neural architecture search space and an evolutionary search strategy for exploring it. The goal is to find a set of inverted-bottleneck blocks, each of which takes a low-dimensional input representation and ends in a compressing layer. Primitive operation layers make up these flexible inverted-bottleneck blocks and can be assembled and recombined by the evolutionary operators. Because the bottleneck structure confines the search space, the proposed evolutionary search algorithm can find a competitive neural network even with a small population size. During the search, the model evaluation procedure is designed to avoid local minima, which helps the algorithm discard locally optimal candidates and move toward better models. In image classification experiments on Fashion-MNIST, the search discovered an efficiently optimized neural network that achieves a 6.76% error rate with 356 K parameters.
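
To make the idea of evolutionary search over a confined block space concrete, the following minimal sketch shows how a genetic algorithm could explore configurations of inverted-bottleneck blocks. The genome encoding (expansion ratio, kernel size, output width), the selection, crossover, and mutation operators, and the placeholder fitness function are illustrative assumptions, not the paper's implementation; a real search would decode each genome into a network and score it on validation data.

    # Illustrative sketch only: a tiny genetic search over inverted-bottleneck
    # block configurations. The encoding, operators, and placeholder fitness
    # are assumptions for exposition, not the authors' implementation.
    import random

    # Hypothetical search space: each block is an expansion ratio, a depthwise
    # kernel size, and an output width (the dimension of the compressing layer).
    EXPANSIONS = [1, 3, 6]
    KERNELS = [3, 5]
    WIDTHS = [16, 24, 32, 64]
    NUM_BLOCKS = 4          # blocks per candidate network
    POP_SIZE = 8            # the paper stresses that a small population suffices
    GENERATIONS = 10
    MUTATION_RATE = 0.2

    def random_block():
        return (random.choice(EXPANSIONS), random.choice(KERNELS), random.choice(WIDTHS))

    def random_genome():
        return [random_block() for _ in range(NUM_BLOCKS)]

    def evaluate(genome):
        # Placeholder fitness. A real search would briefly train the decoded
        # network and return validation accuracy; here we simply prefer smaller
        # models so the script runs standalone.
        params = sum(e * k * k * w for e, k, w in genome)
        return 1.0 / (1.0 + params)

    def crossover(a, b):
        # One-point crossover on the block list.
        cut = random.randint(1, NUM_BLOCKS - 1)
        return a[:cut] + b[cut:]

    def mutate(genome):
        # Resample each block with a small probability.
        return [random_block() if random.random() < MUTATION_RATE else blk for blk in genome]

    population = [random_genome() for _ in range(POP_SIZE)]
    for gen in range(GENERATIONS):
        scored = sorted(population, key=evaluate, reverse=True)
        parents = scored[: POP_SIZE // 2]   # truncation selection
        children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                    for _ in range(POP_SIZE - len(parents))]
        population = parents + children

    best = max(population, key=evaluate)
    print("best block configuration:", best)

Because every genome is restricted to valid inverted-bottleneck settings, even a toy loop like this only ever samples well-formed candidates, which is the intuition behind searching a confined space with a small population.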

C. W. Ahn: This work was supported by a GIST Research Institute (GRI) grant funded by GIST in 2019.



Author information

Correspondence to Chang Wook Ahn.


Copyright information

© 2020 Springer Nature Singapore Pte Ltd.

About this paper


Cite this paper

Kang, D., Ahn, C.W. (2020). Efficient Neural Network Space with Genetic Search. In: Pan, L., Liang, J., Qu, B. (eds) Bio-inspired Computing: Theories and Applications. BIC-TA 2019. Communications in Computer and Information Science, vol 1160. Springer, Singapore. https://doi.org/10.1007/978-981-15-3415-7_54


  • DOI: https://doi.org/10.1007/978-981-15-3415-7_54

  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-15-3414-0

  • Online ISBN: 978-981-15-3415-7

  • eBook Packages: Computer Science, Computer Science (R0)
