
Constraint Exploration of Convolutional Network Architectures with Neuroevolution

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 11507)

Abstract

The effort spent on adapting existing networks to new applications has motivated automated architecture search. Network structures discovered by evolutionary and other search algorithms have surpassed hand-crafted image classifiers in accuracy. However, these approaches leave characteristics such as network size unconstrained, which leads to unnecessary computational effort. This work therefore shows that generational evolutionary algorithms can be used for a constrained exploration of convolutional network architectures, producing a selection of networks for a specific application or target architecture.
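
To make the approach concrete, below is a minimal, hypothetical sketch of a generational evolutionary loop with a hard constraint on network size. It is not the authors' implementation: the width-list genome, the MAX_PARAMS budget, and the stub fitness function are all illustrative assumptions, with the stub standing in for actually training and evaluating each decoded network.

```python
import random

MAX_PARAMS = 500_000            # illustrative size budget (assumption)
POP_SIZE, GENERATIONS = 20, 10
WIDTHS = [16, 32, 64, 128]

def random_genome():
    """A genome here is just a list of 3x3-conv layer widths (illustrative encoding)."""
    return [random.choice(WIDTHS) for _ in range(random.randint(2, 6))]

def param_count(genome):
    """Rough parameter estimate for the encoded conv stack (3 input channels)."""
    total, prev = 0, 3
    for width in genome:
        total += prev * width * 3 * 3 + width  # 3x3 kernel weights + biases
        prev = width
    return total

def fitness(genome):
    """Stub standing in for validation accuracy of the trained network."""
    return sum(genome) / (len(genome) * max(WIDTHS))

def mutate(genome):
    """Resample one layer width; occasionally deepen the network."""
    child = list(genome)
    child[random.randrange(len(child))] = random.choice(WIDTHS)
    if random.random() < 0.3 and len(child) < 8:
        child.append(random.choice(WIDTHS))
    return child

population = [random_genome() for _ in range(POP_SIZE)]
for gen in range(GENERATIONS):
    # Constraint handling: candidates over the size budget are discarded
    # before selection, so no training effort is spent on them.
    feasible = [g for g in population if param_count(g) <= MAX_PARAMS]
    if not feasible:
        population = [random_genome() for _ in range(POP_SIZE)]
        continue
    feasible.sort(key=fitness, reverse=True)
    parents = feasible[:max(1, POP_SIZE // 2)]
    print(f"gen {gen}: best={parents[0]} params={param_count(parents[0])}")
    # Generational replacement: the whole population is rebuilt from offspring.
    population = [mutate(random.choice(parents)) for _ in range(POP_SIZE)]
```

Rejecting over-budget genomes before selection is only one possible constraint-handling choice; a size penalty folded into the fitness function would instead steer the search toward the budget gradually.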



Acknowledgement

The research leading to these results has received funding from the European Union Seventh Framework Programme (FP7) under grant agreement no. 604102 and the EU’s Horizon 2020 research and innovation programme under grant agreement no. 720270 (Human Brain Project, HBP). It has been further supported by the European Fund for Regional Development under Grant IT-1-2-001 and the Cluster of Excellence Cognitive Interaction Technology “CITEC” (EXC 277) at Bielefeld University, which is funded by the German Research Foundation (DFG).

Author information


Correspondence to Jonas Dominik Homburg, Michael Adams, Michael Thies, Timo Korthals, Marc Hesse or Ulrich Rückert.


Copyright information

© 2019 Springer Nature Switzerland AG

About this paper


Cite this paper

Homburg, J.D., Adams, M., Thies, M., Korthals, T., Hesse, M., Rückert, U. (2019). Constraint Exploration of Convolutional Network Architectures with Neuroevolution. In: Rojas, I., Joya, G., Catala, A. (eds.) Advances in Computational Intelligence. IWANN 2019. Lecture Notes in Computer Science, vol. 11507. Springer, Cham. https://doi.org/10.1007/978-3-030-20518-8_61


  • DOI: https://doi.org/10.1007/978-3-030-20518-8_61

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-20517-1

  • Online ISBN: 978-3-030-20518-8

  • eBook Packages: Computer Science (R0)
