
Co-evolution of Novel Tree-Like ANNs and Activation Functions: An Observational Study

  • Conference paper
  • Published in: AI 2018: Advances in Artificial Intelligence (AI 2018)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 11320)

Abstract

Deep convolutional neural networks (CNNs) represent the state-of-the-art model structure for image classification problems. However, deep CNNs suffer from poor interpretability and are difficult to train. This work presents new tree-like shallow ANNs and offers a novel approach to exploring and examining the relationship between activation functions and network performance. The proposed approach is evaluated on the MNIST and CIFAR10 datasets, yielding surprising results about the necessity and benefit of activation functions in this new type of shallow network. In particular, the work finds high-accuracy networks for the MNIST dataset which utilise pooling operations as the only non-linearity, and demonstrates a certain invariance to the specific form of the activation function on the more complicated CIFAR10 dataset.
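To make the MNIST finding concrete, below is a minimal PyTorch sketch of a network in which max-pooling is the only non-linearity. This is an illustrative assumption, not the authors' evolved tree-like architecture: the sequential layout, layer widths, and kernel sizes are invented for the example.

```python
# A minimal sketch (not the authors' code or topology): a shallow
# convolutional network whose ONLY non-linearity is max-pooling, the
# property the abstract reports for its high-accuracy MNIST networks.
# The layer widths and kernel sizes below are illustrative assumptions.
import torch
import torch.nn as nn

class PoolOnlyNet(nn.Module):
    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=3, padding=1),   # affine: no activation function
            nn.MaxPool2d(2),                               # non-linearity from pooling only
            nn.Conv2d(32, 64, kernel_size=3, padding=1),   # affine again
            nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(64 * 7 * 7, num_classes)  # affine read-out

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)                # note: no ReLU/sigmoid anywhere
        return self.classifier(x.flatten(1))

# Smoke test on an MNIST-shaped batch (1 x 28 x 28 inputs)
net = PoolOnlyNet()
print(net(torch.randn(8, 1, 28, 28)).shape)  # torch.Size([8, 10])
```

Because every learned layer here is affine, removing the two MaxPool2d stages would collapse the whole network to a single linear map; pooling alone supplies the non-linearity, which is what makes the reported MNIST result notable.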



Author information

Corresponding author

Correspondence to Bing Xue.



Copyright information

© 2018 Springer Nature Switzerland AG

About this paper


Cite this paper

O’Neill, D., Xue, B., Zhang, M. (2018). Co-evolution of Novel Tree-Like ANNs and Activation Functions: An Observational Study. In: Mitrovic, T., Xue, B., Li, X. (eds) AI 2018: Advances in Artificial Intelligence. AI 2018. Lecture Notes in Computer Science, vol 11320. Springer, Cham. https://doi.org/10.1007/978-3-030-03991-2_56


  • DOI: https://doi.org/10.1007/978-3-030-03991-2_56

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-03990-5

  • Online ISBN: 978-3-030-03991-2

  • eBook Packages: Computer Science, Computer Science (R0)
