Abstract
Neural architecture search (NAS) is an important research topic in automated machine learning, aiming to automatically discover neural network architectures that learn efficiently for a given task. There is also an increasing demand for computationally efficient NAS systems that can be deployed on edge devices with limited computational power. In particular, federated learning is an online distributed machine learning scheme that calls for online, federated search of neural architectures across edge devices. However, most existing NAS methods are ill suited to distributed real-time systems because of their high computational and communication costs. This chapter provides a brief introduction to methods for reducing the computational cost of NAS, followed by a presentation of two evolutionary frameworks we recently developed for real-time federated NAS.
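The federated setting the abstract refers to typically aggregates locally trained models on a central server, as in federated averaging (FedAvg). The following is a minimal sketch of the server-side aggregation step only, not the chapter's own method; the function name and the per-layer list-of-arrays representation of model weights are illustrative assumptions.

```python
import numpy as np

def federated_averaging(client_weights, client_sizes):
    """Aggregate client model parameters into a global model.

    client_weights: one entry per client, each a list of np.ndarray
                    (one array per model layer).
    client_sizes:   number of local training samples per client,
                    used as aggregation weights.
    """
    total = sum(client_sizes)
    n_layers = len(client_weights[0])
    global_weights = []
    for layer in range(n_layers):
        # Weighted sum over clients: clients with more local data
        # contribute proportionally more to the global model.
        agg = sum(
            (size / total) * weights[layer]
            for weights, size in zip(client_weights, client_sizes)
        )
        global_weights.append(agg)
    return global_weights
```

In a real deployment the server would broadcast the aggregated weights back to the clients for the next round of local training; communication cost grows with model size, which is one motivation for searching for compact architectures.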
Copyright information
© 2021 Springer Nature Switzerland AG
Cite this chapter
Zhu, H., Jin, Y. (2021). Toward Real-Time Federated Evolutionary Neural Architecture Search. In: Pillay, N., Qu, R. (eds) Automated Design of Machine Learning and Search Algorithms. Natural Computing Series. Springer, Cham. https://doi.org/10.1007/978-3-030-72069-8_8
Print ISBN: 978-3-030-72068-1
Online ISBN: 978-3-030-72069-8