Federated Neural Architecture Search Evolution and Open Problems: An Overview

  • Conference paper
  • Bio-Inspired Computing: Theories and Applications (BIC-TA 2021)

Abstract

As an optimization technique that automatically tunes the architecture and hyperparameters of deep neural network models, neural architecture search (NAS) has driven significant progress in deep learning model design and automated machine learning (AutoML). At the same time, growing concern over data privacy has drawn attention to privacy-preserving machine learning. Federated learning (FL) is a machine learning paradigm that addresses data privacy, targeting heterogeneous and distributed scenarios. Combining FL with NAS can therefore mitigate the privacy issues NAS faces. Several studies have proposed federated neural architecture search methods, which allow multiple parties to jointly construct high-performing deep learning models without sharing data. Federated neural architecture search focuses on designing deep neural network models for distributed data and making the resulting models better suited to heterogeneous scenarios. In this paper, we summarize research related to NAS and FL, review current work in federated neural architecture search, and discuss the open issues in existing research. The objective is to offer an overview of approaches that combine FL with NAS, balancing the privacy protection of deep neural network models against efficient design. Privacy protection, while maintaining model performance, is the primary goal of federated neural architecture search.
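To make the FL paradigm underlying federated NAS concrete, the sketch below illustrates FedAvg-style aggregation: each client trains on its own private data and only parameter updates leave the device. This is a minimal NumPy illustration under assumptions of our own, not code from the paper; the names (local_update, fed_avg) and the toy linear model are purely illustrative.

```python
import numpy as np

# Minimal FedAvg-style sketch (toy linear model, squared loss).
# Raw client data never leaves the client; only updated parameters do.

def local_update(weights, data, lr=0.05, epochs=5):
    """One client's local training: a few gradient steps on private data."""
    X, y = data
    w = weights.copy()
    for _ in range(epochs):
        grad = 2.0 * X.T @ (X @ w - y) / len(y)  # gradient of mean squared error
        w -= lr * grad
    return w

def fed_avg(global_weights, client_data):
    """Server round: collect local updates, average them weighted by data size."""
    sizes = np.array([len(y) for _, y in client_data], dtype=float)
    updates = [local_update(global_weights, d) for d in client_data]
    return sum((n / sizes.sum()) * w for n, w in zip(sizes, updates))

# Three clients with private (non-shared, possibly non-IID) datasets.
rng = np.random.default_rng(0)
true_w = np.array([1.0, -2.0, 0.5])
clients = []
for _ in range(3):
    X = rng.normal(size=(40, 3))
    clients.append((X, X @ true_w + 0.1 * rng.normal(size=40)))

w = np.zeros(3)
for _ in range(20):               # communication rounds
    w = fed_avg(w, clients)
print(np.round(w, 2))             # approaches true_w without sharing any data
```

Roughly speaking, federated NAS methods such as FedNAS extend this same loop: clients also hold architecture parameters (e.g., DARTS-style operation weights), and the server averages those alongside the model weights, so the architecture itself is searched without centralizing data.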

Acknowledgements

This work is supported by the General Program of Science and Technology Development Project of Beijing Municipal Education Commission of China (No. KM202110037002) and the Youth Fund Project of Beijing Wuzi University (No. 2020XJQN02).

Author information

Corresponding author

Correspondence to Yang Cao.

Copyright information

© 2022 Springer Nature Singapore Pte Ltd.

About this paper

Cite this paper

Liu, D., Cao, Y. (2022). Federated Neural Architecture Search Evolution and Open Problems: An Overview. In: Pan, L., Cui, Z., Cai, J., Li, L. (eds) Bio-Inspired Computing: Theories and Applications. BIC-TA 2021. Communications in Computer and Information Science, vol 1566. Springer, Singapore. https://doi.org/10.1007/978-981-19-1253-5_25

  • DOI: https://doi.org/10.1007/978-981-19-1253-5_25

  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-19-1252-8

  • Online ISBN: 978-981-19-1253-5

  • eBook Packages: Computer Science, Computer Science (R0)
