Abstract
Federated learning (FL) has gained tremendous attention across machine learning tasks. In large-scale deployments, client heterogeneity is unavoidable and constrains model design, training performance, and accuracy. This paper introduces a fine-grained federated learning (FGFL) method to tackle resource heterogeneity. FGFL uses a configurable architecture search space to offer a rich set of architectures for diverse devices. It first applies a greedy coarse-grained architecture selection method and a local training optimization strategy so that most architectures are readily deployable, and then conducts a fine-grained multi-objective evolutionary search to automatically identify optimal architectures for heterogeneous devices. Experimental results show that FGFL achieves superior accuracy while reducing computational cost.
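The abstract describes a fine-grained multi-objective evolutionary search over an architecture space, trading off accuracy against computational cost. The paper itself is not reproduced here, so the following is only an illustrative sketch of that idea, not the authors' algorithm: a toy search over hypothetical `depth`/`width` configurations with a Pareto-dominance filter (a full NSGA-II, as cited by the paper, would add nondominated sorting with crowding distance). The `evaluate` function, the choice sets, and the survivor-selection rule are all assumptions for illustration.

```python
import random


def dominates(a, b):
    """True if objective vector a dominates b.

    Objectives: maximize accuracy (first entry), minimize cost (second entry).
    """
    acc_a, cost_a = a
    acc_b, cost_b = b
    return (acc_a >= acc_b and cost_a <= cost_b) and (acc_a > acc_b or cost_a < cost_b)


def pareto_front(population, evaluate):
    """Return the nondominated architectures from a population."""
    scored = [(arch, evaluate(arch)) for arch in population]
    front = []
    for arch, obj in scored:
        if not any(dominates(other, obj) for _, other in scored if other != obj):
            front.append(arch)
    return front


def mutate(arch, depth_choices, width_choices):
    """Randomly perturb one dimension of an architecture config."""
    arch = dict(arch)
    if random.random() < 0.5:
        arch["depth"] = random.choice(depth_choices)
    else:
        arch["width"] = random.choice(width_choices)
    return arch


def evolutionary_search(evaluate, generations=10, pop_size=8,
                        depth_choices=(2, 4, 6),
                        width_choices=(0.25, 0.5, 1.0)):
    """Toy multi-objective evolutionary search over (depth, width) configs."""
    pop = [{"depth": random.choice(depth_choices),
            "width": random.choice(width_choices)} for _ in range(pop_size)]
    for _ in range(generations):
        offspring = [mutate(a, depth_choices, width_choices) for a in pop]
        combined = pop + offspring
        front = pareto_front(combined, evaluate)
        # Simplified survivor selection: refill the population from the front.
        # Real NSGA-II would rank by nondomination level and crowding distance.
        pop = (front * ((pop_size // len(front)) + 1))[:pop_size]
    return pareto_front(pop, evaluate)
```

In an FL setting, `evaluate` would return a (validated accuracy, FLOPs or latency) pair for a candidate sub-architecture, and per-device resource budgets would constrain which front members each client receives.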
This work is supported in part by the Natural Science Foundation of Guangdong Province, China, under Grants 2024A1515010697 and 2020A1515011491, in part by the Science Research Project of Guangzhou University under Grant YG2020008, and in part by the Open Project Program of Key Laboratory of Intelligent Optimization and Information, Minnan Normal University, under Grant ZNYH202401.
Copyright information
© 2024 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
Cite this paper
Ying, W., Wang, C., Wu, Y., Luo, X., Wen, Z., Zhang, H. (2024). FGFL: Fine-Grained Federated Learning Based on Neural Architecture Search for Heterogeneous Clients. In: Tan, Y., Shi, Y. (eds) Advances in Swarm Intelligence. ICSI 2024. Lecture Notes in Computer Science, vol 14789. Springer, Singapore. https://doi.org/10.1007/978-981-97-7184-4_9
Publisher Name: Springer, Singapore
Print ISBN: 978-981-97-7183-7
Online ISBN: 978-981-97-7184-4
eBook Packages: Computer Science (R0)