Abstract
In practical federated learning deployments, performance suffers when clients passively disconnect during training due to factors such as resource limitations or network disruptions. This paper introduces a More Precise Similarity Discovery and Gradient Supplementation (MPSDGS) algorithm, which addresses passive client dropout by using precise clustering to identify similar clients and then leveraging the gradients of clients whose data distributions closely align with those of the disconnected clients, effectively supplementing the missing gradients. The algorithm's efficacy is verified through experiments on three real-world datasets: MNIST, CIFAR10, and CIFAR100. Under the same non-independent and identically distributed (non-IID) data partitioning for MNIST and CIFAR10, MPSDGS achieves notable accuracy gains. At disconnection rates of 0.3, 0.5, and 0.7, it improves accuracy on MNIST by 1.33%, 1.49%, and 1.35%, respectively, and on CIFAR10 by 1.09%, 1.25%, and 1.60% at the same rates. MPSDGS performs comparably well on the CIFAR100 dataset.
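The core idea of similarity-based gradient supplementation can be sketched as follows. This is a minimal illustration, not the paper's exact MPSDGS procedure: all function and variable names are hypothetical, and it substitutes the gradient of the most cosine-similar active client (judged from last-round gradients) for each dropped client before averaging, rather than performing the paper's full clustering step.

```python
import numpy as np

def cosine_sim(a, b):
    # Cosine similarity between two flattened gradient vectors.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def aggregate_with_supplement(active_grads, dropped, history):
    """Average client gradients for one round.

    active_grads: {client_id: current-round gradient} for connected clients.
    dropped:      ids of clients that disconnected this round.
    history:      {client_id: last-round gradient} for all clients,
                  used as a proxy for data-distribution similarity.
    """
    supplied = dict(active_grads)
    for c in dropped:
        # Pick the active client whose past gradient is most similar.
        best = max(active_grads, key=lambda a: cosine_sim(history[c], history[a]))
        supplied[c] = active_grads[best]  # stand-in gradient for the dropped client
    return np.mean(list(supplied.values()), axis=0)
```

In a real system the similarity measure would be computed over clustered gradient statistics rather than raw single-round gradients, but the aggregation pattern is the same: dropped clients still contribute a term to the average, filled in by their nearest active neighbor.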
Acknowledgement
We would like to express our gratitude for the insightful feedback provided by the reviewers of ICA3PP. This work is supported by the Shandong Provincial Natural Science Foundation (No. ZR2022MF264).
Copyright information
© 2024 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
Cite this paper
Yan, M., Luo, Q., Zhang, B., Sun, S. (2024). Solving Client Dropout in Federated Learning via Client Similarity Discovery and Gradient Supplementation Mechanism. In: Tari, Z., Li, K., Wu, H. (eds) Algorithms and Architectures for Parallel Processing. ICA3PP 2023. Lecture Notes in Computer Science, vol 14491. Springer, Singapore. https://doi.org/10.1007/978-981-97-0808-6_26
Publisher Name: Springer, Singapore
Print ISBN: 978-981-97-0807-9
Online ISBN: 978-981-97-0808-6
eBook Packages: Computer Science, Computer Science (R0)