Abstract
Federated Learning is a distributed machine learning technique that allows multiple devices to collaboratively train a shared model without exchanging their data. It can improve model accuracy while preserving user privacy. However, traditional Federated Learning incurs significant communication overhead and performs poorly when the training data are not independent and identically distributed (Non-IID). We therefore propose a Federated Learning algorithm based on adaptive Top-k sparsification and the OPTICS clustering method, which addresses the low accuracy and high communication overhead of Federated Learning on Non-IID data. Compared to existing Federated Learning algorithms, our algorithm improves model accuracy and reduces communication overhead.
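To illustrate the communication-saving idea the abstract refers to, here is a minimal sketch of Top-k gradient sparsification: each client transmits only the k largest-magnitude entries of its update instead of the full dense vector. The function name and the fixed k are hypothetical illustrations, not the paper's adaptive scheme, which chooses k dynamically.

```python
import numpy as np

def topk_sparsify(grad: np.ndarray, k: int):
    """Keep the k largest-magnitude entries of grad; zero out the rest.

    A client would upload only the k kept values and their indices,
    reducing the payload from len(grad) floats to roughly k pairs.
    """
    flat = grad.ravel()
    # Indices of the k entries with the largest absolute values.
    idx = np.argpartition(np.abs(flat), -k)[-k:]
    sparse = np.zeros_like(flat)
    sparse[idx] = flat[idx]
    return sparse.reshape(grad.shape), idx

grad = np.array([0.05, -0.9, 0.1, 0.7, -0.02])
sparse, idx = topk_sparsify(grad, k=2)
# Only the two largest-magnitude entries (-0.9 and 0.7) are retained.
```

In a full pipeline the server would accumulate these sparse updates per cluster; clients in the paper's method are additionally grouped by OPTICS so that aggregation happens among clients with similar data distributions.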
© 2024 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
Cite this paper
Zhang, D., Xia, G., Liu, Y. (2024). AOPT-FL: A Communication-Efficient Federated Learning Method with Clusterd and Sparsification. In: Tari, Z., Li, K., Wu, H. (eds) Algorithms and Architectures for Parallel Processing. ICA3PP 2023. Lecture Notes in Computer Science, vol 14493. Springer, Singapore. https://doi.org/10.1007/978-981-97-0862-8_20
Publisher Name: Springer, Singapore
Print ISBN: 978-981-97-0861-1
Online ISBN: 978-981-97-0862-8