Abstract:
Federated Learning (FL) has been increasingly used in various edge device applications. Cluster-wise separate Federated Learning (CFL) is an effective means of addressing the low accuracy caused by data heterogeneity in FL. In CFL, nearby edge devices form a cluster, and each cluster trains a separate machine learning (ML) model. Because many edge devices (e.g., cellular phones) are mobile, a device may leave its cluster during training, which degrades training accuracy. However, existing works on CFL do not consider device mobility. We propose enabling such devices to keep participating in the training of their previous clusters as well as their current cluster. However, due to constrained resources and the high workload of participating in multiple CFL trainings, such devices may become stragglers, hurting both training time and accuracy. In addition, changing cluster environments and less-visited clusters lead to low model accuracy. In this paper, we first conduct an experimental analysis to verify the motivation and illustrate these problems. To address them, we propose Clustered Federated Learning for Mobile edge devices (CFLM). CFLM decreases training time by sharing training data and computation results across multiple trainings. Additionally, CFLM increases accuracy by handling dynamic environments and by improving the accuracy of less-visited clusters. Our extensive evaluations on both CPU and GPU devices show that CFLM decreases training time by up to 68% and increases accuracy by up to 18% compared to existing works.
Published in: 2024 IEEE/ACM Symposium on Edge Computing (SEC)
Date of Conference: 04-07 December 2024
Date Added to IEEE Xplore: 01 January 2025
ISBN Information: