Abstract:
Federated learning is a privacy-preserving method of training a model on a server by utilizing end users' private data without accessing it. The central server shares the global model with all end users, called clients of the network. The clients train the shared global model using their local datasets, and the locally trained models are then sent back to the server to further update the global model. This process of training the global model is repeated for several rounds. Updating the local models and transmitting them back to the server raises the communication cost, and since several clients are involved in training the global model, the aggregated communication cost of the network escalates. This article proposes a communication-efficient aggregation method for federated learning that considers the volume and variety of each local client's data before aggregation. The proposed approach is compared with conventional methods and achieves the highest accuracy and minimum loss with respect to aggregated communication cost.
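To illustrate the kind of server-side aggregation the abstract describes, the following is a minimal sketch in the style of weighted FedAvg, assuming clients report their parameter updates along with their local sample counts (volume) and per-class label counts (variety). The paper does not specify how variety is quantified; the normalized label-distribution entropy used here is a hypothetical stand-in, and all function and variable names are illustrative.

import numpy as np

def aggregate(client_models, data_sizes, label_counts):
    # client_models: list of dicts mapping layer name -> np.ndarray of parameters
    # data_sizes:    list of ints, number of local samples per client (volume)
    # label_counts:  list of np.ndarrays, per-class sample counts per client (variety)

    # Volume weight: each client's share of the total training samples.
    volume = np.array(data_sizes, dtype=float)
    volume /= volume.sum()

    # Variety weight (assumed): normalized entropy of the client's label distribution,
    # so clients with more diverse local data receive a larger weight.
    def normalized_entropy(counts):
        p = counts / counts.sum()
        p = p[p > 0]
        return -(p * np.log(p)).sum() / np.log(len(counts))

    variety = np.array([normalized_entropy(c) for c in label_counts])
    variety /= variety.sum()

    # Combine volume and variety, then renormalize to obtain aggregation weights.
    weights = volume * variety
    weights /= weights.sum()

    # Weighted average of client parameters, layer by layer, forms the new global model.
    global_model = {}
    for layer in client_models[0]:
        global_model[layer] = sum(w * m[layer] for w, m in zip(weights, client_models))
    return global_model

In plain FedAvg the weights would be the volume term alone; the sketch simply multiplies in a variety term before renormalizing, which is one plausible reading of "volume and variety" rather than the authors' exact formulation.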
Published in: 2022 13th International Conference on Information and Communication Technology Convergence (ICTC)
Date of Conference: 19-21 October 2022
Date Added to IEEE Xplore: 25 November 2022