
A Communication Efficient Approach of Global Training in Federated Learning


Abstract:


Federated learning is a privacy-preserving method of training a model on a server by utilizing end users' private data without directly accessing it. The central server shares the global model with all end users, called the clients of the network. The clients train the shared global model using their local datasets, and the updated local models are forwarded back to the server to further update the global model. This process of training the global model is carried out over several rounds. The procedure of updating the local model and transmitting it back to the server raises the communication cost, and since many clients are involved in training the global model, the aggregated communication cost of the network escalates. This article proposes a communication-efficient aggregation method for federated learning, which considers the volume and variety of the local clients' data before aggregation. The proposed approach is compared with conventional methods and achieves the highest accuracy and lowest loss with respect to aggregated communication cost.
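The abstract describes weighting each client's contribution by both the volume and the variety of its local data before aggregation. The paper's exact weighting formula is not given here, so the following is a minimal sketch of that idea, assuming "volume" means the client's sample count and "variety" means the entropy of its label distribution; the function name `aggregate` and the `1 + entropy` combination are illustrative choices, not the authors' method.

```python
import numpy as np

def aggregate(client_models, data_sizes, label_counts):
    """Weighted FedAvg-style aggregation of client model parameters.

    client_models: list of parameter arrays, one per client
    data_sizes:    list of local sample counts (volume)
    label_counts:  list of per-class sample counts (variety)
    """
    weights = []
    for n, counts in zip(data_sizes, label_counts):
        p = np.asarray(counts, dtype=float)
        p = p[p > 0] / p.sum()                # label distribution
        entropy = -(p * np.log(p)).sum()      # variety: 0 if one class only
        weights.append(n * (1.0 + entropy))   # combine volume and variety
    w = np.asarray(weights) / np.sum(weights) # normalize to sum to 1
    # weighted average of the client parameters forms the new global model
    return sum(wi * m for wi, m in zip(w, client_models))
```

In this sketch a client holding many samples spread evenly across classes receives a larger weight than one with few samples from a single class, which is one plausible way to realize "volume and variety" aware aggregation.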
Date of Conference: 19-21 October 2022
Date Added to IEEE Xplore: 25 November 2022
Conference Location: Jeju Island, Korea, Republic of

