Abstract:
Federated learning is a nascent distributed machine learning paradigm that enables multiple clients to collaborate in training a model for a specific task under the coordination of a central server, all while safeguarding the privacy of users' local data. Nevertheless, the constraint that distributed datasets must remain on local nodes introduces data heterogeneity into federated learning training. In this paper, we focus on mitigating the damage that data heterogeneity in the form of feature distribution skew causes to federated learning models during training. To achieve this goal, we propose a feature drift-corrected federated learning algorithm. We design a feature drift variable derived from the clients' local models and the server's global model, and incorporate it into each client's local loss function to rectify the local model parameters. Additionally, we utilize the disparity between consecutive global models to regulate the local model. Validation experiments are conducted on multiple datasets exhibiting feature distribution skew. The results demonstrate that our approach significantly enhances the performance of federated learning models under feature distribution skew.
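The abstract describes two correction terms added to each client's local objective: a drift term tying the local model to the current global model, and a regularizer based on the change between consecutive global models. The sketch below illustrates this structure on a toy quadratic task; the specific penalty forms, the names `mu` and `lam`, and the proximal-style drift term are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def local_update(w_local, w_global, w_global_prev, grad_fn,
                 lr=0.1, mu=0.01, lam=0.01, steps=10):
    """One client's drift-corrected local training (conceptual sketch).

    Two correction gradients are added to the task gradient:
      - mu * (w - w_global): pulls the local model toward the current
        global model (a proximal-style stand-in for the paper's
        feature drift variable).
      - lam * (w_global - w_global_prev): uses the disparity between
        consecutive global models to regulate the local step.
    """
    w = w_local.copy()
    delta_global = w_global - w_global_prev  # change between global rounds
    for _ in range(steps):
        g = grad_fn(w) + mu * (w - w_global) + lam * delta_global
        w -= lr * g
    return w

# Toy quadratic task standing in for a client's local loss:
# minimize ||w - target||^2 on this client's (skewed) data.
target = np.array([1.0, -2.0])
grad_fn = lambda w: 2.0 * (w - target)

w_global_prev = np.zeros(2)
w_global = np.array([0.5, -1.0])
w_new = local_update(np.zeros(2), w_global, w_global_prev, grad_fn)
```

In an actual federated round, the server would aggregate the returned `w_new` values across clients (e.g., by weighted averaging) to form the next global model.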
Date of Conference: 08-11 July 2024
Date Added to IEEE Xplore: 11 October 2024