Abstract:
In federated learning scenarios, data heterogeneity can significantly degrade performance. Personalized federated learning seeks to provide an individualized model for each client to improve convergence on heterogeneous data. We find that training the personalized layers of the model, also known as the head, first can alleviate the effects of data heterogeneity. Based on this observation, we propose a simple method named FedLoop. FedLoop uses a loop topology, eliminating the need for a central server or data exchange between participants and thereby safeguarding privacy. In FedLoop, clients act as nodes in a loop, and training at each node proceeds in two phases: an initial phase that trains only the personalized layers, followed by a phase that trains all layers. The model circulates around the loop until a preset number of rounds is reached. Experimental results show that FedLoop outperforms the state-of-the-art algorithm FedALA. FedLoop effectively mitigates the challenges posed by data heterogeneity, and its rapid convergence significantly reduces communication overhead in federated learning.
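The two-phase, loop-topology procedure the abstract describes can be summarized in code. Below is a minimal PyTorch sketch, assuming a model whose last layer serves as the personalized head; the names (make_model, train_epoch, fedloop) and all hyperparameters are illustrative assumptions, not the authors' released implementation.

    # Minimal sketch of loop-topology training with head-first local updates.
    # All structure and names here are assumptions for illustration only.
    import torch
    import torch.nn as nn

    def make_model():
        # Shared base (feature extractor) followed by a personalized head.
        return nn.Sequential(
            nn.Flatten(),
            nn.Linear(28 * 28, 64), nn.ReLU(),  # base layers
            nn.Linear(64, 10),                  # personalized head
        )

    def train_epoch(model, loader, params, lr=0.01):
        # One pass over the client's local data, updating only `params`.
        opt = torch.optim.SGD(params, lr=lr)
        loss_fn = nn.CrossEntropyLoss()
        for x, y in loader:
            opt.zero_grad()
            loss_fn(model(x), y).backward()
            opt.step()

    def fedloop(clients, rounds):
        # Clients form a ring; the model circulates from node to node.
        model = make_model()
        for _ in range(rounds):
            for loader in clients:  # one hop per node in the loop
                head = list(model[-1].parameters())
                # Phase 1: train only the personalized head.
                train_epoch(model, loader, head)
                # Phase 2: train all layers.
                train_epoch(model, loader, list(model.parameters()))
        return model

    if __name__ == "__main__":
        # Synthetic heterogeneous-data stand-in: four clients with random data.
        clients = [
            torch.utils.data.DataLoader(
                torch.utils.data.TensorDataset(
                    torch.randn(32, 1, 28, 28), torch.randint(0, 10, (32,))),
                batch_size=8)
            for _ in range(4)
        ]
        fedloop(clients, rounds=3)

In this sketch the whole model circulates around the ring; the abstract does not specify whether each client retains its own head between visits, so that design choice is left out for simplicity.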
Date of Conference: 05-08 December 2023
Date Added to IEEE Xplore: 01 January 2024