Abstract:
Personalized federated learning (PFL) is a subfield of federated learning. In contrast to conventional federated learning, which seeks a single general global model, PFL generates, for each client, a personalized model adapted to its local data distribution. Some existing PFL methods consider only improving client-side personalization, discarding server-side generalization. To address this issue, we propose a fine-tuning and head aggregation method in federated learning (FedFTHA). It allows each client to maintain a personalized model head and fine-tune it after each local update, producing a local model that contains the personalized head. During FedFTHA training, these personalized heads are aggregated on the server to generate a generalized head for the global model. FedFTHA thus meets the needs of both client-side model personalization and server-side model generalization. In addition, a universal optimization framework is employed to prove its convergence under both convex and nonconvex conditions. We verify the personalization ability and generalization performance of FedFTHA under heterogeneous settings on benchmark data sets. The comparative analysis confirms the significance of our proposal.
Published in: IEEE Internet of Things Journal ( Volume: 10, Issue: 14, 15 July 2023)
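The workflow the abstract describes (each client keeps a personalized head, fine-tunes it after every local update, and the server averages the heads into a generalized global head) can be sketched as follows. This is a minimal illustrative simulation, not the paper's actual algorithm: the model (a linear body plus additive head), the update rule, the learning rate, and the client data are all assumptions made for the sake of the example.

```python
# Hypothetical sketch of the FedFTHA idea from the abstract: clients keep
# personalized heads, fine-tune them after each local update, and the
# server aggregates the heads into a generalized global head.
import numpy as np

n_clients, dim = 3, 4
global_body = np.zeros(dim)                  # shared representation parameters
global_head = np.zeros(dim)                  # server-side generalized head
client_heads = [np.zeros(dim) for _ in range(n_clients)]


def local_step(body, head, data_mean, lr=0.5):
    """One gradient-like step pulling the prediction (body + head)
    toward the client's local data mean (illustrative loss only)."""
    err = (body + head) - data_mean
    return body - lr * err, head - lr * err


for _ in range(5):                           # communication rounds
    new_bodies, new_heads = [], []
    for k in range(n_clients):
        data_mean = np.full(dim, float(k))   # heterogeneous local data
        # local update starting from the global body + personalized head
        body, head = local_step(global_body, client_heads[k], data_mean)
        # fine-tune the personalized head on local data after the update
        _, head = local_step(body, head, data_mean)
        client_heads[k] = head               # head stays on the client
        new_bodies.append(body)
        new_heads.append(head)
    # server aggregation: average bodies as usual, and also average the
    # personalized heads to obtain a generalized head for the global model
    global_body = np.mean(new_bodies, axis=0)
    global_head = np.mean(new_heads, axis=0)
```

Under this toy setup each client's head drifts toward its own data mean (personalization), while the averaged global head tracks the population average (generalization), mirroring the dual goal the abstract states.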