
Collaborative Neural Architecture Search for Personalized Federated Learning


Abstract:

Personalized federated learning (pFL) is a promising approach for training customized models for multiple clients over heterogeneous data distributions. However, existing works on pFL typically optimize only model parameters and ignore the demand for personalization at the level of the neural network architecture, which can greatly affect model performance in practice. Generating personalized models with different neural architectures for different clients is therefore a key issue in implementing pFL in a heterogeneous environment. Motivated by Neural Architecture Search (NAS), a methodology for automatically searching model architectures, this paper aims to automate model design in a collaborative manner while achieving good training performance for each client. Specifically, we reconstruct the centralized search of NAS into a distributed scheme called Personalized Architecture Search (PAS), in which differentiable architecture fine-tuning is performed via gradient-descent optimization so that each client obtains the most appropriate model. Furthermore, to aggregate knowledge from heterogeneous neural architectures, a knowledge distillation-based training framework is proposed to achieve a good trade-off between generalization and personalization in federated learning. Extensive experiments demonstrate that our architecture-level personalization method achieves higher accuracy under non-IID settings without increasing model complexity relative to state-of-the-art benchmarks.
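The abstract does not give implementation details, but the continuous relaxation underlying differentiable architecture search (as popularized by DARTS) can be sketched as follows. This is a minimal illustration under assumed conventions, not the paper's actual PAS implementation: each candidate operation's output is weighted by a softmax over learnable architecture parameters `alphas` (a hypothetical name), so the discrete choice of operation becomes differentiable and can be fine-tuned per client by gradient descent.

```python
import numpy as np

def softmax(a):
    # Numerically stable softmax over a 1-D array of architecture parameters.
    a = np.asarray(a, dtype=float)
    e = np.exp(a - a.max())
    return e / e.sum()

def mixed_op(x, alphas, ops):
    # Continuous relaxation: instead of picking one operation, output a
    # softmax(alphas)-weighted sum of every candidate op, so the architecture
    # parameters receive gradients just like ordinary model weights.
    w = softmax(alphas)
    return sum(wi * op(x) for wi, op in zip(w, ops))

# Usage sketch: two toy candidate operations (identity and doubling).
ops = [lambda x: x, lambda x: 2 * x]
mixed_op(2.0, [0.0, 0.0], ops)   # equal alphas -> average of both ops
mixed_op(2.0, [20.0, 0.0], ops)  # large alpha -> output dominated by op 0
```

After search, the architecture is typically discretized by keeping, per edge, the operation with the largest softmax weight; the per-client `alphas` are what PAS would personalize.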
Published in: IEEE Transactions on Computers ( Volume: 74, Issue: 1, January 2025)
Page(s): 250 - 262
Date of Publication: 10 October 2024

