Towards Data-Independent Knowledge Transfer in Model-Heterogeneous Federated Learning


Abstract:

Federated Distillation (FD) extends classic Federated Learning (FL) to a more general training framework that enables model-heterogeneous collaborative learning via Knowledge Distillation (KD) across multiple clients and the server. However, existing KD-based algorithms usually require a set of shared input samples on which each client produces soft predictions for distillation. Worse still, selecting such samples manually requires careful deliberation or prior knowledge of the clients' private data distributions, which conflicts with the privacy-preserving character of classic FL. In this paper, we propose a novel training framework that achieves data-independent knowledge transfer by designing a distributed generative adversarial network (GAN) between the server and the clients, which synthesizes shared feature representations to facilitate FD training. Specifically, we deploy a generator on the server and reuse each local model as a federated discriminator, forming a lightweight, efficient distributed GAN that automatically synthesizes simulated global feature representations for distillation. Moreover, since the synthesized feature representations are usually more faithful to and consistent with the global data distribution, faster and better training convergence can be obtained. Extensive experiments on different tasks and heterogeneous models demonstrate the effectiveness of the proposed framework in terms of model accuracy and communication overhead.
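To make the data-independent transfer idea concrete, the following is a minimal sketch of one distillation round under this scheme, not the authors' implementation: a server-side generator (here a simple linear map standing in for a neural generator) synthesizes feature representations from noise, each client's heterogeneous model (reduced here to a hypothetical classifier head over the shared feature space) produces soft predictions on them, and the averaged predictions serve as the ensemble teacher signal. The adversarial training of the generator against the clients-as-discriminators is omitted; all names and dimensions are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
FEAT_DIM, NUM_CLASSES, NUM_CLIENTS, BATCH = 16, 10, 3, 8

def softmax(z):
    # Numerically stable row-wise softmax.
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

# Server-side generator: maps noise to simulated global feature
# representations (a linear map stands in for a trained neural generator).
G = rng.normal(size=(FEAT_DIM, FEAT_DIM))

def generate_features(n):
    noise = rng.normal(size=(n, FEAT_DIM))
    return noise @ G

# Each client's heterogeneous local model is reduced to a classifier head
# over the shared feature space; in the paper these local models also
# double as federated discriminators during GAN training (not shown).
client_heads = [rng.normal(size=(FEAT_DIM, NUM_CLASSES))
                for _ in range(NUM_CLIENTS)]

# One distillation round: clients compute soft predictions on the
# synthesized features; the server aggregates them into a teacher signal.
features = generate_features(BATCH)
soft_preds = [softmax(features @ W) for W in client_heads]
teacher = np.mean(soft_preds, axis=0)  # ensemble soft labels

# A client would then minimize KL(teacher || student) on the synthesized
# features -- no real data sample is ever shared between parties.
student = soft_preds[0]
kl = np.sum(teacher * (np.log(teacher) - np.log(student)), axis=1).mean()
print("distillation KL for client 0:", float(kl))
```

In a full implementation, the KL term would drive gradient updates on each student model, and the generator would be trained adversarially against the clients' models so that the synthesized features track the global data distribution, which is what the paper credits for the improved convergence.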
Published in: IEEE Transactions on Computers ( Volume: 72, Issue: 10, October 2023)
Page(s): 2888 - 2901
Date of Publication: 03 May 2023
