Toward Secure and Robust Federated Distillation in Distributed Cloud: Challenges and Design Issues



Abstract:

Federated learning (FL) offers a promising solution for effectively leveraging data scattered across a distributed cloud system. Despite its potential, FL's huge communication overhead greatly burdens the distributed cloud system. Federated distillation (FD) is a novel distributed learning technique with low communication cost, in which clients communicate only model logits rather than model parameters. However, FD faces challenges related to data heterogeneity and security; in particular, the conventional aggregation method in FD is vulnerable to malicious uploads. In this article, we discuss the limitations of FL and the challenges of FD in the context of distributed cloud systems. To address these issues, we propose a blockchain-based framework for secure and robust FD. Specifically, we develop a pre-training data preparation method to reduce data-distribution heterogeneity and an aggregation method to enhance the robustness of the aggregation process. Moreover, a committee/worker selection strategy is devised to optimize task allocation among clients. Experiments are conducted to evaluate the effectiveness of the proposed framework.
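The core FD mechanic described above, and why naive aggregation is fragile, can be illustrated with a minimal sketch. Everything here is an assumption for illustration: the client count, the shared transfer batch, and the coordinate-wise median rule are hypothetical stand-ins, not the aggregation method actually proposed in the article.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 4 clients each score a shared "transfer" batch of
# 5 samples over 3 classes. In federated distillation, clients upload
# these per-sample logits instead of full model parameters.
honest = [rng.normal(loc=1.0, scale=0.1, size=(5, 3)) for _ in range(3)]
malicious = [np.full((5, 3), 100.0)]  # one poisoned upload
uploads = honest + malicious

def mean_aggregate(logits):
    """Conventional FD aggregation: plain average of client logits."""
    return np.mean(np.stack(logits), axis=0)

def median_aggregate(logits):
    """Coordinate-wise median: one simple robust alternative that bounds
    the influence of a minority of malicious uploads."""
    return np.median(np.stack(logits), axis=0)

skewed = mean_aggregate(uploads)    # pulled far from ~1.0 by the attacker
robust = median_aggregate(uploads)  # stays near the honest consensus
print(np.abs(skewed - 1.0).max())   # large: the average is poisoned
print(np.abs(robust - 1.0).max())   # small: the median survives
```

The sketch only shows the attack surface the abstract names (a single malicious upload skewing the aggregate); the article's own defense additionally relies on a blockchain-based committee rather than a fixed statistical rule.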
Published in: IEEE Network ( Volume: 38, Issue: 4, July 2024)
Page(s): 151 - 157
Date of Publication: 23 February 2024



