Asynchronous Semi-Supervised Federated Learning with Provable Convergence in Edge Computing


Abstract:

Traditional federated learning methods assume that users have fully labeled data on their devices for training, but in practice labels are difficult to obtain for reasons such as user privacy concerns, high labeling costs, and lack of expertise. Semi-supervised learning has been introduced into federated learning to address the lack of labels, but its performance suffers from slow training and non-convergence in real network environments. In this article, we propose Federated Incremental Learning (FedIL), a semi-supervised federated learning (SSFL) framework for edge computing that overcomes these limitations. FedIL introduces a group-based asynchronous training algorithm with provable convergence, which accelerates model training by allowing more clients to participate simultaneously. We developed a prototype system and performed trace-driven simulations to demonstrate FedIL's superior performance.
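
The abstract names a group-based asynchronous training algorithm combined with semi-supervised learning but does not spell out its mechanics. The Python/NumPy sketch below is a hypothetical illustration under stated assumptions, not the paper's method: clients apply confidence-thresholded pseudo-labeling to their unlabeled data, and the server merges each group's update with a staleness-discounted weight. The function names, the 1/(1 + lag) weighting, and the 0.9 confidence threshold are placeholders chosen for illustration.

# Hypothetical sketch of group-based asynchronous SSFL aggregation.
# The grouping policy, staleness weighting, and pseudo-label threshold
# are illustrative assumptions, not details taken from the article.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def local_semi_supervised_step(global_w, X_lab, y_lab, X_unlab,
                               lr=0.1, threshold=0.9):
    """One client update: a gradient step on the small labeled set, then a
    step on high-confidence pseudo-labels for unlabeled data (logistic model)."""
    w = global_w.copy()
    # Supervised step on labeled data.
    p = sigmoid(X_lab @ w)
    w -= lr * X_lab.T @ (p - y_lab) / len(y_lab)
    # Pseudo-label unlabeled samples whose confidence exceeds the threshold.
    p_u = sigmoid(X_unlab @ w)
    mask = (p_u > threshold) | (p_u < 1.0 - threshold)
    if mask.any():
        pseudo = (p_u[mask] > 0.5).astype(float)
        Xc = X_unlab[mask]
        w -= lr * Xc.T @ (sigmoid(Xc @ w) - pseudo) / len(pseudo)
    return w

def async_group_aggregate(global_w, group_w, group_version, current_version):
    """Server-side asynchronous merge of one group's update, discounted by
    staleness (assumed 1 / (1 + lag) weighting)."""
    alpha = 1.0 / (1.0 + current_version - group_version)
    return global_w + alpha * (group_w - global_w)

# Toy driver: three client groups report back out of order.
rng = np.random.default_rng(0)
d = 5
global_w, version = np.zeros(d), 0
for group_version in (0, 0, 1):            # two stale groups, one fresher
    X_lab = rng.normal(size=(8, d))
    y_lab = (X_lab[:, 0] > 0).astype(float)
    X_unlab = rng.normal(size=(32, d))
    w_k = local_semi_supervised_step(global_w, X_lab, y_lab, X_unlab)
    global_w = async_group_aggregate(global_w, w_k, group_version, version)
    version += 1
print("updated global weights:", np.round(global_w, 3))

In the toy driver, groups report back out of order; discounting each group's contribution by its lag is one common way asynchronous schemes keep stale updates from destabilizing the global model while still letting many clients train concurrently.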
Published in: IEEE Network (Volume: 36, Issue: 5, September/October 2022)
Page(s): 136 - 143
Date of Publication: 25 November 2022
