
Discrepancy-Aware Federated Learning for Non-IID Data


Abstract:

Federated learning (FL), an emerging edge intelligence paradigm, allows clients to jointly train a model without exchanging raw data. Owing to its strength in privacy protection, FL has found practical application in many areas. However, data heterogeneity across clients is prevalent and poses a significant challenge to FL. Although many FL algorithms have been proposed to address the performance deterioration caused by non-independent and identically distributed (Non-IID) data, the resulting improvements are modest. In this paper, we propose knowledge discrepancy-aware federated learning (KDAFL). It evaluates the local model, both per class and in terms of overall learning effect, based on the discrepancy between local and global knowledge. Guided by this evaluation, each client assigns a new weight to each class in the cross-entropy loss and decides whether to conduct knowledge distillation. This awareness of discrepancy allows each client to adjust its local training according to the characteristics of the knowledge it has learned, thereby better mitigating the Non-IID issue. Extensive experiments demonstrate the effectiveness of KDAFL, particularly the improvement in global model accuracy over other state-of-the-art algorithms.
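The abstract describes the mechanism only at a high level. The following is a minimal NumPy sketch of one plausible reading of the idea: measure a per-class discrepancy between local and global model predictions, use it to reweight the per-class cross-entropy, and trigger distillation when the overall discrepancy is large. The discrepancy measure, the weighting rule, and the distillation threshold here are illustrative assumptions, not the authors' formulation.

```python
import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def class_discrepancy(local_logits, global_logits, labels, num_classes):
    """Per-class discrepancy between local and global knowledge.
    Hypothetical measure: mean L1 gap between the two softmax outputs,
    averaged over the samples belonging to each class."""
    gap = np.abs(softmax(local_logits) - softmax(global_logits)).sum(axis=1)
    d = np.zeros(num_classes)
    for c in range(num_classes):
        mask = labels == c
        if mask.any():
            d[c] = gap[mask].mean()
    return d

def weighted_ce(logits, labels, class_weights):
    """Cross-entropy where each sample is weighted by its class's weight."""
    p = softmax(logits)
    ce = -np.log(p[np.arange(len(labels)), labels] + 1e-12)
    return (class_weights[labels] * ce).mean()

# Toy client-side round with random logits standing in for two models.
rng = np.random.default_rng(0)
labels = rng.integers(0, 3, size=32)
local_logits = rng.normal(size=(32, 3))
global_logits = rng.normal(size=(32, 3))

d = class_discrepancy(local_logits, global_logits, labels, num_classes=3)
weights = 1.0 + d / (d.mean() + 1e-12)  # up-weight poorly aligned classes
do_distill = d.mean() > 0.5             # assumed trigger for distillation
loss = weighted_ce(local_logits, labels, weights)
```

In this sketch a class whose local predictions diverge most from the global model receives the largest cross-entropy weight, so local training spends more effort on it; the Boolean `do_distill` stands in for the paper's per-client decision to apply knowledge distillation.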
Date of Conference: 26-29 March 2023
Date Added to IEEE Xplore: 12 May 2023
Conference Location: Glasgow, United Kingdom
