
Robust Federated Averaging via Outlier Pruning



Abstract:

Federated Averaging (FedAvg) is the baseline Federated Learning (FL) algorithm: it applies stochastic gradient descent for local model training and arithmetic averaging of the local models’ parameters for global model aggregation. Subsequent FL works commonly adopt FedAvg’s arithmetic averaging scheme for aggregation. However, arithmetic averaging is sensitive to outlier model updates, especially when the clients’ data are non-Independent and Identically Distributed (non-IID). As a result, the classical aggregation approach suffers from the dominance of outlier updates and consequently incurs high communication costs before producing a decent global model. In this letter, we propose a robust aggregation strategy to alleviate these issues. In particular, we first prune the node-wise outlier updates (weights) from the locally trained models and then perform the aggregation over the selected effective weight set at each node. We provide a theoretical result for our method and conduct extensive experiments on the MNIST, CIFAR-10, and Shakespeare datasets under IID and non-IID settings, which demonstrate that our aggregation approach outperforms state-of-the-art methods in terms of communication speedup, test-set performance, and training convergence.
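
The abstract describes the proposed scheme only at a high level: prune node-wise outlier weights across clients, then average the remaining effective set at each node. As a rough illustration (not the authors' exact algorithm), the following Python sketch assumes a hypothetical per-node median/MAD pruning rule with a tuning threshold `thresh`:

```python
# Minimal sketch of node-wise outlier pruning before averaging.
# NOTE: the letter's abstract does not specify the pruning rule; the
# median/MAD criterion and the `thresh` parameter below are assumptions
# made for illustration, not the authors' exact method.
import numpy as np

def robust_aggregate(client_weights, thresh=3.0):
    """Average client weight arrays after pruning per-node outliers.

    client_weights: list of equally shaped np.ndarray model updates.
    A value is pruned at a node (scalar position) if its absolute
    deviation from the cross-client median exceeds `thresh` times the
    median absolute deviation (MAD) at that node.
    """
    w = np.stack(client_weights)                    # (num_clients, ...)
    med = np.median(w, axis=0)
    mad = np.median(np.abs(w - med), axis=0) + 1e-12
    keep = np.abs(w - med) <= thresh * mad          # effective set per node
    counts = keep.sum(axis=0)
    total = np.where(keep, w, 0.0).sum(axis=0)
    # Average the surviving values; fall back to the median where the
    # pruning would discard every client at a node.
    return np.where(counts > 0, total / np.maximum(counts, 1), med)

# Example: three similar client updates plus one outlier.
updates = [np.array([1.0, 2.0]), np.array([1.1, 2.1]),
           np.array([0.9, 1.9]), np.array([10.0, -5.0])]
print(robust_aggregate(updates))   # ~[1.0, 2.0]; the outlier is pruned
```

For comparison, FedAvg's plain arithmetic mean of the same four updates would be [3.25, 0.25], illustrating how a single outlier update can dominate the aggregate.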
Published in: IEEE Signal Processing Letters (Volume: 29)
Page(s): 409 - 413
Date of Publication: 13 December 2021

