Abstract:
Federated learning has attracted widespread attention in recent years because it allows massive edge nodes to collaboratively train machine learning models without sharing their private data sets. However, these edge nodes are usually heterogeneous in computational capability and statistically different in data distribution, i.e., their data are not independent and identically distributed (non-IID), which leads to significant performance degradation. Although existing asynchronous training methods can address the heterogeneity issue, they cannot prevent the non-IID problem from reducing the convergence rate. In this article, we propose GSGM, a novel paradigm that schedules gradients with partial averaging and applies global momentum for asynchronous training over non-IID data sets in an edge environment. Our key idea is to apply global momentum and partial averaging to the biased gradients computed on edge nodes after scheduling, so as to stabilize the training process. Empirical results demonstrate that GSGM adapts well to different degrees of non-IID data and brings 20% gains in training stability for popular optimization algorithms, with improved accuracy on the Fashion-MNIST and CIFAR-10 data sets.
Published in: IEEE Internet of Things Journal ( Volume: 9, Issue: 19, 01 October 2022)
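To make the key idea concrete, below is a minimal server-side sketch in Python of combining partial gradient averaging with global momentum under asynchronous updates. The abstract does not specify the exact update rule, so the class name GSGMServer, the per-node gradient cache, and the mixing scheme are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

class GSGMServer:
    """Illustrative sketch: partial averaging plus global momentum on the
    server, in the spirit of the GSGM idea described in the abstract.
    All names and update details are assumptions for illustration."""

    def __init__(self, dim, num_workers, lr=0.01, momentum=0.9):
        self.w = np.zeros(dim)            # global model parameters
        self.v = np.zeros(dim)            # global momentum buffer
        self.lr = lr
        self.momentum = momentum
        # Cache of the most recent gradient from each edge node, used to
        # partially average a newly arrived (possibly biased) gradient.
        self.last_grad = [None] * num_workers

    def receive(self, worker_id, grad):
        """Called asynchronously whenever an edge node reports a gradient."""
        self.last_grad[worker_id] = np.asarray(grad, dtype=float)
        # Partial average: mix the fresh gradient with cached gradients from
        # other nodes to damp the bias introduced by non-IID local data.
        cached = [g for g in self.last_grad if g is not None]
        avg_grad = np.mean(cached, axis=0)
        # Global momentum applied to the partially averaged gradient keeps
        # the asynchronous updates pointing in a consistent direction.
        self.v = self.momentum * self.v + avg_grad
        self.w -= self.lr * self.v
        return self.w  # updated model returned to the reporting node

# Usage example: three edge nodes report gradients asynchronously.
server = GSGMServer(dim=4, num_workers=3)
rng = np.random.default_rng(0)
for step in range(6):
    worker = step % 3
    server.receive(worker, rng.normal(size=4))
```

The design choice illustrated here is that the server never applies a single node's raw gradient directly: each update is smoothed twice, once across nodes (partial averaging) and once across time (the global momentum buffer), which is how the abstract frames the defense against non-IID-induced gradient bias.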