Abstract:
In federated learning (FL), local workers collaboratively learn a global model from their local data by communicating only trained models, rather than raw data, to a central server, thereby alleviating privacy concerns. Due to its local nature, FL is typically subject to various heterogeneities, including system and statistical heterogeneity. To address these issues, Federated Proximal (FedProx) has been regarded as a promising FL paradigm that provides more stable convergence in the presence of computation stragglers and statistical heterogeneity. However, in wireless networks with unreliable communication channels, packet transmission errors must also be taken into account, introducing an additional source of heterogeneity. For the first time, we rigorously prove the convergence of FedProx in the presence of packet transmission errors in heterogeneous networks. In addition, we propose a joint client selection and resource allocation strategy that maximizes the number of effectively participating clients in order to accelerate convergence. The method is combined with a random weighting mechanism that reduces the statistical bias introduced by the client selection strategy. An efficient low-complexity algorithm is developed to solve the resulting optimization problem. The proposed method achieves faster convergence and requires fewer communication rounds to attain a target accuracy than existing state-of-the-art client selection methods.
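For context, FedProx stabilizes training by augmenting each selected client's local empirical loss with a proximal term anchored at the current global model; a standard statement of the local subproblem (the well-known FedProx formulation, not quoted from this abstract) is

$$\min_{w}\; h_k(w;\, w^t) \;=\; F_k(w) \;+\; \frac{\mu}{2}\,\lVert w - w^t \rVert^2,$$

where $F_k$ is client $k$'s local objective, $w^t$ is the global model at communication round $t$, and $\mu \ge 0$ limits how far inexact local updates (e.g., from computation stragglers) can drift from the global model.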
Published in: IEEE Transactions on Wireless Communications (Volume: 23, Issue: 4, April 2024)