Joint Consensus Matrix Design and Resource Allocation for Decentralized Learning


Abstract:

In decentralized machine learning over a network of workers, each worker updates its local model as a weighted average of its own local model and all models received from its neighbors. Efficient consensus weight matrix design and communication resource allocation can increase the training convergence rate and reduce the wall-clock training time. In this paper, we jointly consider these two factors and propose a novel algorithm termed Communication-Efficient Network Topology (CENT), which reduces the latency in each training iteration by removing unnecessary communication links. CENT preserves the training convergence rate while enforcing communication graph sparsity and avoiding the selection of poor communication links. A numerical study with real-world machine learning data demonstrates the efficacy of the proposed solution and its performance advantage over state-of-the-art algorithms.
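As a minimal illustrative sketch (not the CENT algorithm, whose details are not given here), the consensus-plus-local-update step described in the abstract can be written as below. The matrix `W`, the model array, and `grad_fn` are assumed placeholders; a zero entry `W[i, j]` corresponds to an absent communication link between workers i and j.

```python
import numpy as np

def consensus_sgd_step(models, W, grad_fn, lr=0.1):
    """One decentralized training iteration (illustrative sketch).

    models:  (n_workers, dim) array of local models.
    W:       (n_workers, n_workers) doubly stochastic consensus weight
             matrix; W[i, j] = 0 means workers i and j do not communicate.
    grad_fn: callable (i, x) -> local stochastic gradient at worker i.
    """
    # Each worker mixes its own model with its neighbors' models
    # using its row of the consensus weight matrix.
    mixed = W @ models
    # Each worker then takes a local stochastic gradient step.
    new_models = np.empty_like(models)
    for i in range(models.shape[0]):
        new_models[i] = mixed[i] - lr * grad_fn(i, mixed[i])
    return new_models
```

In this view, sparsifying `W` (removing links while keeping it a valid consensus matrix) is what reduces per-iteration communication latency, which is the design goal the abstract attributes to CENT.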
Date of Conference: 13-16 June 2022
Date Added to IEEE Xplore: 22 July 2022
Electronic ISSN: 1861-2288
Conference Location: Catania, Italy

