
Communication-Adaptive Stochastic Gradient Methods for Distributed Learning



Abstract:

This paper develops algorithms for solving distributed learning problems in a communication-efficient fashion by generalizing the recent method of lazily aggregated gradient (LAG) to handle stochastic gradients, justifying the name of the new method, LASG. While LAG is effective at reducing communication without sacrificing the rate of convergence, we show that it only works with deterministic gradients. We introduce new rules and analysis for LASG that are tailored to stochastic gradients, so that it effectively saves downloads, uploads, or both for distributed stochastic gradient descent. LASG achieves impressive empirical performance, typically saving total communication by an order of magnitude, and it can be combined with gradient quantization for further savings.
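
The abstract describes LASG as a way of skipping gradient communication in distributed SGD, but the paper's actual trigger rules are not reproduced in this excerpt. The sketch below is only a minimal illustration of the general lazy-aggregation idea, assuming a simplified LAG-style condition: each worker re-uses its last uploaded stochastic gradient and skips the upload whenever the fresh gradient has changed little relative to recent model movement. The problem setup, threshold constant thresh_c, window length, and all helper names are illustrative assumptions, not the method from the paper.

# Illustrative sketch of a lazy (communication-skipping) rule for distributed SGD.
# This is NOT the LASG algorithm from the paper; the trigger below is a
# simplified, assumed LAG-style condition used only to convey the idea.
import numpy as np

rng = np.random.default_rng(0)

def stochastic_grad(w, data, batch_size=32):
    # Mini-batch gradient of a least-squares loss (1/2)||Xw - y||^2 / batch_size.
    X, y = data
    idx = rng.choice(len(y), size=batch_size, replace=False)
    Xb, yb = X[idx], y[idx]
    return Xb.T @ (Xb @ w - yb) / batch_size

# Synthetic least-squares problem split across M workers (assumed setup).
M, d, n_per = 4, 10, 200
w_true = rng.normal(size=d)
workers = []
for _ in range(M):
    X = rng.normal(size=(n_per, d))
    y = X @ w_true + 0.1 * rng.normal(size=n_per)
    workers.append((X, y))

w = np.zeros(d)
lr, T, thresh_c = 0.05, 200, 0.5
last_sent = [stochastic_grad(w, data) for data in workers]  # initial uploads
uploads = M
recent_dw = []  # squared norms of recent model updates

for t in range(T):
    for m, data in enumerate(workers):
        g = stochastic_grad(w, data)
        # Upload only if the gradient changed "enough" relative to how much the
        # model has been moving lately (an assumed adaptive trigger; thresh_c
        # and the 10-step window are arbitrary choices for this sketch).
        change = np.linalg.norm(g - last_sent[m]) ** 2
        window = recent_dw[-10:]
        budget = thresh_c * sum(window) / max(len(window), 1)
        if change >= budget:
            last_sent[m] = g
            uploads += 1
    # The server aggregates the most recently received gradients, fresh or stale.
    agg = sum(last_sent) / M
    w_new = w - lr * agg
    recent_dw.append(np.linalg.norm(w_new - w) ** 2)
    w = w_new

print(f"uploads used: {uploads} of {M * (T + 1)} possible")
print(f"final error:  {np.linalg.norm(w - w_true):.3f}")

Running this, the number of uploads is well below the worst case while the model still converges, which is the qualitative behavior the abstract claims for communication-adaptive methods; the actual savings and convergence guarantees of LASG come from the paper's specific rules and analysis.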
Published in: IEEE Transactions on Signal Processing (Volume: 69)
Pages: 4637-4651
Date of Publication: 27 July 2021

