
GNSD: a Gradient-Tracking Based Nonconvex Stochastic Algorithm for Decentralized Optimization



Abstract:

In the era of big data, it is challenging to train a machine learning model on a large-scale dataset using a single machine or a distributed system with a central controller. In this paper, we propose a gradient-tracking based nonconvex stochastic decentralized (GNSD) algorithm for solving nonconvex optimization problems in which the data is partitioned into multiple parts and processed by local computational resources. By exchanging parameters between neighboring nodes over a network, GNSD is able to find first-order stationary points (FOSPs) efficiently. Our theoretical analysis guarantees that, with an appropriately shrinking step-size, the convergence rate of GNSD to FOSPs matches the well-known O(1/√T) convergence rate of stochastic gradient descent. Finally, we perform extensive numerical experiments on computational clusters to demonstrate the advantage of GNSD over other state-of-the-art methods.
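To make the abstract's description concrete, the following is a minimal sketch of a generic gradient-tracking stochastic update of the kind the abstract describes: each node mixes parameters with its neighbors via a doubly-stochastic matrix W, takes a step along a tracked gradient estimate y_i, and updates the tracker with the difference of successive stochastic gradients. All names, the quadratic local losses, and the fully-connected mixing matrix are illustrative assumptions, not the authors' exact GNSD algorithm or experimental setup.

```python
import numpy as np

def gradient_tracking_step(X, Y, G_prev, grad_fn, W, step_size, rng):
    """One synchronized round of a gradient-tracking stochastic update.

    X      : (n_nodes, dim) local parameter copies
    Y      : (n_nodes, dim) gradient trackers
    G_prev : (n_nodes, dim) previous local stochastic gradients
    W      : (n_nodes, n_nodes) doubly-stochastic mixing matrix
    """
    # Consensus averaging plus a step along the tracked gradient direction.
    X_new = W @ X - step_size * Y
    # Fresh stochastic gradients at the new local parameters.
    G_new = np.stack([grad_fn(i, X_new[i], rng) for i in range(X.shape[0])])
    # Gradient-tracking recursion: mix trackers, add gradient innovation.
    Y_new = W @ Y + G_new - G_prev
    return X_new, Y_new, G_new

# Toy problem (assumed for illustration): node i holds f_i(x) = 0.5*||x - b_i||^2
# with additive gradient noise, so the global minimizer is the mean of the b_i.
rng = np.random.default_rng(0)
b = np.array([[1.0], [2.0], [3.0]])      # local targets; global optimum is 2.0
W = np.full((3, 3), 1.0 / 3.0)           # doubly-stochastic mixing (complete graph)

def grad_fn(i, x, rng):
    return x - b[i] + 0.01 * rng.standard_normal(x.shape)  # noisy local gradient

X = np.zeros((3, 1))
G = np.stack([grad_fn(i, X[i], rng) for i in range(3)])
Y = G.copy()                             # trackers initialized to local gradients
for t in range(500):
    X, Y, G = gradient_tracking_step(X, Y, G, grad_fn, W, step_size=0.1, rng=rng)
print(X.ravel())                         # all nodes close to the optimum 2.0
```

The key property of the tracker recursion is that the network average of Y always equals the average of the current stochastic gradients, so every node steps along an estimate of the *global* gradient rather than only its local one; this is what distinguishes gradient tracking from plain decentralized SGD.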
Date of Conference: 02-05 June 2019
Date Added to IEEE Xplore: 08 July 2019
Conference Location: Minneapolis, MN, USA
