Abstract:
In the era of big data, it is challenging to train a machine learning model over a large-scale dataset on a single machine or over a distributed system with a central controller. In this paper, we propose a gradient-tracking based nonconvex stochastic decentralized (GNSD) algorithm for solving nonconvex optimization problems, where the data are partitioned into multiple parts and processed by local computational resources. By exchanging parameters between nodes over a network, GNSD is able to find first-order stationary points (FOSPs) efficiently. The theoretical analysis guarantees that, with an appropriately shrinking step-size, the convergence rate of GNSD to FOSPs matches the well-known O(1/√T) convergence rate of stochastic gradient descent. Finally, we perform extensive numerical experiments on computational clusters to demonstrate the advantage of GNSD compared with other state-of-the-art methods.
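To make the gradient-tracking idea concrete, the following is a minimal Python/NumPy sketch of one synchronous round of a gradient-tracking decentralized stochastic update. It is illustrative only: the function names, array shapes, and the specific update form are assumptions based on standard gradient-tracking schemes, not the paper's exact GNSD recursion.

import numpy as np

def gt_sgd_step(X, Y, prev_grads, stoch_grad, W, step_size):
    """One synchronous round over all nodes (hypothetical sketch).

    X          : (n_nodes, dim) local model copies x_i
    Y          : (n_nodes, dim) gradient-tracking variables y_i
    prev_grads : (n_nodes, dim) stochastic gradients from the previous round
    stoch_grad : callable(i, x) -> stochastic gradient of node i's local loss at x
    W          : (n_nodes, n_nodes) doubly stochastic mixing matrix of the network
    step_size  : scalar step-size (shrunk over iterations for an O(1/sqrt(T)) rate)
    """
    # Consensus + descent: each node mixes its neighbors' parameters,
    # then moves along its tracked gradient direction y_i.
    X_new = W @ X - step_size * Y
    # Refresh local stochastic gradients at the new iterates.
    new_grads = np.stack([stoch_grad(i, X_new[i]) for i in range(X.shape[0])])
    # Gradient tracking: mix neighbors' trackers and add the local gradient
    # difference so that y_i tracks the network-average gradient over time.
    Y_new = W @ Y + new_grads - prev_grads
    return X_new, Y_new, new_grads

In such schemes, the mixing matrix W encodes the communication topology (nonzero entries only between neighboring nodes), so each round requires only local parameter exchanges rather than a central controller.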
Published in: 2019 IEEE Data Science Workshop (DSW)
Date of Conference: 02-05 June 2019
Date Added to IEEE Xplore: 08 July 2019