Abstract:
Distributed aggregative optimization (DAO) is a special class of optimization problems over networked agents in which the local objective function of each agent depends on the aggregation of the other agents' decisions as well as its own. The convergence rate is one of the most important performance indices for practical applications of DAO algorithms. However, achieving a fast convergence rate over a directed graph is challenging, because the underlying interaction graph may be unbalanced and each agent's local objective function depends on the decisions of the other agents. To efficiently solve the DAO problem over directed graphs, an accelerated distributed optimization algorithm with Nesterov momentum and a constant step-size is designed and analyzed. To handle the effect of the unbalancedness of the directed network, a consensus iteration law is embedded into the optimization algorithm to estimate the left Perron eigenvector of the weight matrix. Furthermore, a distributed aggregative gradient-tracking technique associated with the Nesterov momentum coefficient is developed and employed to construct the accelerated distributed aggregative algorithm. It is theoretically shown that the proposed algorithm achieves a linear convergence rate after a finite number of iterations when the global objective function is $\mu$-strongly convex and $L_{1}$-smooth. Finally, numerical experiments are provided to confirm the theoretical findings.
Published in: IEEE Transactions on Systems, Man, and Cybernetics: Systems (Volume: 54, Issue: 7, July 2024)
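
The ingredients named in the abstract (a consensus iteration that estimates the left Perron eigenvector of the weight matrix, aggregative and gradient tracking, and a Nesterov-momentum step with a constant step-size) can be illustrated with a small sketch. The NumPy sketch below is not the paper's algorithm: the quadratic toy objective, the ring-plus-chord weight matrix, the exact update laws, and the values of the step-size alpha and momentum coefficient beta are all illustrative assumptions.

```python
# Minimal sketch of a DAO-style iteration over an unbalanced directed graph.
# All modeling choices below are assumptions for illustration only.
import numpy as np

rng = np.random.default_rng(0)
n = 5

# Row-stochastic weights of a strongly connected directed graph (a ring plus one chord).
# The matrix is not column-stochastic, i.e., the graph is weight-unbalanced.
A = np.zeros((n, n))
for i in range(n):
    A[i, i] = A[i, (i + 1) % n] = 1.0
A[0, 2] = 1.0
A /= A.sum(axis=1, keepdims=True)

# Toy aggregative objective: f_i(x_i, s) = 0.5*(x_i - a_i)**2 + 0.5*(s - b_i)**2,
# with aggregate s = (1/n) * sum_j x_j (i.e., phi_j(x_j) = x_j).
a, b = rng.normal(size=n), rng.normal(size=n)

x = np.zeros(n)          # local decision of each agent
x_prev = x.copy()
V = np.eye(n)            # rows of A**k converge to the left Perron eigenvector of A
s = x.copy()             # tracker of the aggregate (1/n) * sum_j x_j
g = s - b                # tracker of the average of d f_j / d s
alpha, beta = 0.05, 0.3  # constant step-size and Nesterov momentum coefficient (assumed values)

for k in range(500):
    y = x + beta * (x - x_prev)        # Nesterov momentum: look-ahead point
    pi_hat = np.diag(V)                # each agent's current estimate of its Perron entry
    grad = (y - a) + g                 # local partial gradient plus tracked aggregate part
    # Gradient step rescaled by the Perron estimate to compensate for graph unbalancedness.
    x_new = A @ y - alpha * grad / np.maximum(n * pi_hat, 1e-6)
    s_new = A @ s + x_new - x          # aggregative tracking of the average decision
    g = A @ g + (s_new - b) - (s - b)  # gradient tracking of the average of d f_j / d s
    V = A @ V                          # consensus iteration estimating the left Perron eigenvector
    x_prev, x, s = x, x_new, s_new

print("decisions:", np.round(x, 3))
print("tracked aggregate:", np.round(s, 3))
```

In this sketch the rescaling factor n * pi_hat tends to 1 componentwise when the graph happens to be weight-balanced, so the correction is only active under unbalancedness; the admissible ranges of alpha and beta that guarantee the linear rate stated in the abstract are established in the paper itself and are not reproduced here.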