
Network Dissensus via Distributed ADMM

Abstract:

As machine learning implementations move towards the network edge, distributed optimization algorithms are becoming increasingly relevant. Common approaches build upon the consensus framework to enforce cooperation and consistency among the nodes. In many applications, however, the inter-node relationships are better modeled via antagonistic or dissensual constraints. These relationships can generally be incorporated via non-convex constraints or penalty functions, and the resulting formulations are flexible enough to subsume a wide variety of classification and discrimination problems. This work develops a general-purpose ADMM algorithm for distributed optimization with dissensus constraints. The formulation is generalized to incorporate both consensus and dissensus relationships. The non-convex constraints are handled via appropriate first-order approximations, and convergence to a stationary point is established. The proposed algorithm is tested on various synthetic as well as real-world datasets. In particular, we focus on the discriminative dictionary learning problem, where the goal is to learn class-specific dictionaries usable for both reconstruction and discrimination tasks. Extensive tests over seizure detection, activity recognition, and indoor localization datasets demonstrate the efficacy of the proposed approach.
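
To give a concrete flavor of the kind of update the abstract describes, the sketch below is a toy construction of our own, not the algorithm from the paper: two agents fit scalar least-squares estimates that are coupled by a non-convex dissensus constraint (x_1 - x_2)^2 >= d^2, and at each ADMM iteration the constraint is replaced by a first-order (linearized) convex surrogate, mirroring the first-order approximation strategy mentioned above. All data values and penalty weights (a, d, rho, mu) are assumed purely for illustration.

    # Hypothetical sketch: ADMM with a linearized dissensus penalty (not the paper's algorithm).
    import numpy as np

    a = np.array([1.0, 1.2])   # local data (assumed): unconstrained minimizers sit close together
    d = 2.0                    # required separation |x_1 - x_2| >= d (dissensus)
    rho, mu = 1.0, 5.0         # ADMM penalty and surrogate-penalty weights (assumed)

    x = a.copy()               # local primal variables
    z = np.array([0.0, 0.1])   # auxiliary copies (start slightly apart so the linearization is defined)
    u = np.zeros(2)            # scaled dual variables

    for k in range(200):
        # x-update: each agent locally solves min (x_i - a_i)^2 + (rho/2)(x_i - z_i + u_i)^2
        x = (2.0 * a + rho * (z - u)) / (2.0 + rho)

        # Linearize the convex term (z_1 - z_2)^2 around the current z, turning the
        # reverse-convex dissensus constraint into the affine condition w^T z >= c.
        g = z[0] - z[1]
        w = 2.0 * g * np.array([1.0, -1.0])
        c = d**2 + g**2

        # z-update: proximal step of the convex hinge mu*max(0, c - w^T z) at v = x + u
        v = x + u
        slack = c - w @ v
        if slack > 0 and w @ w > 0:
            step = min(mu / rho, slack / (w @ w))
            z = v + step * w
        else:
            z = v.copy()

        # dual update
        u = u + x - z

    print("x =", x, " separation =", abs(x[0] - x[1]))

Because the linearized surrogate is a hinge of an affine function, its proximal step has the closed form used above (project onto the active half-space or take the full dual-capped step), so each iteration stays as cheap as a standard consensus-ADMM round; the actual paper addresses the general networked setting with mixed consensus and dissensus relationships and proves convergence to a stationary point.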
Published in: IEEE Transactions on Signal Processing (Volume 68)
Page(s): 2287 - 2301
Date of Publication: 03 April 2020