Block Successive Convex Approximation for Concomitant Linear DAG Estimation


Abstract:

We develop a novel continuous optimization algorithm to recover latent directed acyclic graphs (DAGs) from observational (and possibly heteroscedastic) data adhering to a linear structural equation model (SEM). Our starting point is the recently proposed Concomitant Linear DAG Estimation (CoLiDE) framework, which advocates minimizing a sparsity-regularized convex score function augmented with a smooth, nonconvex acyclicity penalty. While prior work focused on score function design to jointly estimate DAG structure along with exogenous noise levels, optimization aspects were left unexplored. To bridge this gap, here we show that CoLiDE has a favorable structure amenable to optimization via a block successive convex approximation (BSCA) algorithm. We derive efficient, closed-form updates to refine the DAG adjacency matrix and noise variance estimates in a cyclic fashion. Although the acyclicity regularizer is devoid of a Lipschitz gradient and hence our approximation function is not a global upper bound of the original cost, a descent direction can be obtained via line search to yield a provably convergent sequence. Numerical tests showcase the superiority of the proposed BSCA iterations relative to the original (Adam-based) inexact block coordinate descent solver.
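
As a rough illustration of the iteration described in the abstract, the Python sketch below cycles through a surrogate-based update of the DAG adjacency matrix W followed by a closed-form update of the concomitant noise scale, with a backtracking line search to enforce descent of the true objective (since the convex approximation is not a global upper bound). The equal-variance score, the trace-exponential acyclicity penalty, the proximal-gradient surrogate, and all parameter values and helper names are illustrative assumptions for exposition, not the paper's exact closed-form updates.

```python
# Illustrative BSCA-style sketch (assumed score and surrogate choices,
# not the authors' exact closed-form updates).
import numpy as np
from scipy.linalg import expm

def acyclicity(W):
    """One common smooth nonconvex penalty: h(W) = tr(exp(W∘W)) - d, with its gradient."""
    E = expm(W * W)
    return np.trace(E) - W.shape[0], E.T * 2.0 * W

def objective(X, W, sigma, lam, rho):
    """Assumed equal-variance concomitant score + l1 sparsity + weighted acyclicity penalty."""
    n, d = X.shape
    resid = X - X @ W
    h, _ = acyclicity(W)
    return (np.sum(resid**2) / (2 * n * sigma) + d * sigma / 2
            + lam * np.abs(W).sum() + rho * h)

def soft_threshold(A, tau):
    return np.sign(A) * np.maximum(np.abs(A) - tau, 0.0)

def bsca_step(X, W, sigma, lam=0.1, rho=1.0, step=1e-2, beta=0.5, max_ls=20):
    """One cycle: surrogate-based direction for W plus line search, then closed-form sigma."""
    n, d = X.shape
    resid = X - X @ W
    _, grad_h = acyclicity(W)
    # Gradient of the smooth terms w.r.t. W (acyclicity penalty linearized at the current W).
    grad_smooth = -(X.T @ resid) / (n * sigma) + rho * grad_h
    # Minimizer of a simple convex approximation: a proximal (soft-thresholded) gradient point.
    W_hat = soft_threshold(W - step * grad_smooth, step * lam)
    np.fill_diagonal(W_hat, 0.0)          # no self-loops
    direction = W_hat - W
    # The surrogate is not a global upper bound, so backtrack along the
    # direction until the true objective actually decreases.
    f_old = objective(X, W, sigma, lam, rho)
    t, W_new = 1.0, W
    for _ in range(max_ls):
        W_try = W + t * direction
        if objective(X, W_try, sigma, lam, rho) < f_old:
            W_new = W_try
            break
        t *= beta
    # Closed-form concomitant noise-scale update under the equal-variance score.
    sigma_new = np.sqrt(np.sum((X - X @ W_new) ** 2) / (n * d))
    return W_new, max(sigma_new, 1e-8)
```

In a usage loop one would simply alternate calls to bsca_step until the objective stabilizes; the line search is the ingredient that restores guaranteed descent despite the missing Lipschitz gradient of the acyclicity regularizer.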
Date of Conference: 08-11 July 2024
Date Added to IEEE Xplore: 26 August 2024
Conference Location: Corvallis, OR, USA
