Analytic DAG Constraints for Differentiable DAG Learning
Jun 1, 2025
Zhen Zhang
Ignavier Ng
Dong Gong
Yuhang Liu
Mingming Gong
Biwei Huang
Kun Zhang
Anton Van Den Hengel
Javen Qinfeng Shi
Abstract
Recovering the underlying Directed Acyclic Graph (DAG) structures from observational data presents a formidable challenge, partly due to the combinatorial nature of the DAG-constrained optimization problem. Recently, researchers have identified gradient vanishing as one of the primary obstacles in differentiable DAG learning and have proposed several DAG constraints to mitigate this issue. By developing the necessary theory to establish a connection between analytic functions and DAG constraints, we demonstrate that analytic functions from the set $\{f(x) = c_0 + \sum_{i=1}^{\infty} c_i x^i \mid \forall i > 0,\ c_i > 0;\ r = \lim_{i \rightarrow \infty} c_i / c_{i+1} > 0\}$ can be employed to formulate effective DAG constraints. Furthermore, we establish that this set of functions is closed under several functional operators, including differentiation, summation, and multiplication. Consequently, these operators can be leveraged to create novel DAG constraints based on existing ones. Using these properties, we design a series of DAG constraints and develop an efficient algorithm to evaluate them. Experiments in various settings demonstrate that our DAG constraints outperform previous state-of-the-art comparators. Our implementation is available at https://github.com/zzhang1987/AnalyticDAGLearning.
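As a concrete illustration of an analytic-function DAG constraint, consider $f(x) = e^x$, whose series coefficients $c_i = 1/i!$ are all positive, so it belongs to the family described above. It yields the well-known trace-of-matrix-exponential acyclicity characterization (as used in NOTEARS): $h(W) = \mathrm{tr}\,f(W \circ W) - d\,c_0 = 0$ if and only if $W$ is the weighted adjacency matrix of a DAG. The sketch below is only this illustrative instance, not the specific constraints proposed in the paper:

```python
import numpy as np
from scipy.linalg import expm

def dag_constraint_exp(W: np.ndarray) -> float:
    """Acyclicity measure h(W) = tr(exp(W ∘ W)) - d.

    Here f(x) = exp(x) plays the role of the analytic function
    (c_0 = 1, c_i = 1/i! > 0), W ∘ W is the elementwise square,
    and d is the number of nodes. h(W) = 0 iff W encodes a DAG;
    h(W) > 0 whenever the graph contains a directed cycle.
    """
    d = W.shape[0]
    return float(np.trace(expm(W * W)) - d)

# Acyclic 2-node graph (edge 0 -> 1): constraint is (numerically) zero.
W_dag = np.array([[0.0, 1.0], [0.0, 0.0]])
print(dag_constraint_exp(W_dag))  # ~0.0

# Cyclic 2-node graph (0 -> 1 -> 0): constraint is strictly positive.
W_cyc = np.array([[0.0, 1.0], [1.0, 0.0]])
print(dag_constraint_exp(W_cyc))  # > 0
```

Because $h$ is differentiable in $W$, it can be used as a penalty or equality constraint inside gradient-based structure learning; the paper's contribution is characterizing which analytic $f$ give such constraints good gradient behavior.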
Publication
The Thirteenth International Conference on Learning Representations