arxiv:2203.08509

Differentiable DAG Sampling

Published on Mar 16, 2022

Abstract

We propose a new differentiable probabilistic model over DAGs (DP-DAG). DP-DAG allows fast and differentiable DAG sampling suited to continuous optimization. To this end, DP-DAG samples a DAG by successively (1) sampling a linear ordering of the nodes and (2) sampling edges consistent with the sampled linear ordering. We further propose VI-DP-DAG, a new method for DAG learning from observational data which combines DP-DAG with variational inference. Hence, VI-DP-DAG approximates the posterior probability over DAG edges given the observed data. VI-DP-DAG is guaranteed to output a valid DAG at any time during training and does not require any complex augmented Lagrangian optimization scheme, in contrast to existing differentiable DAG learning approaches. In our extensive experiments, we compare VI-DP-DAG to other differentiable DAG learning baselines on synthetic and real datasets. VI-DP-DAG significantly improves DAG structure and causal mechanism learning while training faster than competitors.
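The two-step sampling scheme in the abstract can be illustrated with a minimal, non-differentiable sketch: draw a linear ordering of the nodes, then sample edges only from earlier to later nodes in that ordering, which guarantees acyclicity by construction. (The function names and the fixed edge probability below are illustrative assumptions; the paper's actual method uses differentiable relaxations of these sampling steps.)

```python
import numpy as np

def sample_dag(num_nodes, edge_prob=0.5, rng=None):
    """Illustrative (non-differentiable) sketch of the two-step scheme:
    (1) sample a linear ordering of the nodes,
    (2) sample edges consistent with that ordering.
    Returns an adjacency matrix A with A[i, j] = 1 meaning edge i -> j."""
    rng = np.random.default_rng(rng)
    order = rng.permutation(num_nodes)       # step (1): linear ordering
    rank = np.empty(num_nodes, dtype=int)
    rank[order] = np.arange(num_nodes)       # rank[i] = position of node i
    adj = np.zeros((num_nodes, num_nodes), dtype=int)
    for i in range(num_nodes):
        for j in range(num_nodes):
            # step (2): only allow edges from earlier to later nodes,
            # so every sampled graph is acyclic by construction
            if rank[i] < rank[j] and rng.random() < edge_prob:
                adj[i, j] = 1
    return adj

A = sample_dag(5, edge_prob=0.6, rng=0)
```

Because edges always point forward in the sampled ordering, the adjacency matrix is nilpotent, i.e. the graph contains no cycle, which is why this construction never needs an acyclicity penalty or augmented Lagrangian scheme.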
