Abstract:
|
We introduce a statistical model, called the additively faithful directed acyclic graph (AFDAG), for causal learning from observational data. Our approach is based on additive conditional independence (ACI), a recently proposed three-way statistical relation that shares many similarities with conditional independence but does not resort to multivariate kernels. This distinctive feature strikes a balance between a parametric model and a fully nonparametric model, making the proposed model attractive for large networks. For graph inference, we develop an estimator for AFDAG based on a linear operator that characterizes ACI, and we establish the consistency and convergence rates of this estimator. Moreover, we prove the uniform consistency of the estimated DAG under a stronger additive faithfulness condition, which appears to be less restrictive than its linear counterpart. We introduce a modified PC-algorithm to implement the estimation procedure efficiently, so that its complexity is determined by the level of sparseness rather than by the dimension of the network. The usefulness of the AFDAG formulation is demonstrated through synthetic datasets and an application to a proteomics dataset.
|