Abstract:
|
We generalize the notion of distance covariance [Székely et al. (2007)] to quantify joint dependence among d ≥ 2 random vectors of arbitrary dimensions. We first introduce the concept of higher-order distance covariance to measure the so-called Lancaster interactions. We then define the joint distance covariance (JdCov) as a linear combination of distance covariance and its higher-order counterparts, which together quantify joint dependence: in the population case, JdCov is zero if and only if the d random vectors are mutually independent. We study the asymptotic properties of empirical estimators constructed from Euclidean distances between sample elements. We propose a bootstrap procedure for conducting a nonparametric hypothesis test of joint independence and establish the consistency of the bootstrap test. The new metric is employed to check the goodness of fit of directed acyclic graphs in causal discovery, and the effectiveness of our method is illustrated on both simulated and real datasets.
|