Abstract:
|
Measuring conditional independence is an important task in statistical inference and is fundamental in many fields, such as graphical models. In this work, we explore the connection between conditional distance statistics in Euclidean space and statistics based on reproducing kernel Hilbert spaces (RKHS) used in the machine learning community. Using the distance-induced kernel proposed by Sejdinovic et al. (2013), we obtain a kernel statistic corresponding to the conditional distance correlation proposed by Wang et al. (2013), which is known to characterize conditional independence. However, it is worth noting that this kernel statistic does not coincide with the statistic proposed by Fukumizu et al. (2004), which is widely used in the RKHS community. We show that this kernel statistic is a weighted average of an inner product of the difference between the joint characteristic function and the product of the marginal characteristic functions. Therefore, conditional distance correlation may be viewed as a member of a much larger class of conditional independence measures, and it may be possible to design more powerful tests.
|