Abstract Details

Activity Number: 303 - Statistical Association and High-Dimensional Data
Type: Contributed
Date/Time: Tuesday, July 30, 2019, 8:30 AM to 10:20 AM
Sponsor: Section on Nonparametric Statistics
Abstract #: 307229
Title: Estimating Conditional Mutual Information for Discrete and Continuous Random Variables
Author(s): Octavio Mesner* and Cosma Shalizi and Larry Wasserman
Companies: Carnegie Mellon University
Keywords: Conditional mutual information; discrete/continuous; information theory
Abstract:

Mutual information is an attractive statistic for many applications because it completely captures the dependence between two random variables or vectors. Conditional mutual information (CMI) is particularly useful in settings, e.g., causal discovery, where it is necessary to quantify dependence between a pair of variables that may be mediated by other variables. CMI is rarely used in fields such as epidemiology, public policy, and the social sciences because existing estimators cannot handle mixtures of continuous and discrete random variables. While progress has been made on estimating mutual information for mixed discrete and continuous variables, no analogous CMI estimator currently exists. This paper builds on prior research to develop a novel method for nonparametric CMI estimation for discrete and/or continuous variables. For each point, the method locally estimates CMI using that point's nearest neighbors and then averages the local estimates. If a point's nearest neighbor occupies the same location, the method recognizes that the point is likely discrete and alters the counting process accordingly. We prove that this estimator is consistent and demonstrate its performance empirically.
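The local nearest-neighbor averaging described in the abstract can be illustrated with a short sketch. The code below is not the authors' estimator; it is a minimal, Frenzel-Pompe/KSG-style k-nearest-neighbor estimate of I(X; Y | Z) under the max norm, with the one adjustment the abstract mentions: when a point's k-th nearest neighbor sits at distance zero, the point is treated as likely discrete and its neighbor count is enlarged to all coinciding samples. The function name cmi_knn, the digamma-based local terms, and the default k = 5 are illustrative assumptions, not details from the paper.

import numpy as np
from scipy.special import digamma
from scipy.spatial import cKDTree


def _as_2d(a):
    a = np.asarray(a, dtype=float)
    return a.reshape(-1, 1) if a.ndim == 1 else a


def cmi_knn(x, y, z, k=5):
    """Sketch of a k-NN estimate of I(X; Y | Z); x, y, z are (n,) or (n, d) arrays."""
    x, y, z = map(_as_2d, (x, y, z))
    n = x.shape[0]
    xyz = np.hstack((x, y, z))
    xz = np.hstack((x, z))
    yz = np.hstack((y, z))

    tree_xyz = cKDTree(xyz)
    tree_xz, tree_yz, tree_z = cKDTree(xz), cKDTree(yz), cKDTree(z)

    # Distance to the k-th nearest neighbor in the joint space under the
    # Chebyshev (max) norm; the query point itself appears at distance zero,
    # so we ask for k + 1 neighbors and keep the last distance.
    rho, _ = tree_xyz.query(xyz, k=k + 1, p=np.inf)
    rho = rho[:, -1]

    local = np.empty(n)
    for i in range(n):
        if rho[i] == 0.0:
            # Tied joint values: the point is likely discrete.  Use all
            # samples coinciding with it (excluding itself) as the
            # effective neighbor count k_i, and count within radius zero.
            k_i = len(tree_xyz.query_ball_point(xyz[i], r=0.0, p=np.inf)) - 1
            r = 0.0
        else:
            k_i = k
            r = rho[i]
        # Neighbor counts within radius r in the marginal subspaces
        # (excluding the point itself in each count).
        n_xz = len(tree_xz.query_ball_point(xz[i], r=r, p=np.inf)) - 1
        n_yz = len(tree_yz.query_ball_point(yz[i], r=r, p=np.inf)) - 1
        n_z = len(tree_z.query_ball_point(z[i], r=r, p=np.inf)) - 1
        # One common digamma combination for a local CMI term.
        local[i] = digamma(k_i) - digamma(n_xz) - digamma(n_yz) + digamma(n_z)

    # CMI is non-negative, so clip the averaged estimate at zero.
    return max(np.mean(local), 0.0)

Using the same max-norm radius for the joint space and the marginal subspaces keeps the counts comparable, which is the standard choice in KSG-style estimators; the zero-radius branch is what lets identical (discrete) observations contribute finite, sensible counts instead of degenerate ones.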


Authors who are presenting talks have a * after their name.