
Abstract Details

Activity Number: 545 - Towards Perfect and Scalable Distributional Computation
Type: Invited
Date/Time: Wednesday, July 31, 2019: 2:00 PM to 3:50 PM
Sponsor: IMS
Abstract #: 300488
Title: The Never-Ending MCMC Revolution: Making Dempster-Shafer Modeling Practical
Author(s): Ruobin Gong* and Xiao-Li Meng
Companies: Rutgers University and Harvard University
Keywords: Dempster-Shafer theory; belief function; low-resolution inference; MCMC

We discuss a re-introduction of MCMC methods to statistical inference with Dempster-Shafer (DS) models. DS models carry out posterior inference with a belief function, a generalization of the probability function that is dual in nature: it is at once a set of probabilities and a probability over subsets of the state space. DS models supply an expressive vocabulary for low-resolution information, including coarse data, missing priors, and deficient model structure, but they also pose demanding computational hurdles. In its 60 years of existence, DS theory has not seen application to large or even moderate-sized statistical problems. The advancements in MCMC in the 1990s propelled Bayesian inference into a world of sophisticated machinery; for DS, that was a missed opportunity.
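The duality mentioned above can be made concrete with a small sketch. Below, a mass function assigns probability to subsets of a toy three-state frame (the states and weights are illustrative, not from the talk); belief and plausibility of an event then act as lower and upper probabilities, bracketing every ordinary probability consistent with the masses.

```python
# Hypothetical toy frame of three states; names and masses are illustrative.
frame = frozenset({"a", "b", "c"})

# A mass function puts probability on *subsets* of the frame (focal sets).
# Mass on a non-singleton set encodes low-resolution (coarse) information.
mass = {
    frozenset({"a"}): 0.5,       # precise evidence for "a"
    frozenset({"b", "c"}): 0.3,  # coarse evidence: "b or c", unresolved
    frame: 0.2,                  # vacuous mass: complete ignorance
}

def belief(event):
    """Lower probability: total mass of focal sets wholly inside the event."""
    return sum(m for s, m in mass.items() if s <= event)

def plausibility(event):
    """Upper probability: total mass of focal sets intersecting the event."""
    return sum(m for s, m in mass.items() if s & event)

event = frozenset({"a", "b"})
print(belief(event), plausibility(event))  # → 0.5 1.0
```

The gap between belief (0.5) and plausibility (1.0) is exactly the low-resolution content of the evidence: the coarse and vacuous masses commit to neither `event` nor its complement.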

Exploiting the constructive view of the belief function as a multi-valued map, we supply a general characterization of the DS model structure, which is distinctively low-resolution and amenable to classic sampling techniques such as Gibbs sampling and sequential Monte Carlo (SMC). We showcase this construction with a Poisson loglinear model. Algorithmic thinking offers a fresh perspective on the charm and the challenges of statistical inference under low-resolution assumptions.
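To illustrate the multi-valued-map view in the Poisson setting, one classic DS construction (a standard textbook device, not necessarily the exact algorithm of the talk) treats an observed count x = k as the number of arrivals of a unit-rate Poisson process on [0, λ]; the focal element for λ is then the random interval between the k-th and (k+1)-st arrival times. Monte Carlo over these random intervals yields belief and plausibility for assertions about the rate. A minimal sketch under that assumption:

```python
import random

def ds_poisson_intervals(k, n_draws=10_000, seed=0):
    """Draw DS focal intervals [T_k, T_{k+1}) for a Poisson rate given count k.
    T_i is the i-th arrival time of a unit-rate Poisson process
    (a sum of i independent Exp(1) inter-arrival gaps)."""
    rng = random.Random(seed)
    intervals = []
    for _ in range(n_draws):
        gaps = [rng.expovariate(1.0) for _ in range(k + 1)]
        t_k = sum(gaps[:k])       # k-th arrival time (0 when k == 0)
        t_k1 = t_k + gaps[k]      # (k+1)-st arrival time
        intervals.append((t_k, t_k1))
    return intervals

def belief_plausibility(intervals, lo, hi):
    """Lower/upper probability that the rate lies in (lo, hi):
    belief counts intervals wholly inside; plausibility counts any overlap."""
    n = len(intervals)
    bel = sum(1 for a, b in intervals if lo <= a and b <= hi) / n
    pl = sum(1 for a, b in intervals if a < hi and b > lo) / n
    return bel, pl

# Observed count k = 3; query the assertion "the rate lies in (1, 6)".
ivals = ds_poisson_intervals(3)
bel, pl = belief_plausibility(ivals, 1.0, 6.0)
# bel <= pl always; the gap is the low-resolution (interval) evidence.
```

Sampling over focal sets rather than points is what makes DS computation awkward for generic MCMC machinery, and it is this random-set structure that the Gibbs and SMC schemes above must respect.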

Authors who are presenting talks have a * after their name.
