Online Program

Thursday, May 30
Computational Statistics
Recent Developments in Lower Rank Learning for Complex Data
Thu, May 30, 10:30 AM - 12:05 PM
Grand Ballroom K

MCMC for Dempster-Shafer Statistical Inference (305044)

*Ruobin Gong, Rutgers University 

Keywords: MCMC, Dempster-Shafer, belief function, Bayesian inference

We discuss a re-introduction of MCMC methods to statistical inference with Dempster-Shafer (DS) models. DS models embody posterior inference with a belief function, a generalization of the probability function that is dual in nature: it is at once a set of probabilities and a probability distribution over subsets of the state space. DS models supply an expressive vocabulary for low-resolution information, including coarse data, missing priors, and deficient model structure. They also pose demanding computational hurdles. In its 60 years of existence, DS theory has not met large- or even moderate-sized statistical applications. Advancements in MCMC in the 1990s propelled Bayesianism into a world of sophisticated machinery, but for DS, that was a missed opportunity.

Utilizing the constructive representation of the belief function as a multi-valued map, we supply a general characterization of the DS model structure, which is distinctively low-resolution and amenable to classic sampling techniques such as Gibbs and SMC. We showcase this construction with a Poisson loglinear model. Algorithmic thinking offers a fresh perspective on the charm and the challenges of statistical inference under low-resolution assumptions.
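As a concrete illustration of the multi-valued-map construction (a minimal sketch, not the Poisson loglinear showcase of the talk), consider Dempster's classic Bernoulli model: each observation is x_i = 1{u_i <= p} for an auxiliary u_i ~ Uniform(0,1). Given k successes in n trials, the posterior focal sets are the random intervals between consecutive order statistics of n iid uniforms, and belief/plausibility of an assertion about p can be approximated by plain Monte Carlo over these random sets. The function and variable names below are illustrative, not from the talk.

```python
import numpy as np

def ds_bernoulli_random_sets(k, n, draws=10000, rng=None):
    """Monte Carlo draws of Dempster's random interval for a Bernoulli
    success probability p, given k successes in n trials. In the
    multi-valued-map construction, each posterior focal set is the
    interval [U_(k), U_(k+1)] between consecutive order statistics of
    n iid Uniform(0,1) auxiliary variables (with U_(0)=0, U_(n+1)=1).
    """
    rng = np.random.default_rng(rng)
    u = np.sort(rng.uniform(size=(draws, n)), axis=1)
    # Pad with 0 and 1 so the edge cases k=0 and k=n are handled uniformly.
    padded = np.hstack([np.zeros((draws, 1)), u, np.ones((draws, 1))])
    lo = padded[:, k]       # k-th order statistic (0 when k == 0)
    hi = padded[:, k + 1]   # (k+1)-th order statistic (1 when k == n)
    return lo, hi

def belief_plausibility_leq(t, lo, hi):
    """Belief and plausibility of the assertion {p <= t}:
    belief       = fraction of focal intervals contained in [0, t],
    plausibility = fraction of focal intervals intersecting [0, t]."""
    bel = np.mean(hi <= t)
    pl = np.mean(lo <= t)
    return bel, pl

lo, hi = ds_bernoulli_random_sets(k=7, n=10, draws=20000, rng=1)
bel, pl = belief_plausibility_leq(0.8, lo, hi)
print(f"bel(p<=0.8) = {bel:.3f}, pl(p<=0.8) = {pl:.3f}")
```

The gap between belief and plausibility is exactly the low-resolution feature the abstract refers to: the model commits probability mass to sets of p values, not to individual points, so some mass supports neither the assertion nor its negation.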