
Abstract Details

Activity Number: 613 - Robust Learning and Posterior Summary
Type: Contributed
Date/Time: Thursday, August 1, 2019 : 8:30 AM to 10:20 AM
Sponsor: Section on Bayesian Statistical Science
Abstract #305236
Title: Interpretable Posterior Summaries Using the Wasserstein Distance
Author(s): Eric Arthur Dunipace* and Lorenzo Trippa
Companies: Harvard T.H. Chan School of Public Health and Dana-Farber Cancer Institute
Keywords: interpretable; Wasserstein; Bayesian estimation; cancer; environmental health; variable selection

In the current computing age, models can often have hundreds or even thousands of parameters. With these large models comes the risk of losing the ability to communicate and understand the precise meaning of the individual parameters in a model. In a frequentist setting, one can use an L1 penalty to reduce the number of parameters in a model, but similar methods have not been developed for Bayesian settings, where the quantity of interest involves an integral over the posterior distribution. We introduce a new method using a penalized 2-Wasserstein distance to reduce the dimensionality of the parameter space while still obtaining a distribution over the remaining dimensions. Our method allows users to set a budget on how many parameters they wish to understand, interpret, and communicate to an audience, and, in a data-dependent way, selects a reduced posterior of the chosen dimension that minimizes the distance to the full posterior. We provide simulation results comparing performance to other posterior summary methods and apply the method to cancer and environmental health data.
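The idea of choosing a reduced posterior under a parameter budget can be loosely sketched as a subset-selection problem. The snippet below is a hypothetical illustration, not the authors' algorithm: it uses greedy backward elimination over posterior samples of a linear model's coefficients, and it scores each candidate subset with a coupled-samples upper bound on the squared 2-Wasserstein distance between the full and reduced predictive draws (pairing each reduced draw with its own full draw). All names (`greedy_w2_summary`, `theta_samples`, `budget`) are assumptions introduced for this sketch.

```python
import numpy as np

def greedy_w2_summary(X, theta_samples, budget):
    """Greedily keep `budget` coefficients whose reduced posterior
    (remaining coefficients set to zero) stays close to the full
    posterior in a coupled upper bound on the 2-Wasserstein distance
    between predictive draws.

    X             : (n, p) design matrix
    theta_samples : (S, p) posterior draws of the coefficients
    budget        : number of coefficients to retain
    """
    p = theta_samples.shape[1]
    active = set(range(p))
    full_pred = theta_samples @ X.T  # (S, n) predictive draws under the full posterior
    # Backward elimination: repeatedly drop the coordinate whose
    # zeroing increases the coupled W2^2 bound the least.
    while len(active) > budget:
        best_j, best_cost = None, np.inf
        for j in active:
            keep = sorted(active - {j})
            reduced = np.zeros_like(theta_samples)
            reduced[:, keep] = theta_samples[:, keep]
            # Mean squared distance between paired predictive draws:
            # an upper bound on W2^2 for the empirical distributions.
            cost = np.mean(np.sum((full_pred - reduced @ X.T) ** 2, axis=1))
            if cost < best_cost:
                best_j, best_cost = j, cost
        active.remove(best_j)
    return sorted(active)
```

On simulated data where only a few coefficients are far from zero, this greedy search tends to retain exactly those coordinates, which mirrors the variable-selection behavior the abstract describes.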

Authors who are presenting talks have a * after their name.

Back to the full JSM 2019 program