Abstract:
|
Many practical Bayesian inference problems fall into the "likelihood-free" setting, where evaluations of the likelihood function or prior density are unavailable or intractable; instead, one can only simulate the associated distributions. I will discuss how transportation of measure can help solve such problems, by first "learning" a joint parameter-data prior from data and then constructing maps that push samples from this prior forward to the desired conditional distribution. These methods have broad utility for inference in stochastic models, and even for data assimilation problems in geophysical applications. More generally, they enable uncertainty quantification for a variety of supervised learning tasks. Key issues in this construction center on (1) the estimation of transport maps from few samples, and (2) parameterizations of monotone maps. I will discuss developments on both fronts, including some recent efforts in joint dimension reduction for conditional sampling in high dimensions.
This is joint work with Ricardo Baptista, Sven Wang, and Olivier Zahm.
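As a rough illustration of the pipeline described above, the sketch below uses the simplest monotone parameterization, an affine lower-triangular map estimated from joint (parameter, data) samples; in that special case the conditional-sampling step collapses to a linear regression update. The toy simulator, variable names, and the affine restriction are assumptions made here for illustration only, not the general monotone constructions discussed in the talk.

```python
# Minimal sketch of transport-based conditional sampling, assuming a toy
# linear-Gaussian simulator and an *affine* lower-triangular map fit from
# joint (parameter, data) samples.  All names here are illustrative.
import numpy as np

rng = np.random.default_rng(0)

def simulate_joint(n):
    # "Likelihood-free" access: we can only draw (theta, y) pairs jointly.
    theta = rng.normal(size=(n, 2))                       # prior samples
    y = theta @ np.array([[1.0], [0.5]]) + 0.1 * rng.normal(size=(n, 1))
    return theta, y

# 1) Learn the joint parameter-data prior from simulations.
theta, y = simulate_joint(5000)

# 2) Estimate an affine lower-triangular (Knothe-Rosenblatt-style) map in the
#    (y, theta) ordering.  Composing its theta-block with that block's inverse
#    evaluated at the observed data y_obs yields a conditional sampler; for
#    affine maps this composition reduces to the regression update below.
C = np.cov(np.hstack([theta, y]).T)                       # joint covariance
C_ty, C_yy = C[:2, 2:], C[2:, 2:]
K = C_ty @ np.linalg.inv(C_yy)                            # "gain" of the map

# 3) Push prior samples forward to (approximate) posterior samples given y_obs.
y_obs = np.array([0.7])
theta_post = theta + (y_obs - y) @ K.T

print("approx. posterior mean:", theta_post.mean(axis=0))
```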
|