
Abstract Details

Activity Number: 256 - Contributed Poster Presentations: Section on Statistical Learning and Data Science
Type: Contributed
Date/Time: Monday, July 29, 2019, 2:00 PM to 3:50 PM
Sponsor: Section on Statistical Learning and Data Science
Abstract #306924
Title: Using Push-Forward and Pullback Measures for Parameter Identification and Distribution Estimation
Author(s): Tian Yu Yen* and Michael Pilosov
Companies: University of Colorado at Denver and University of Colorado at Denver
Keywords: parameter estimation; penalized regression; inverse problems; data consistent inversion; random effects; posterior distribution

Data Consistent Inversion is a new method for estimating the probability distribution of input parameters to physical models such that the resulting model predictions are consistent with observed data. While similar to Bayesian inference methods, Data Consistent Inversion is unique in the way it uses discrepancies between model predictions (push-forward measures) and model observations to produce an update (using pullback measures) of initial parameter descriptions. Benefits of this method include its robustness to different assumptions about the distribution of model errors, its flexibility in modeling different sources of variation (sampling or other measurement errors), and its ability to produce predictive samples consistent with observed data even when physical models are nonlinear in the input parameters. We discuss the general framework of the method and its connections to penalized regression as well as nonparametric Bayesian methods.
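The push-forward/pullback update described above can be sketched with a simple rejection-sampling scheme: weight each initial parameter sample by the ratio of the observed density to the push-forward density evaluated at that sample's prediction. This is a minimal illustrative sketch, not the authors' implementation; the quadratic model `Q`, the uniform initial distribution, and the normal observed density are all assumptions chosen for the example.

```python
# Minimal sketch of a data-consistent update via rejection sampling:
# pi_update(lam) ∝ pi_init(lam) * pi_obs(Q(lam)) / pi_pred(Q(lam)),
# where pi_pred is the push-forward of the initial density through Q.
# The model, initial, and observed densities below are illustrative choices.
import numpy as np
from scipy.stats import gaussian_kde, norm

rng = np.random.default_rng(0)

def Q(lam):
    # Hypothetical quantity-of-interest map, nonlinear in the parameter.
    return lam ** 2

# Samples from an initial (prior-like) description of the parameter.
initial = rng.uniform(-1.0, 1.0, size=50_000)

# Push-forward of the initial density through the model, estimated by KDE.
pushforward = gaussian_kde(Q(initial))

# Assumed observed density on the model output.
observed = norm(loc=0.25, scale=0.05)

# Ratio pi_obs / pi_pred at each sample's prediction.
r = observed.pdf(Q(initial)) / pushforward.pdf(Q(initial))

# Accept each initial sample with probability proportional to the ratio.
accept = rng.uniform(size=initial.size) < r / r.max()
updated = initial[accept]

# The push-forward of the accepted samples should approximate the
# observed density, i.e. predictions concentrate near 0.25.
print(updated.size, float(np.mean(Q(updated))))
```

The accepted samples form a pullback of the observed measure: pushing them back through `Q` recovers (approximately) the observed density, which is the consistency property the abstract refers to.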

Authors who are presenting talks have a * after their name.