Abstract:
|
Fast inference of numerical model parameters from data is an important prerequisite for generating predictive models in many applications. Sampling-based approaches may become intractable when each likelihood evaluation is computationally expensive. New approaches that combine variational inference with normalizing flows, whose computational cost grows only linearly with the dimensionality of the latent variable space, provide a more efficient means of Bayesian inference on the model parameters. Moreover, the cost of repeatedly evaluating an expensive likelihood can be mitigated by replacing the true model with an offline-trained surrogate model; however, this substitution may introduce significant bias when the surrogate is insufficiently accurate around the posterior modes. To reduce the computational cost without sacrificing inferential accuracy, we propose Normalizing Flow with Adaptive Surrogate (NoFAS), an optimization strategy that alternately updates the normalizing flow parameters and the weights of a neural network surrogate model. We also propose an efficient sample weighting scheme for surrogate model training that ensures some global accuracy of the surrogate while capturing the likely regions of the parameters that yield the observed data.
|