Abstract:
|
Sampling from high-dimensional distributions is a fundamental problem in statistical research and practice, and has become a central task in many machine learning models such as energy-based models and deep generative models. However, the task becomes considerably more challenging when the density function contains multiple modes that are isolated from one another. We tackle this difficulty by fitting an invertible transformation function based on normalizing flow techniques, such that the original distribution is warped into a new one that is much easier to sample from. To address the multi-modality issue, our method adaptively learns a sequence of tempered distributions, which we term a tempered distribution flow, to progressively approach the desired distribution. Numerical experiments demonstrate the superior performance of this novel sampler compared to traditional methods.
|