Abstract:
|
Stacking is a widely used model averaging technique. Like many other ensemble methods, stacking is more effective when model predictive performance is heterogeneous across inputs, in which case the stacked mixture can be further improved with a hierarchical model. In this talk I will focus on the recent development of Bayesian hierarchical stacking, an approach that aggregates models locally: the model weights are functions of the input data, partially pooled, and inferred using Bayesian methods, and they can further incorporate structured priors and complex data. More generally, the success of hierarchical stacking showcases the benefit of bringing a full Bayesian workflow into an otherwise black-box algorithm.
|