Abstract:
|
With technological innovations fueling the aggregation of ever larger data sets, statistical inferences are quickly becoming limited not by sample size but rather by systematic effects such as those arising from partially observed measurements, censored measurements, and nuisance parameters. Only by carefully modeling these effects can we take full advantage of the data: the frontiers of applied statistics require not only big data but also big models and the algorithms that can fit them. One such algorithm is Hamiltonian Monte Carlo, which leverages the local curvature of the posterior distribution to admit full Bayesian inference that scales to the complex models of practical interest. In this talk I will discuss the theoretical foundations of Hamiltonian Monte Carlo, elucidating the nature of its scalable performance and stressing the properties critical to a robust implementation.
|
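To make the abstract's reference to "local curvature" concrete: Hamiltonian Monte Carlo uses the gradient of the log posterior to guide proposals along Hamiltonian trajectories. The following is only an illustrative sketch, not material from the talk; the leapfrog step size, number of steps, and the standard-normal target are all assumed choices for demonstration.

```python
import numpy as np

def hmc_step(q, log_prob, grad_log_prob, step_size=0.1, n_steps=20, rng=None):
    """One HMC transition: resample momentum, integrate Hamilton's
    equations with the leapfrog scheme, then apply a Metropolis
    correction for the discretization error."""
    rng = rng or np.random.default_rng()
    p = rng.standard_normal(q.shape)  # fresh Gaussian momentum
    q_new, p_new = q.copy(), p.copy()
    # Leapfrog integration: half momentum step, full position/momentum
    # steps, final half momentum step.
    p_new += 0.5 * step_size * grad_log_prob(q_new)
    for _ in range(n_steps - 1):
        q_new += step_size * p_new
        p_new += step_size * grad_log_prob(q_new)
    q_new += step_size * p_new
    p_new += 0.5 * step_size * grad_log_prob(q_new)
    # Hamiltonian = negative log posterior + kinetic energy
    h_old = -log_prob(q) + 0.5 * p @ p
    h_new = -log_prob(q_new) + 0.5 * p_new @ p_new
    # Metropolis accept/reject on the change in Hamiltonian
    if np.log(rng.random()) < h_old - h_new:
        return q_new
    return q

# Usage: sample a 2-D standard normal (illustrative target only).
log_prob = lambda q: -0.5 * q @ q
grad_log_prob = lambda q: -q
rng = np.random.default_rng(0)
q = np.zeros(2)
samples = []
for _ in range(2000):
    q = hmc_step(q, log_prob, grad_log_prob, rng=rng)
    samples.append(q.copy())
samples = np.asarray(samples)
```

Because the gradient steers each trajectory along the geometry of the posterior, proposals can travel far while retaining high acceptance rates, which is the source of the scalability the abstract describes.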
ASA Meetings Department
732 North Washington Street, Alexandria, VA 22314
(703) 684-1221 • meetings@amstat.org
Copyright © American Statistical Association.