Abstract:
|
With the growing capabilities of Geographic Information Systems (GIS) and related software, statisticians today routinely encounter spatial data containing observations from a massive number of locations and time points. Important areas of application include environmental exposure assessment and the construction of risk maps based upon massive amounts of spatiotemporal data. Spatiotemporal process models have been, and continue to be, widely deployed by researchers to better understand the complex nature of spatial and temporal variability. However, fitting hierarchical spatiotemporal models is computationally onerous, with complexity growing in cubic order with the number of spatial locations and temporal points. This has motivated massively scalable Gaussian process models, such as the Nearest-Neighbor Gaussian Process (NNGP), that can be estimated using algorithms requiring floating point operations (flops) and storage linear in the number of spatiotemporal points. This talk will focus upon a variety of modeling and computational strategies to implement massively scalable Gaussian process models for Bayesian inference in settings involving over 6 million locations.
|