Abstract:
|
Spatio-temporal datasets are often relatively large, while also requiring relatively large models to describe them adequately. For example, the UK black smoke monitoring network produced daily particulate pollution measurements at a time-varying subset of over 2000 UK monitoring stations over four decades, resulting in some 10 million observations. The spatio-temporal evolution of the underlying pollution field can be captured reasonably well by a generalized additive model (a Gaussian latent process model) constructed from an additive decomposition of reduced rank tensor product spline smoothers, using Duchon splines as the spatial marginals. But the resulting models have around 10000 coefficients and 2-30 smoothing parameters, which presents computational challenges. This talk will discuss general methods for overcoming these challenges, in particular novel scalable fitting iterations combined with the exploitation of specific forms of data sparsity.
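As a purely illustrative sketch (the decomposition and notation below are assumptions made for exposition, not necessarily the exact model presented in the talk), such a space-time GAM might be written

\[
  y_i = f_1(s_i) + f_2(t_i) + f_3(s_i, t_i) + \epsilon_i,
  \qquad
  f_3(s, t) = \sum_{j=1}^{J}\sum_{k=1}^{K} \beta_{jk}\, a_j(s)\, b_k(t),
\]

where \(s_i\) is spatial location, \(t_i\) is time, the \(a_j\) form a reduced rank Duchon spline basis over space, the \(b_k\) a spline basis over time, and each smooth \(f_m\) carries a quadratic roughness penalty \(\lambda_m \boldsymbol{\beta}^{\mathsf T} S_m \boldsymbol{\beta}\) whose smoothing parameter \(\lambda_m\) must be estimated alongside the coefficients.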
|