Abstract Details

Activity Number: 204 - Statistical Computing by Deep Learning and Penalization
Type: Contributed
Date/Time: Monday, August 8, 2022, 2:00 PM to 3:50 PM EDT
Sponsor: Section on Statistical Computing
Abstract #322567
Title: Efficient Large-Scale Nonstationary Spatial Covariance Function Estimation Using Convolutional Neural Networks
Author(s): Pratik Nag* and Sameh Abdulah and Yiping Hong and Marc Genton and Ying Sun and Ghulam Qadir
Companies: King Abdullah University of Science and Technology (KAUST) and KAUST and KAUST and KAUST and KAUST and Heidelberg Institute for Theoretical Studies
Keywords: Deep learning; High-performance computing; Large datasets; Nonstationary Matérn covariance function; Convolutional neural networks; Spatial modeling
Abstract:

Spatial processes observed in many applications, such as climate and environmental science, are often large-scale and exhibit spatial nonstationarity. Gaussian processes are widely used in spatial statistics to model such behavior by specifying a nonstationary covariance function, such as the nonstationary Matérn covariance. Existing work in the literature relies on partitioning the spatial region to estimate the spatially varying parameters of the covariance function. Although the choice of partition is a key factor, it is typically subjective rather than data-driven. In this work, we exploit the capabilities of convolutional neural networks (CNNs) to perform a dynamic splitting of the nonstationary spatial region. The dynamic splitting identifies nonstationary subregions after the first split and recursively resplits them until all spatial subregions behave approximately as stationary. We also provide a parallel high-performance implementation of the nonstationary modeling and prediction on recent hardware architectures, including shared-memory systems, GPUs, and distributed-memory systems. Our proposed approach shows better accuracy and performance than traditional methods.
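To illustrate the recursive-splitting idea described above, here is a minimal, hypothetical sketch (not the authors' code): a quadtree-style split of a rectangular domain in which a stationarity check decides whether a subregion needs further splitting. In the abstract this decision is made by a trained CNN; since that model is not specified here, the classifier is stubbed out with a simple variance-based heuristic (looks_stationary), and the splitting rule, thresholds, and stopping criteria are all illustrative assumptions.

```python
# Hypothetical sketch of CNN-driven recursive region splitting.
# The CNN stationarity classifier is replaced by a variance heuristic so the
# example runs stand-alone; it is NOT the authors' actual method.
import numpy as np

def looks_stationary(values, threshold=1.0):
    """Stand-in for the CNN classifier (assumed heuristic): call a subregion
    'stationary enough' when the local variance is below a threshold."""
    return np.var(values) < threshold

def split_region(coords, values, bounds, depth=0, max_depth=4, min_points=50):
    """Recursively split a rectangle until each subregion is judged
    approximately stationary; return the list of subregion bounds."""
    xmin, xmax, ymin, ymax = bounds
    inside = (coords[:, 0] >= xmin) & (coords[:, 0] < xmax) & \
             (coords[:, 1] >= ymin) & (coords[:, 1] < ymax)
    if inside.sum() < min_points or depth >= max_depth or looks_stationary(values[inside]):
        return [bounds]  # stop: treat this subregion as stationary
    xmid, ymid = (xmin + xmax) / 2, (ymin + ymax) / 2
    children = [(xmin, xmid, ymin, ymid), (xmid, xmax, ymin, ymid),
                (xmin, xmid, ymid, ymax), (xmid, xmax, ymid, ymax)]
    out = []
    for child in children:  # recurse into the four quadrants
        out.extend(split_region(coords, values, child, depth + 1, max_depth, min_points))
    return out

# Toy usage: a field whose variance grows with x, so the right half splits more finely.
rng = np.random.default_rng(0)
coords = rng.uniform(0, 1, size=(5000, 2))
values = rng.normal(scale=0.3 + 2.0 * coords[:, 0])
regions = split_region(coords, values, bounds=(0.0, 1.0, 0.0, 1.0))
print(f"{len(regions)} approximately stationary subregions")
```

In the proposed approach, each resulting subregion would then receive its own (locally stationary) Matérn parameter estimates; the sketch only shows how a data-driven classifier can replace a subjective, fixed partition.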


Authors who are presenting talks have a * after their name.
