
Abstract Details

Activity Number: 206 - Machine Learning Methodology
Type: Contributed
Date/Time: Tuesday, August 4, 2020, 10:00 AM to 2:00 PM (EDT)
Sponsor: Section on Statistical Learning and Data Science
Abstract #313096
Title: Application of Stochastic Gradient Descent in Parameter Estimation for Models with Spatial Correlation
Author(s): Gan Luan*
Companies: New Jersey Institute of Technology
Keywords: Stochastic gradient descent; Spatial data; Bootstrap; Resampling method; Inference; Large-scale data
Abstract:

Many datasets contain spatial components, and it is important to account for spatial correlation in modeling and parameter estimation. Stochastic gradient descent (SGD) is an attractive method for model parameter estimation in large-scale and online learning settings, since it passes through the data only once. Although SGD has been studied extensively, its application to spatial models remains uncommon. In this talk, I consider spatial lattice data under the simultaneous autoregressive (SAR) model and use averaged SGD for parameter estimation. An online bootstrap procedure is then used to conduct inference based on the SGD estimator: as each observation arrives, the procedure updates the SGD estimate and simultaneously generates many randomly perturbed SGD estimates, whose spread can be used to construct confidence intervals. I will present simulation results and the asymptotic properties of these procedures. Lastly, I will apply the proposed method to study covariates that affect charge-to-payment ratios in a real dataset, the Physician and Other Supplier Public Use File (PUF).
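
To make the general idea concrete, below is a minimal sketch of averaged SGD combined with a perturbation-based online bootstrap. It uses ordinary least squares on simulated data as a stand-in for the SAR (pseudo-)likelihood discussed in the talk, and choices such as the step-size schedule, the number of bootstrap trajectories, and the Exponential(1) perturbation weights are illustrative assumptions, not the author's exact settings.

import numpy as np

rng = np.random.default_rng(0)

# Simulate a simple linear model (placeholder for spatial lattice data under SAR).
n, p = 5000, 3
X = rng.normal(size=(n, p))
beta_true = np.array([1.0, -2.0, 0.5])
y = X @ beta_true + rng.normal(size=n)

n_boot = 200                       # number of perturbed SGD trajectories (assumed)
theta = np.zeros(p)                # current SGD iterate
theta_bar = np.zeros(p)            # running (Polyak-Ruppert) average
boot = np.zeros((n_boot, p))       # perturbed iterates
boot_bar = np.zeros((n_boot, p))   # their running averages

for t in range(n):
    x_t, y_t = X[t], y[t]
    lr = 0.5 * (t + 1) ** -0.75    # decaying step size (illustrative choice)

    # Plain averaged SGD update using the squared-error loss gradient.
    grad = (x_t @ theta - y_t) * x_t
    theta = theta - lr * grad
    theta_bar += (theta - theta_bar) / (t + 1)

    # Online bootstrap: each replicate reuses the same observation, but its
    # step is rescaled by an independent mean-one random weight.
    w = rng.exponential(1.0, size=n_boot)
    grad_b = (boot @ x_t - y_t)[:, None] * x_t[None, :]
    boot = boot - lr * w[:, None] * grad_b
    boot_bar += (boot - boot_bar) / (t + 1)

# Approximate 95% confidence intervals from the spread of the perturbed averages.
ci_lower = np.percentile(boot_bar, 2.5, axis=0)
ci_upper = np.percentile(boot_bar, 97.5, axis=0)
print("averaged SGD estimate:", theta_bar)
print("bootstrap 95% CI lower:", ci_lower)
print("bootstrap 95% CI upper:", ci_upper)

Each bootstrap trajectory sees the same stream of observations as the main iterate, so no data need to be stored or revisited; the variation across the averaged trajectories approximates the sampling variability of the averaged SGD estimator and yields the confidence intervals mentioned in the abstract.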


Authors who are presenting talks have a * after their name.
