Activity Number: 321 - Machine Learning and Variable Selection
Type: Contributed
Date/Time: Wednesday, August 11, 2021, 3:30 PM to 5:20 PM
Sponsor: Section on Statistical Computing
Abstract #318418
Title: Lasso-Regularized Local Smoothing for Longitudinal Analysis with Time-Varying Coefficient Models
Author(s): Xiaoyang Ma*, Colin O. Wu, and Xin Tian
Companies: NHLBI, NIH and Georgetown University; National Heart, Lung, and Blood Institute, National Institutes of Health; National Heart, Lung, and Blood Institute, National Institutes of Health
Keywords: High-dimensional longitudinal data; Lasso-regularized smoothing; Locally influential covariate; Longitudinal analysis; Statistical machine learning; Time-varying coefficient model
Abstract:

We consider selecting locally influential variables and estimating covariate effects in time-varying coefficient models (TVCM) with high-dimensional longitudinal data. To capture the dynamic covariate effects of these models, we propose a Lasso-regularized, kernel-based local polynomial smoothing method that selects the locally influential covariates within a specific time range and estimates the local covariate effects. Our approach extends local smoothing for TVCM to high-dimensional longitudinal data and is an alternative to the regularized spline methods studied by Wang, Li, and Huang (JASA, 2008, 103:1556-1569) and Xue et al. (Stat. Med., 2020, 39:156-170). Through an application to a large epidemiological study, we demonstrate that our regularized local smoothing method has advantages over the regularized spline methods: it is computationally simpler and yields straightforward clinical interpretations. Our simulation studies suggest that the proposed method is capable of identifying the “true” locally influential predictors at different time ranges. The method provides a useful dynamic, model-based tool for statistical machine learning with longitudinal data.
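As a concrete illustration (our sketch, not the authors' implementation), the following Python fits a kernel-weighted Lasso at each point of a time grid: covariates with nonzero coefficients at a grid point are flagged as locally influential near that time. For brevity it uses a local constant fit rather than the local polynomial fit described above, and the function names, kernel choice, and tuning parameters (bandwidth, alpha) are illustrative assumptions.

# A minimal sketch of Lasso-regularized local smoothing for a TVCM.
# Assumed names and settings; not the paper's code.
import numpy as np
from sklearn.linear_model import Lasso

def epanechnikov(u):
    # Epanechnikov kernel; zero outside [-1, 1].
    return np.where(np.abs(u) <= 1.0, 0.75 * (1.0 - u**2), 0.0)

def local_lasso_tvcm(times, X, y, grid, bandwidth, alpha):
    # times: (n,) observation times; X: (n, p) covariates; y: (n,) responses.
    # Returns a (len(grid), p) array of local coefficient estimates;
    # nonzero entries in row i mark covariates locally influential near grid[i].
    betas = np.empty((len(grid), X.shape[1]))
    for i, t0 in enumerate(grid):
        w = epanechnikov((times - t0) / bandwidth)
        keep = w > 0  # drop observations with zero kernel weight
        model = Lasso(alpha=alpha, max_iter=10000)
        model.fit(X[keep], y[keep], sample_weight=w[keep])
        betas[i] = model.coef_
    return betas

In practice the bandwidth and the Lasso penalty alpha would be chosen data-adaptively (e.g., by cross-validation), and a local linear fit can be obtained by augmenting X with its columns scaled by (times - t0).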
Authors who are presenting talks have a * after their name.