Abstract:
|
High-dimensional (HD) linear regression models that simultaneously adjust for and estimate the effects of all candidate predictor variables have received considerable attention in the recent literature. Most available approaches assume independently and identically distributed (iid) observations, yet this model class would also be valuable for HD longitudinal/clustered datasets. A recently proposed Bayesian approach to univariate HD linear regression, fit via the expectation-maximization (EM) algorithm, provides estimates of the regression parameters, predictions, and their standard errors. It differs from traditional HD Bayesian EM proposals by focusing on the posterior distributions of the prior means of the regression parameters, which are assumed to follow a two-groups model and are updated at each EM step. Because this approach assumes iid observations, it cannot be applied to clustered/longitudinal HD data. We extend the method to linear mixed modeling (LMM) by adapting these posterior distributions to incorporate random effects. We compare our approach with current Bayesian and frequentist penalized LMM techniques through simulation and real-world examples.