

Abstract Details

Activity Number: 496 - Dimension Reduction
Type: Contributed
Date/Time: Thursday, August 6, 2020, 10:00 AM to 2:00 PM EDT
Sponsor: Section on Statistical Learning and Data Science
Abstract #312201
Title: Bayesian Model Averaging Sufficient Dimension Reduction
Author(s): Michael Declan Power* and Yuexiao Dong
Companies: Temple University and Temple University
Keywords: Bayesian model averaging; principal Hessian directions; sliced inverse regression; sufficient dimension reduction
Abstract:

As a classical sufficient dimension reduction method, sliced inverse regression (SIR) (Li, 1991) replaces the original predictors with low-dimensional linear combinations while preserving all the relevant information between the response and the predictors. After the response is partitioned into slices, the intraslice means of the predictors are used to recover these linear combinations. However, the estimated linear combinations involve all of the predictor variables. To address this limitation, we propose two Bayesian model averaging (BMA) approaches to achieve sparse sufficient dimension reduction. The first approach applies univariate-response BMA (Raftery et al., 1997) to each slice separately, while the second uses multivariate-response BMA (Brown et al., 1998) to estimate the intraslice means jointly. Extensive simulation studies show that the new proposals outperform SIR and existing sparse sufficient dimension reduction methods such as sparse SIR (Li, 2007), coordinate-independent sparse estimation (Chen et al., 2010), and frequentist model averaging SIR (Fang and Yu, 2020).
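
As a rough illustration of the slicing step described in the abstract, the NumPy sketch below implements classical SIR (Li, 1991) under simplifying assumptions (equal-frequency slices, nonsingular predictor covariance). The function name, defaults, and return convention are illustrative choices and are not taken from the authors' proposed BMA methods.

```python
import numpy as np

def sir_directions(X, y, n_slices=10, n_directions=2):
    """Classical sliced inverse regression: estimate directions beta
    such that y depends on X only through the projections X @ beta."""
    n, p = X.shape

    # Standardize the predictors: Z = (X - mean) @ Sigma^{-1/2}
    X_centered = X - X.mean(axis=0)
    Sigma = np.cov(X, rowvar=False)
    eigval, eigvec = np.linalg.eigh(Sigma)
    Sigma_inv_sqrt = eigvec @ np.diag(eigval ** -0.5) @ eigvec.T
    Z = X_centered @ Sigma_inv_sqrt

    # Partition the response into slices of roughly equal size
    order = np.argsort(y)
    slices = np.array_split(order, n_slices)

    # Weighted outer products of the intraslice means of Z
    M = np.zeros((p, p))
    for idx in slices:
        m_h = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m_h, m_h)

    # Leading eigenvectors of M, mapped back to the original predictor scale
    vals, vecs = np.linalg.eigh(M)
    top = vecs[:, np.argsort(vals)[::-1][:n_directions]]
    beta = Sigma_inv_sqrt @ top
    return beta / np.linalg.norm(beta, axis=0)  # column-normalized directions

# Example usage on simulated single-index data (hypothetical illustration):
rng = np.random.default_rng(0)
X = rng.standard_normal((500, 6))
y = np.sin(X[:, 0] + X[:, 1]) + 0.1 * rng.standard_normal(500)
print(sir_directions(X, y, n_slices=10, n_directions=1))
```

Note that the estimated columns of beta above load on every predictor; the sparse BMA approaches proposed in the abstract are designed to zero out irrelevant predictors in this step.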


Authors who are presenting talks have a * after their name.
