
Abstract Details

Activity Number: 111 - New Dimension Reduction and Statistical Learning Methods for Biomedical Data
Type: Topic Contributed
Date/Time: Monday, July 31, 2017, 8:30 AM to 10:20 AM
Sponsor: Biometrics Section
Abstract #324293
Title: Sparse Minimum Discrepancy Approach to Sufficient Dimension Reduction with Simultaneous Variable Selection in Ultrahigh Dimension
Author(s): Wei Qian* and Shanshan Ding and Dennis Cook
Companies: Rochester Institute of Technology and University of Delaware and University of Minnesota
Keywords: inverse regression ; central subspace ; sparsity ; sliced inverse regression ; principal fitted component ; sliced average variance estimation
Abstract:

Sufficient dimension reduction (SDR) is known to be a powerful tool for achieving data reduction and data visualization in regression and classification problems. In this work, we study ultrahigh-dimensional SDR problems and propose solutions under a unified minimum discrepancy approach with regularization. When p grows exponentially with n, consistency results for both central subspace estimation and variable selection are established simultaneously for important SDR methods, including sliced inverse regression (SIR), principal fitted component (PFC), and sliced average variance estimation (SAVE). Special sparse structures of the large predictor or error covariance matrices are also considered for potentially better performance. In addition, the proposed approach is equipped with a new algorithm that efficiently solves the regularized objective functions and a new data-driven procedure that determines the structural dimension and tuning parameters, without the need to invert a large covariance matrix. Simulations and a real data analysis are offered to demonstrate the promise of our proposal in ultrahigh-dimensional settings.
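For readers unfamiliar with the baseline methods named above, the following is a minimal Python sketch of classical (unregularized) sliced inverse regression, one of the SDR methods this work builds on. It is not the authors' sparse minimum discrepancy estimator; the function name, slice count, and dimension d are illustrative assumptions.

# Minimal sketch of classical SIR -- the unregularized baseline referenced
# in the abstract, not the proposed sparse minimum discrepancy estimator.
# Function and parameter names here are illustrative.
import numpy as np

def sir_directions(X, y, n_slices=10, d=2):
    """Estimate d central-subspace directions with classical SIR (requires n > p)."""
    n, p = X.shape
    # Standardize the predictors: Z = (X - mean) Sigma^{-1/2}
    Xc = X - X.mean(axis=0)
    Sigma = np.cov(Xc, rowvar=False)
    evals, evecs = np.linalg.eigh(Sigma)
    Sigma_inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = Xc @ Sigma_inv_sqrt

    # Slice observations by the sorted response and average Z within each slice
    order = np.argsort(y)
    M = np.zeros((p, p))
    for idx in np.array_split(order, n_slices):
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)  # weighted covariance of slice means

    # Leading eigenvectors of M, mapped back to the original predictor scale
    w, v = np.linalg.eigh(M)
    return Sigma_inv_sqrt @ v[:, ::-1][:, :d]

The covariance standardization above breaks down when p exceeds n, which is precisely the step the regularized, inversion-free formulation described in the abstract is designed to avoid.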


Authors who are presenting talks have a * after their name.
