
Abstract Details

Activity Number: 452
Type: Contributed
Date/Time: Tuesday, August 2, 2016, 3:05 PM to 3:50 PM
Sponsor: Section on Statistical Learning and Data Science
Abstract #321645
Title: Penalized Principal Logistic Regression for Sparse Sufficient Dimension Reduction
Author(s): Seung Jun Shin* and Andreas Artemiou
Companies: Korea University and Cardiff University
Keywords: Max-SCAD penalty; Principal logistic regression; Sparse sufficient dimension reduction

Sufficient dimension reduction (SDR) is a useful tool for reducing the dimensionality of predictors by finding the central subspace, the minimal subspace of the predictors that preserves all the regression information about the response. When the predictor dimension is large, a sparse representation of the basis of the central subspace is desirable in order to achieve variable selection and dimension reduction simultaneously. In this article, we propose principal logistic regression (PLR) as an efficient SDR tool and further develop a penalized version of it to achieve sparse SDR. Both simulation studies and real data analysis show promising performance of the proposed method compared to existing ones.
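As a rough illustration of the PLR idea only (not the authors' implementation), the sketch below slices a continuous response into binary indicators, fits a ridge-penalized logistic regression for each slice, and takes the leading eigenvectors of the coefficient outer-product matrix as an estimate of a basis of the central subspace. The function names, the slicing scheme, and the plain gradient-descent fitting are all illustrative assumptions; the paper's sparse version instead applies a Max-SCAD penalty, which this sketch omits.

```python
import numpy as np

def fit_logistic(X, y, lam=1e-2, lr=0.1, iters=500):
    """Ridge-penalized logistic regression via plain gradient descent
    (an illustrative stand-in for whatever solver the paper uses)."""
    n, p = X.shape
    b = np.zeros(p)
    for _ in range(iters):
        p_hat = 1.0 / (1.0 + np.exp(-X @ b))          # fitted probabilities
        grad = X.T @ (p_hat - y) / n + lam * b        # penalized gradient
        b -= lr * grad
    return b

def plr_directions(X, y, n_slices=5, d=1):
    """Estimate d central-subspace directions in the PLR spirit:
    turn y into several binary slice indicators, fit a logistic
    regression for each, and eigen-decompose the coefficient
    outer-product matrix."""
    Xs = (X - X.mean(0)) / X.std(0)                   # standardize predictors
    cuts = np.quantile(y, np.linspace(0, 1, n_slices + 1)[1:-1])
    B = np.array([fit_logistic(Xs, (y > c).astype(float)) for c in cuts])
    M = B.T @ B                                       # pool slice coefficients
    vals, vecs = np.linalg.eigh(M)
    order = np.argsort(vals)[::-1][:d]                # leading eigenvectors
    return vecs[:, order]
```

On a toy model where y depends on a single linear combination of the predictors, the leading direction returned by `plr_directions` should align with that combination; a sparse penalty such as Max-SCAD would additionally zero out the coefficients of irrelevant predictors.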

Authors who are presenting talks have a * after their name.


Copyright © American Statistical Association