
Abstract Details

Activity Number: 594 - Methods for Analysis of High-Dimensional Data
Type: Contributed
Date/Time: Wednesday, August 1, 2018 : 2:00 PM to 3:50 PM
Sponsor: Section on Statistical Learning and Data Science
Abstract #: 329623
Title: Dimension Reduction of High-Dimensional Data Sets Based on Stepwise SVM
Author(s): Elizabeth Chou* and Tzu-Wei Ko
Companies: National Chengchi University and National Chengchi University
Keywords: Dimension reduction; Classification; High-dimensional data; Supervised learning
Abstract:

The current study proposes a dimension reduction method, stepwise support vector machine (SVM), to reduce the dimensions of large p, small n datasets. The proposed method is compared with other dimension reduction methods, namely the Pearson product-moment correlation coefficient (PCC), recursive feature elimination based on random forest (RF-RFE), and principal component analysis (PCA), using five gene expression datasets. Additionally, the prediction performance of the variables selected by our method is evaluated. The study found that stepwise SVM can effectively select the important variables and achieve good prediction performance. Moreover, the predictions of stepwise SVM on the reduced datasets were better than those on the unreduced datasets. The performance of stepwise SVM was more stable than that of PCA and RF-RFE, but the performance difference with respect to PCC was minimal. Reducing the dimensionality of large p, small n datasets is necessary, and we believe that stepwise SVM can effectively eliminate noise in the data and improve prediction accuracy in any large p, small n dataset.
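The abstract does not spell out the stepwise SVM procedure in detail. The sketch below shows one plausible reading, forward stepwise feature selection wrapped around a linear SVM, using scikit-learn's SequentialFeatureSelector. The simulated data, parameter values, and stopping rule are illustrative assumptions for demonstration, not the authors' exact algorithm.

```python
# Illustrative sketch only: generic forward stepwise (sequential) feature
# selection with a linear SVM on a "large p, small n" dataset.
# X, y, and all parameter choices below are assumptions, not from the paper.
import numpy as np
from sklearn.svm import SVC
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 200))      # 60 samples, 200 features (p > n)
y = rng.integers(0, 2, size=60)     # binary class labels

# Forward stepwise selection: greedily add the feature that most improves
# cross-validated SVM accuracy, stopping at a fixed number of features.
svm = make_pipeline(StandardScaler(), SVC(kernel="linear", C=1.0))
selector = SequentialFeatureSelector(
    svm, n_features_to_select=5, direction="forward", cv=5
)
selector.fit(X, y)
selected = np.flatnonzero(selector.get_support())

# Evaluate prediction performance on the reduced feature set.
scores = cross_val_score(svm, X[:, selected], y, cv=5)
print("selected features:", selected)
print("mean CV accuracy on reduced data:", scores.mean())
```

On real gene expression data, the reduced feature set would then be compared against PCC filtering, RF-RFE, and PCA projections under the same classifier and cross-validation scheme.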


Authors who are presenting talks have a * after their name.
