
JSM 2012 Online Program

The views expressed here are those of the individual authors and not necessarily those of the JSM sponsors, their officers, or their staff.


Abstract Details

Activity Number: 75
Type: Contributed
Date/Time: Sunday, July 29, 2012 : 4:00 PM to 5:50 PM
Sponsor: Section on Statistical Learning and Data Mining
Abstract - #304913
Title: Structure-Preserving Method for Dimension Reduction
Author(s): Ewa Nowakowska*+
Companies: Polish Academy of Sciences
Address: Jana Kazimierza 5, Warsaw, PL-01-248, Poland
Keywords: dimension reduction ; linear discriminant analysis ; principal component analysis ; clustering

Linear discriminant analysis (LDA) and principal component analysis (PCA) can be seen as alternative methods for dimension reduction. For k classes, the former returns a (k-1)-dimensional subspace S* that best discriminates the given classes. The latter yields a subspace PC(k-1) spanned by the k-1 eigenvectors corresponding to the k-1 largest eigenvalues. It thus captures the largest overall variability but takes no class partition into account. In general, the two subspaces may differ substantially. This work presents a preliminary data transformation that reduces the dissimilarity between them without knowledge of the underlying class structure. At the same time, it largely preserves the initial distinctness of the classes. In the resulting subspace, efficient clustering and other analyses of the unknown structure can be performed.
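The dissimilarity between the two subspaces mentioned in the abstract can be measured by the principal angles between S* and PC(k-1). A minimal sketch (not the author's method; the dataset and use of scikit-learn are illustrative assumptions):

```python
import numpy as np
from sklearn.datasets import make_blobs
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Hypothetical example: k = 3 classes in 5 dimensions,
# so both subspaces are (k-1) = 2-dimensional.
X, y = make_blobs(n_samples=300, centers=3, n_features=5, random_state=0)
k = 3

# LDA subspace S*: spanned by the k-1 discriminant directions (uses labels).
lda = LinearDiscriminantAnalysis(n_components=k - 1).fit(X, y)
S = np.linalg.qr(lda.scalings_[:, :k - 1])[0]  # orthonormal basis, shape (5, 2)

# PCA subspace PC(k-1): spanned by the k-1 leading eigenvectors (ignores labels).
pca = PCA(n_components=k - 1).fit(X)
P = pca.components_.T  # orthonormal basis, shape (5, 2)

# Singular values of S^T P are the cosines of the principal angles
# between the two subspaces; small angles mean similar subspaces.
cosines = np.linalg.svd(S.T @ P, compute_uv=False)
angles = np.degrees(np.arccos(np.clip(cosines, -1.0, 1.0)))
print("principal angles (degrees):", angles)
```

A transformation of the kind the abstract describes would aim to drive these angles toward zero while keeping the classes separable.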

The address information is given for authors who have a + after their name.
Authors who are presenting talks have a * after their name.


