JSM 2004 - Toronto

Abstract #300925

This is the preliminary program for the 2004 Joint Statistical Meetings in Toronto, Canada. Currently included are the "technical" program (the schedule of invited, topic-contributed, regular contributed, and poster sessions); Continuing Education courses (August 7-10, 2004); and committee and business meetings. This online program will be updated frequently to reflect the most current revisions.


The views expressed here are those of the individual authors
and not necessarily those of the ASA or its board, officers, or staff.





Activity Number: 276
Type: Contributed
Date/Time: Tuesday, August 10, 2004: 2:00 PM to 3:50 PM
Sponsor: General Methodology
Abstract - #300925
Title: Nonlinear Manifold Learning
Author(s): Alan J. Izenman*+
Companies: Temple University
Address: Dept. of Statistics, Philadelphia, PA, 19122-6083
Keywords: multivariate ; high dimensionality ; PCA ; machine-learning ; nonlinearity ; kernels
Abstract:

Classical multivariate analysis has focused mainly on linear methods for dimensionality reduction, such as principal components analysis, canonical variate analysis, and discriminant analysis. New techniques are now being introduced into the statistical and machine-learning literatures that attempt to generalize these linear dimensionality-reduction methods to nonlinear ones. A central issue in such generalizations is that there are different strategies for modeling the presence of nonlinearity in high-dimensional space. Thus, we have recently seen a number of versions of "nonlinear" PCA, including polynomial PCA, principal curves and surfaces, auto-associative multilayer neural networks, smallest additive principal components, and kernel PCA. Other techniques for finding nonlinear structure in data are Isomap, locally linear embedding, and Laplacian eigenmaps. We describe several of these methods and their connections to each other.
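As a concrete illustration of one of the techniques named in the abstract, the following is a minimal sketch of kernel PCA. This is not the author's code: the choice of an RBF (Gaussian) kernel, the `gamma` value, and the synthetic data are assumptions made purely for demonstration.

```python
# Minimal kernel PCA sketch (assumed RBF kernel and parameters, for illustration only).
import numpy as np

def kernel_pca(X, n_components=2, gamma=1.0):
    """Project the rows of X onto the top principal components in RBF-kernel feature space."""
    n = X.shape[0]
    # Pairwise squared Euclidean distances, turned into an RBF kernel matrix.
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    K = np.exp(-gamma * d2)
    # Center the kernel matrix in feature space (double centering).
    one = np.full((n, n), 1.0 / n)
    Kc = K - one @ K - K @ one + one @ K @ one
    # eigh returns eigenvalues in ascending order, so select the largest ones.
    vals, vecs = np.linalg.eigh(Kc)
    idx = np.argsort(vals)[::-1][:n_components]
    vals, vecs = vals[idx], vecs[:, idx]
    # Scale eigenvectors by sqrt(eigenvalue) to obtain the projected coordinates.
    return vecs * np.sqrt(np.maximum(vals, 0.0))

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 5))   # hypothetical data: 50 points in 5 dimensions
Z = kernel_pca(X, n_components=2)
print(Z.shape)  # (50, 2)
```

With a linear kernel this reduces to ordinary PCA; the nonlinear kernel is what lets the method capture curved structure that linear principal components miss.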


  • The address information is for authors who have a + after their name.
  • Authors who are presenting talks have a * after their name.


JSM 2004. For information, contact jsm@amstat.org or call (888) 231-3473. If you have questions about the Continuing Education program, please contact the Education Department.
Revised March 2004