
Abstract Details

Activity Number: 176
Type: Contributed
Date/Time: Monday, August 1, 2016, 10:30 AM to 12:20 PM
Sponsor: Section on Statistical Learning and Data Science
Abstract #320139
Title: Manifold Learning: Dimension Reduction Versus Parameterization Recovery
Author(s): Michael Trosset and Lijiang Guo*
Companies: Indiana University and Indiana University
Keywords: Isomap; Hessian eigenmaps; Riemannian manifolds; geodesic distance; local isometry; Gaussian curvature

What does it mean to learn a manifold? Isomap, Locally Linear Embedding, and Laplacian eigenmaps were proposed as techniques for nonlinear dimension reduction; Isomap was subsequently criticized for solving the parameterization recovery problem under less general conditions than Hessian eigenmaps. We observe that the class of manifolds for which parameterization recovery is possible is extremely small, consisting essentially of Swiss rolls. We then compare Isomap and Hessian eigenmaps with respect to other exploitation tasks that elucidate what manifold learning may, or may not, entail.
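The Swiss roll mentioned above can be illustrated concretely. The following is a minimal sketch, not code from the talk: it assumes scikit-learn's `Isomap` and `make_swiss_roll`, and checks that an Isomap embedding coordinate tracks the roll's true generating parameter, i.e. that parameterization recovery succeeds in this archetypal flat (locally isometric) case.

```python
# Hedged sketch: Isomap on a Swiss roll, the canonical manifold on which
# parameterization recovery is possible. Uses scikit-learn; all names and
# parameter choices here are illustrative assumptions, not from the abstract.
import numpy as np
from sklearn.datasets import make_swiss_roll
from sklearn.manifold import Isomap

# Sample points from a noiseless Swiss roll; t is the true roll parameter.
X, t = make_swiss_roll(n_samples=1000, noise=0.0, random_state=0)

# Embed into 2-D. Because the Swiss roll is flat, Isomap's graph-geodesic
# distances approximate the intrinsic metric and the roll can be "unrolled".
embedding = Isomap(n_neighbors=10, n_components=2).fit_transform(X)

# If the parameterization is recovered, one embedding coordinate should be a
# (nearly) monotone function of t, hence strongly correlated with it.
corr = max(abs(np.corrcoef(embedding[:, i], t)[0, 1]) for i in range(2))
print(f"max |correlation| with true parameter: {corr:.3f}")
```

On a curved manifold (e.g. a hemisphere), no such correlation with a flat parameterization is available, which is the sense in which recovery fails outside the Swiss-roll class.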

Authors who are presenting talks have a * after their name.

Back to the full JSM 2016 program

Copyright © American Statistical Association