Online Program

Thursday, May 17
Machine Learning
Nonlinear Dimension Reduction
Thu, May 17, 3:30 PM - 5:00 PM
Regency Ballroom A

Optimality of the Johnson-Lindenstrauss Lemma (304334)

Kasper Green Larsen, Aarhus University 
*Jelani Nelson, Harvard University 

Keywords: Johnson-Lindenstrauss lemma, random projections, dimensionality reduction

Dimensionality reduction in Euclidean space, as achieved by the Johnson-Lindenstrauss (JL) lemma (also known as "random projections"), has been a fundamental tool in algorithm design and machine learning. The JL lemma states that any n points in Euclidean space can be mapped into m-dimensional Euclidean space while preserving all pairwise distances up to a factor of 1+epsilon, where m need only be on the order of (log n) / epsilon^2, independent of the original dimension. In this talk, I discuss our recent proof that the JL lemma is optimal: for every n, there are point sets of size n for which no embedding with (1+epsilon)-distortion exists into a target dimension more than a constant factor smaller than the (log n) / epsilon^2 guaranteed by the JL lemma. I will also discuss some subsequent work and future directions.

Joint work with Kasper Green Larsen (Aarhus University).
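
To make the lemma's statement concrete, the following is a minimal numpy sketch of the random-projection construction it describes: project n points through a Gaussian matrix scaled by 1/sqrt(m) and measure the pairwise distance distortion. The constant 8 in the choice of m is illustrative, not the constant from the lemma, and this sketch illustrates only the upper bound, not the lower-bound proof discussed in the talk.

import numpy as np

rng = np.random.default_rng(0)

n, d, eps = 100, 1000, 0.5
# Target dimension on the order of (log n) / eps^2; the constant 8 is illustrative.
m = int(np.ceil(8 * np.log(n) / eps**2))

X = rng.normal(size=(n, d))               # n points in R^d
A = rng.normal(size=(m, d)) / np.sqrt(m)  # Gaussian random projection matrix
Y = X @ A.T                               # embedded points in R^m

def pairwise_dists(Z):
    # All pairwise Euclidean distances (upper triangle, flattened).
    sq = (Z ** 2).sum(axis=1)
    D2 = sq[:, None] + sq[None, :] - 2.0 * Z @ Z.T
    i, j = np.triu_indices(len(Z), k=1)
    return np.sqrt(np.maximum(D2[i, j], 0.0))

# Ratios of embedded to original distances; with high probability all lie
# within the (1 - eps, 1 + eps) range the lemma guarantees.
ratio = pairwise_dists(Y) / pairwise_dists(X)
print(f"m = {m}, distortion range: [{ratio.min():.3f}, {ratio.max():.3f}]")

The optimality result says that, for worst-case point sets, no embedding (linear or otherwise) can improve on this m by more than a constant factor while keeping (1+epsilon)-distortion.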