
Abstract Details

Activity Number: 378 - Wald Lecture II
Type: Invited
Date/Time: Tuesday, July 30, 2019, 2:00 PM to 3:50 PM
Sponsor: IMS
Abstract #: 300260
Title: Wald II: Statistical Learning with Sparsity
Author(s): Trevor J. Hastie*
Companies: Stanford University
Keywords: lasso; model selection; convexity; path algorithm; imputation
Abstract:

This series of three talks takes us on a journey that starts with the introduction of the lasso in the 1990s and brings us up to date on some of the vast array of applications that have emerged since.

I: We motivate the need for sparsity with wide data, then chronicle the invention of the lasso and the quest for good software. Several examples will be given, culminating in lasso models for polygenic traits using GWAS data. We end with a survey of some active areas of research not covered in the remaining two talks.
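
As a rough illustration of the wide-data setting (not code from the talk), here is a minimal sketch of a lasso fit with many more features than observations, using scikit-learn's Lasso; the simulated data, the penalty weight alpha, and the sparsity pattern are all invented for illustration.

import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p = 100, 1000                        # wide data: p >> n
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:5] = [3.0, -2.0, 1.5, -1.0, 0.5]  # only 5 truly active features
y = X @ beta + 0.5 * rng.standard_normal(n)

fit = Lasso(alpha=0.1).fit(X, y)        # alpha controls the l1 penalty
print("selected features:", np.flatnonzero(fit.coef_))

With an l1 penalty of this strength, most of the 1000 coefficients are estimated as exactly zero, which is the model-selection behavior the talk motivates.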

II: Matrix completion re-emerged during the Netflix competition as a way to compute a low-rank SVD in the presence of missing data and to impute the missing values. We discuss some algorithms and aspects of this problem, and illustrate its application in recommender systems and in modeling sparse longitudinal multivariate data.
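
As a hedged sketch of the kind of algorithm discussed here (soft-thresholded SVD iterations in the spirit of soft-impute; the function name, threshold lam, and iteration count below are illustrative, not the talk's implementation):

import numpy as np

def soft_impute(X, lam=0.5, n_iter=100):
    """Fill missing entries of X by iterating a soft-thresholded SVD."""
    mask = ~np.isnan(X)                   # True where entries are observed
    Z = np.where(mask, X, 0.0)            # initialize missing entries at 0
    for _ in range(n_iter):
        U, d, Vt = np.linalg.svd(Z, full_matrices=False)
        d = np.maximum(d - lam, 0.0)      # soft-threshold singular values
        low_rank = (U * d) @ Vt           # current low-rank estimate
        Z = np.where(mask, X, low_rank)   # keep observed, impute the rest
    return low_rank

rng = np.random.default_rng(0)
M = rng.standard_normal((50, 3)) @ rng.standard_normal((3, 40))  # rank-3 truth
M_obs = M.copy()
M_obs[rng.random(M.shape) < 0.3] = np.nan  # hide roughly 30% of entries
M_hat = soft_impute(M_obs)                 # low-rank completion of M_obs

The soft-thresholding step is what ties the iteration to a nuclear-norm-penalized criterion, so the solution stays low rank instead of overfitting the observed entries.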

III: The graphical lasso builds sparse inverse covariance matrices to capture the conditional independencies in multivariate Gaussian data. We discuss this approach and extensions, and then illustrate its use for anomaly detection and imputation. We also discuss the group lasso, with applications in detecting interactions and additive model selection.
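
As a minimal sketch of the graphical lasso idea (using scikit-learn's GraphicalLasso; the chain-structured example and the penalty alpha are invented for illustration):

import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(0)
p = 10
# Chain-structured precision matrix: variable i is conditionally
# dependent only on its neighbors i-1 and i+1.
Theta = np.eye(p) + 0.4 * (np.eye(p, k=1) + np.eye(p, k=-1))
Sigma = np.linalg.inv(Theta)
X = rng.multivariate_normal(np.zeros(p), Sigma, size=500)

fit = GraphicalLasso(alpha=0.05).fit(X)
print(np.round(fit.precision_, 2))   # sparse estimate: near-zero entries
                                     # off the tridiagonal band

Zeros in the estimated inverse covariance correspond to pairs of variables that are conditionally independent given the rest; that recovered graph structure is what the applications to anomaly detection and imputation build on.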


Authors who are presenting talks have a * after their name.
