Abstract Details

Activity Number: 416 - SLDS CSpeed 7
Type: Contributed
Date/Time: Thursday, August 12, 2021, 2:00 PM to 3:50 PM EDT
Sponsor: Section on Statistical Learning and Data Science
Abstract #318610
Title: Computationally Sufficient Reductions for Some Sparse Multi-Way and Matrix-Variate Estimators
Author(s): Prateek Sasan* and Akshay Prasadan and Vincent Q Vu
Companies: The Ohio State University and Carnegie Mellon University and The Ohio State University
Keywords: sufficiency; sparsity; tensors; matrix decomposition; graphical models
Abstract:

We apply a recently proposed theory of computational sufficiency for exponential-family estimators with invariant generators (Vu, 2018) to large classes of sparse multi-way and matrix-variate estimators. These classes include a convex relaxation of sparse SVD, the bigraphical lasso, and the tensor graphical lasso. This provides both methodological and computational insights. On the methodological front, we show that these estimators share an exact reduction by generalized single-linkage thresholding operators; one consequence is that these procedures share a common set of knots in their regularization paths. On the computational front, our results generalize the exact thresholding phenomenon for the graphical lasso (Witten et al., 2012; Mazumder & Hastie, 2012). We show how to efficiently reduce the feasible set of the optimization problem to enable faster algorithms, and we demonstrate a manyfold reduction in runtime by employing these insights.
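For readers unfamiliar with the exact thresholding phenomenon cited above, the following is a minimal illustrative sketch, not the authors' method: it shows the graphical lasso screening rule of Witten et al. (2012) and Mazumder & Hastie (2012), in which thresholding the off-diagonal entries of the sample covariance at the penalty level and taking connected components recovers the block-diagonal structure of the estimate, so each block can be solved separately. The names glasso_screening_blocks, S, and lam are hypothetical and chosen for illustration only.

# Sketch of the exact thresholding (screening) rule for the graphical lasso.
# Variables in the same connected component of the thresholded covariance
# graph form one block of the block-diagonal graphical lasso solution.
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import connected_components

def glasso_screening_blocks(S, lam):
    # Edge (i, j) whenever |S_ij| > lam for i != j; ignore the diagonal.
    adjacency = np.abs(S) > lam
    np.fill_diagonal(adjacency, False)
    # Connected components of this graph give the blocks.
    n_blocks, labels = connected_components(csr_matrix(adjacency), directed=False)
    return n_blocks, labels

# Toy usage: a sample covariance from random data (illustrative only).
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 6))
S = np.cov(X, rowvar=False)
n_blocks, labels = glasso_screening_blocks(S, lam=0.15)
print(n_blocks, labels)  # the graphical lasso can then be solved per block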


Authors who are presenting talks have a * after their name.
