
All Times EDT

Abstract Details

Activity Number: 498 - Modern Machine Learning
Type: Contributed
Date/Time: Thursday, August 6, 2020, 10:00 AM to 2:00 PM
Sponsor: Section on Statistical Learning and Data Science
Abstract #310946
Title: An Optimal Statistical and Computational Framework for Generalized Tensor Estimation
Author(s): Rungang Han* and Rebecca Willett and Anru Zhang
Companies: University of Wisconsin-Madison and University of Chicago and University of Wisconsin-Madison
Keywords: Generalized Tensor Estimation; Gradient Descent; Minimax Optimality; Non-convex Optimization; Image Denoising

The analysis of tensor data has recently become an active research topic in statistics and data science. This paper describes a flexible framework for generalized low-rank tensor estimation problems, which covers many important instances arising in computational imaging, genomics, and network analysis. The proposed estimator is obtained by fitting a low-rank tensor to the data under a generalized parametric model. To overcome the non-convexity of these problems, we introduce a unified approach of projected regularized gradient descent that adapts to the underlying low-rank structure. Under mild conditions on the loss function, we establish both an upper bound on the statistical error and a linear rate of computational convergence through a general deterministic analysis. We then consider a suite of generalized tensor estimation problems, including sub-Gaussian tensor denoising, tensor regression, and Poisson tensor PCA, and prove that the proposed algorithm achieves the minimax optimal rate of convergence for the estimation error. The superiority of the proposed framework is demonstrated through extensive experiments on both simulated and real data.
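The flavor of the approach can be illustrated with a minimal sketch of projected gradient descent for the simplest instance, tensor denoising under the squared loss. This is not the paper's exact algorithm (the regularization term and general loss are omitted, and exact projection onto the low-multilinear-rank set is replaced by the standard truncated HOSVD approximation); all function names and parameters below are illustrative.

```python
import numpy as np

def hosvd_truncate(T, ranks):
    """Approximately project a tensor onto the set of tensors with the
    given multilinear ranks via truncated higher-order SVD (HOSVD)."""
    factors = []
    for mode, r in enumerate(ranks):
        # Unfold along `mode` and keep the top-r left singular vectors.
        unfolding = np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)
        U, _, _ = np.linalg.svd(unfolding, full_matrices=False)
        factors.append(U[:, :r])
    # Apply the projector U_k U_k^T along each mode.
    out = T
    for mode, U in enumerate(factors):
        out = np.moveaxis(
            np.tensordot(U @ U.T, np.moveaxis(out, mode, 0), axes=1), 0, mode
        )
    return out

def projected_gd_denoise(Y, ranks, step=0.5, iters=30):
    """Projected gradient descent for low-rank tensor denoising:
    a gradient step on the squared loss ||Y - X||_F^2 / 2, followed by
    HOSVD truncation back toward the low-rank set."""
    X = hosvd_truncate(Y, ranks)        # spectral initialization
    for _ in range(iters):
        grad = X - Y                    # gradient of the squared loss at X
        X = hosvd_truncate(X - step * grad, ranks)
    return X
```

For general losses (e.g., the Poisson log-likelihood in Poisson tensor PCA), only the gradient line changes; the projection step is what enforces the low-rank structure throughout the iterations.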

Authors who are presenting talks have a * after their name.
