Abstract Details

Activity Number: 660 - Machine Learning: Advances and Applications
Type: Contributed
Date/Time: Thursday, August 1, 2019 : 10:30 AM to 12:20 PM
Sponsor: Section on Statistical Learning and Data Science
Abstract #306848
Title: Regularized High-Dimensional Low Tubal Rank Tensor Regression and Its Applications
Author(s): Samrat Roy* and George Michailidis
Companies: University of Florida and University of Florida
Keywords: Tensor Regression; Tubal Rank; f-diagonal Tensor; Tensor Singular Value Decomposition; Slice; CP Decomposition
Abstract:

In the era of high-dimensional statistics, researchers often need to cope with covariates in the form of multidimensional arrays (tensors). In this paper, we propose a new linear tensor regression model with a scalar response, a tensor covariate, and a tensor coefficient, where we make the novel assumption that the coefficient tensor is composed of a low tubal rank tensor L* and a column-wise sparse tensor S*. The tubal rank of a tensor is the number of non-zero tubes in the f-diagonal tensor of its tensor singular value decomposition (t-SVD). A low tubal rank characterizes an inter-slice low-rank pattern in the tensor. In the tensor regression framework, this way of imposing a low-rank structure on the coefficient tensor is quite distinct from the one based on CP decomposition, and is often more interpretable as well. Our regularized least-squares program employs a nuclear-norm penalty on the block circulant matrix of L* and a group-lasso penalty on the slices of S*. We develop an alternating minimization algorithm and, after addressing the non-identifiability of L* and S*, derive useful properties of the estimates. We demonstrate the efficacy of the proposed model on both synthetic and real data.
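The tubal rank definition used in the abstract can be computed directly from the standard t-SVD construction: take an FFT along the third (tube) dimension, compute the SVD of each frontal slice in the Fourier domain, and count the non-zero tubes of the resulting f-diagonal tensor. The following is a minimal sketch of that computation, assuming NumPy; the function name and tolerance are illustrative, not part of the paper.

```python
import numpy as np

def tubal_rank(A, tol=1e-10):
    """Tubal rank of a 3-way tensor A (n1 x n2 x n3) via the t-SVD.

    Illustrative sketch of the standard construction:
      1. FFT along the third (tube) dimension.
      2. SVD of each frontal slice in the Fourier domain; the singular
         values form the f-diagonal tensor of the t-SVD.
      3. Count the non-zero tubes of that f-diagonal tensor.
    """
    n1, n2, n3 = A.shape
    Ahat = np.fft.fft(A, axis=2)
    k = min(n1, n2)
    # sig[i, j] = i-th singular value of the j-th Fourier-domain slice.
    sig = np.empty((k, n3))
    for j in range(n3):
        sig[:, j] = np.linalg.svd(Ahat[:, :, j], compute_uv=False)
    # Tube i of the f-diagonal tensor is non-zero iff some entry exceeds tol.
    return int(np.sum(sig.max(axis=1) > tol))

# Example: stacking the same rank-2 matrix as every frontal slice gives
# tubal rank 2, even though the tensor has 5 slices.
M = np.outer([1., 2., 3., 4.], [1., 0., 1.]) + np.outer([0., 1., 0., 1.], [2., 1., 0.])
A = np.repeat(M[:, :, None], 5, axis=2)
print(tubal_rank(A))  # → 2
```

A low tubal rank thus reflects redundancy across the frontal slices jointly (via the Fourier-domain slices), which is why it captures the inter-slice low-rank pattern mentioned above, in contrast to a CP-style rank constraint.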


Authors who are presenting talks have a * after their name.

Back to the full JSM 2019 program