
Abstract Details

Activity Number: 131
Type: Contributed
Date/Time: Monday, August 1, 2016, 8:30 AM to 10:20 AM
Sponsor: Section on Statistical Learning and Data Science
Abstract #321067
Title: Learning Network Dynamics via Regularized Tensor Decomposition
Author(s): Yun-Jhong Wu* and Elizaveta Levina and Ji Zhu
Companies: University of Michigan
Keywords: dynamic networks; non-negative tensor decomposition; under-complete tensor representation; power method
Abstract:

Real networks often evolve over time, and interactions between nodes are usually observed only at specific time points. In this work, we consider network data with time-stamped links. We propose to model such a dynamic network with a low-rank tensor representation, which characterizes the time trends of multiple rank-1 factors and can be used to approximate more complicated networks. We develop an approach to fitting this model based on a tensor completion algorithm and a smoothness penalty in the time domain, implemented with a highly scalable power-iteration-based algorithm that can fit large, sparse dynamic networks. Numerical experiments on simulated data as well as the Enron e-mail dataset demonstrate the potential of tensor methods for dynamic network data analysis.
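The abstract describes the algorithm only at a high level. The following is a minimal Python sketch of one plausible reading of a single rank-1 step of a smoothness-penalized tensor power iteration on a nodes × nodes × time tensor. The penalty form (squared first differences on the time factor), the function and parameter names (`smoothed_rank1`, `alpha`, `n_iter`), and the dense-array representation are assumptions, not the authors' implementation; the non-negativity mentioned in the keywords is not enforced here.

```python
# Hedged sketch: rank-1 tensor power iteration with a temporal smoothness
# penalty. Penalty form, parameter names, and dense storage are assumptions.
import numpy as np

def smoothed_rank1(Y, alpha=1.0, n_iter=50, seed=0):
    """Fit Y (nodes x nodes x time) ~ lam * outer(u, v, w),
    penalizing roughness of the time factor w."""
    n1, n2, T = Y.shape
    rng = np.random.default_rng(seed)
    u = rng.standard_normal(n1); u /= np.linalg.norm(u)
    v = rng.standard_normal(n2); v /= np.linalg.norm(v)
    w = rng.standard_normal(T);  w /= np.linalg.norm(w)

    # First-difference operator for the smoothness penalty alpha * ||D w||^2
    D = np.diff(np.eye(T), axis=0)
    P = np.eye(T) + alpha * D.T @ D   # ridge-type system for the w-update

    for _ in range(n_iter):
        # Standard power-method updates for the node factors
        u = np.einsum('ijt,j,t->i', Y, v, w); u /= np.linalg.norm(u)
        v = np.einsum('ijt,i,t->j', Y, u, w); v /= np.linalg.norm(v)
        # Penalized (smoothed) update for the time factor
        b = np.einsum('ijt,i,j->t', Y, u, v)
        w = np.linalg.solve(P, b); w /= np.linalg.norm(w)

    lam = np.einsum('ijt,i,j,t->', Y, u, v, w)
    return lam, u, v, w
```

In this kind of scheme, additional rank-1 factors could be extracted by deflating the fitted term and repeating; the authors' actual fitting and tensor completion procedure for sparse, partially observed networks may differ.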


Authors who are presenting talks have a * after their name.
