
Abstract Details

Activity Number: 134 - Bayesian Modeling
Type: Contributed
Date/Time: Monday, August 9, 2021 : 1:30 PM to 3:20 PM
Sponsor: Section on Statistical Computing
Abstract #318848
Title: Training Graph Convolutional Networks with Fast Monte Carlo
Author(s): Tianning Xu* and Yifan Chen
Companies: University of Illinois Urbana-Champaign
Keywords: graph convolutional networks; layer-wise sampling; sketching; Monte Carlo
Abstract:

Graph convolutional networks (GCNs) have recently achieved great success on a variety of graph tasks. However, training GCNs on a large graph is computationally intensive. Full-batch GCN training recursively involves the neighbors of each node in every GCN layer; because the nodes are dependent, linear growth in GCN depth can lead to exponential growth in the number of neighbors. To address this computational cost, sampling-based methods have been proposed. Among them, subgraph sampling is sensitive to the graph structure; node-wise sampling still suffers from exponentially growing neighbor size; layer-wise sampling addresses the neighbor-explosion issue through layer-wise importance sampling. We apply sketching as a layer-wise sampling method. The accuracy of sketching-GCN is comparable to that of the original full-batch GCN, while our method is more efficient in both time and memory. Furthermore, existing sampling strategies, including FastGCN and LADIES, can be viewed as special cases of the sketching framework.
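The layer-wise importance sampling that the sketching framework generalizes can be illustrated on a toy one-layer propagation. The sketch below is an illustrative assumption, not the authors' implementation: the dense adjacency, the sample size, and the choice of squared-column-norm probabilities are all hypothetical, in the spirit of FastGCN/LADIES-style samplers.

```python
# Illustrative layer-wise importance sampling for one GCN propagation step:
# approximate the full product A @ H by sampling k columns of A (neighbors)
# with importance probabilities and rescaling to keep the estimate unbiased.
import numpy as np

rng = np.random.default_rng(0)

n, d, k = 100, 16, 10              # nodes, feature dim, sample size per layer
A = rng.random((n, n))
A = (A + A.T) / 2                  # symmetric toy "adjacency" (dense here)
A /= A.sum(axis=1, keepdims=True)  # row-normalize, as in standard GCN setups
H = rng.standard_normal((n, d))    # input node features

# Importance distribution over neighbors: proportional to squared column
# norms of A (one common layer-wise choice; illustrative assumption).
p = (A ** 2).sum(axis=0)
p /= p.sum()

# Sample k neighbors; rescaling by 1 / (k * p) makes the estimator unbiased:
# E[ A[:, idx] @ (H[idx] / (k * p[idx])) ] = A @ H.
idx = rng.choice(n, size=k, replace=True, p=p)
H_next = A[:, idx] @ (H[idx] / (k * p[idx, None]))

# Relative error of the sampled propagation vs. the full-batch product.
err = np.linalg.norm(H_next - A @ H) / np.linalg.norm(A @ H)
```

Only k of the n neighbors enter the matrix product per layer, so cost per layer stays fixed rather than compounding with depth; that is the neighbor-explosion argument in the abstract in miniature.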


Authors who are presenting talks have a * after their name.

Back to the full JSM 2021 program