Abstract Details

Activity Number: 141 - Statistical Understanding of Deep Learning
Type: Invited
Date/Time: Monday, July 29, 2019, 10:30 AM to 12:20 PM
Sponsor: Section on Statistical Learning and Data Science
Abstract: #308006 (Presentation)
Title: Data-Dependent Regularization and Generalization Bounds of Deep Neural Networks
Author(s): Tengyu Ma*
Companies: Stanford University

Existing Rademacher complexity bounds for neural networks rely only on norm control of the weight matrices and depend exponentially on depth via a product of the matrix norms. Lower bounds show that this exponential dependence on depth is unavoidable when no additional properties of the training data are considered. We suspect that this conundrum comes from the fact that these bounds depend on the training data only through the margin. In practice, many data-dependent techniques such as Batchnorm improve the generalization performance. We obtain tighter Rademacher complexity bounds by considering additional data-dependent properties of the network: the sizes of the hidden layers of the network, and the norms of the Jacobians of each layer with respect to the previous layers. Our bounds scale polynomially in depth when these empirical quantities are small, as is usually the case in practice. To obtain these bounds, we develop general tools for making a composition of functions Lipschitz by augmentation and then covering this augmented function. Inspired by our theory, we directly regularize the network's Jacobians during training and empirically demonstrate that this improves test performance.
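The abstract's closing idea, regularizing the network's Jacobians during training, can be illustrated with a minimal sketch. The snippet below is a hypothetical illustration, not the authors' implementation: for a tiny two-layer ReLU network it computes the Jacobian of the output with respect to the hidden activations in closed form and adds its squared Frobenius norm to the loss as a data-dependent penalty. All names (`forward_with_jacobian_penalty`, `lam`) and the toy architecture are assumptions made for this example.

```python
import numpy as np

# Hypothetical two-layer network y = W2 @ relu(W1 @ x).
rng = np.random.default_rng(0)
W1 = rng.normal(size=(4, 3))
W2 = rng.normal(size=(2, 4))

def forward_with_jacobian_penalty(x, lam=0.1):
    """Forward pass plus a data-dependent Jacobian-norm penalty.

    For a ReLU layer, the Jacobian of the output with respect to the
    hidden activations h = relu(W1 @ x) is W2 with the columns zeroed
    wherever the ReLU was inactive -- so the penalty depends on the
    input x, not just on the weight norms.
    """
    h = np.maximum(W1 @ x, 0.0)       # hidden activations
    y = W2 @ h                        # network output
    jac = W2 * (h > 0.0)              # dy/dh: mask inactive columns
    penalty = lam * np.sum(jac ** 2)  # squared Frobenius norm
    return y, penalty

# The penalty would be added to the training loss before backprop.
x = rng.normal(size=3)
y, pen = forward_with_jacobian_penalty(x)
```

Because the mask `(h > 0.0)` changes with the input, the same weight matrices can incur a large or small penalty on different examples, which is the sense in which the bound (and the regularizer) is data-dependent rather than a pure norm bound.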

Authors who are presenting talks have a * after their name.