
All Times EDT

Abstract Details

Activity Number: 105 - Deep Learning and Statistical Modeling with Applications
Type: Invited
Date/Time: Monday, August 3, 2020 : 1:00 PM to 2:50 PM
Sponsor: International Chinese Statistical Association
Abstract #314447
Title: From Classical Statistics to Modern Machine Learning
Author(s): Mikhail Belkin*
Companies: Ohio State University

"A model with zero training error is overfit to the training data and will typically generalize poorly" goes statistical textbook wisdom. Yet, in modern practice over-parametrized deep networks with near perfect fit on training data still show excellent test performance. As I will discuss in the talk, this apparent contradiction is key to understanding the practice of modern deep learning. While classical methods rely on a trade-off balancing the complexity of predictors with training error, modern models are best described by interpolation, where a predictor is chosen among functions that fit the training data exactly, according to a certain (implicit or explicit) inductive bias. Furthermore, classical and modern models can be unified within a single "double descent" risk curve, which extends the classical U-shaped bias-variance curve beyond the point of interpolation. This understanding of model performance delineates the limits of the usual ''what you see is what you get" generalization bounds in machine learning and points to new analyses required to understand computational, statistical, and mathematical properties of modern models.

Authors who are presenting talks have a * after their name.

Back to the full JSM 2020 program