
Abstract Details

Activity Number: 294 - SPEED: Statistical Learning and Data Science Speed Session 2, Part 1
Type: Contributed
Date/Time: Tuesday, July 30, 2019 : 8:30 AM to 10:20 AM
Sponsor: Section on Statistical Learning and Data Science
Abstract #304646
Title: Statistical Optimality of Interpolated Nearest Neighbor Algorithms
Author(s): Yue Xing* and Qifan Song and Guang Cheng
Companies: Purdue University and Purdue University and Purdue Statistics
Keywords: Deep Learning; Interpolation; Nearest Neighbor Algorithm

In the era of deep learning, understanding the over-fitting phenomenon has become increasingly important. It has been observed that carefully designed deep neural networks achieve small testing error even when the training error is close to zero. One possible explanation is that, for many modern machine learning algorithms, over-fitting can greatly reduce the estimation bias while not increasing the estimation variance too much. To illustrate this idea, we prove that the proposed interpolated nearest neighbor algorithm achieves the minimax optimal rate in both the regression and classification regimes, and we observe that it is empirically better than the traditional k-nearest-neighbor method in some cases. Furthermore, this empirical advantage is theoretically justified by a smaller multiplicative constant in front of the minimax rate.
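To make the idea concrete, below is a minimal sketch of an interpolating nearest neighbor regressor. The inverse-distance weighting scheme used here is an assumption for illustration; it is one common way to force the fit through every training point (driving training error to zero), but the paper's exact interpolation weights may differ.

```python
import numpy as np

def interpolated_knn_predict(X_train, y_train, x, k=5):
    """Distance-weighted k-NN prediction at query point x.

    Weights are proportional to 1/distance, so as x approaches a
    training point that point's weight dominates and the estimator
    interpolates the training data (zero training error).
    NOTE: this inverse-distance scheme is an illustrative assumption,
    not necessarily the weighting analyzed in the paper.
    """
    dists = np.linalg.norm(X_train - x, axis=1)
    idx = np.argsort(dists)[:k]          # indices of the k nearest neighbors
    d = dists[idx]
    if np.any(d == 0):                   # exact hit: return the stored label(s)
        return float(np.mean(y_train[idx[d == 0]]))
    w = 1.0 / d                          # inverse-distance interpolation weights
    return float(np.sum(w * y_train[idx]) / np.sum(w))

# Toy usage: noiseless linear data in one dimension.
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(50, 1))
y = 2.0 * X[:, 0]
# At a training point, the interpolated estimator reproduces the label exactly.
print(abs(interpolated_knn_predict(X, y, X[3], k=5) - y[3]) < 1e-12)
```

In contrast, the traditional k-NN estimator averages the k neighbor labels with equal weights, so its training error is generally nonzero; the abstract's point is that the interpolated variant can match the minimax rate despite fitting the training data perfectly.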

Authors who are presenting talks have a * after their name.
