
Abstract Details

Activity Number: 197 - SPAAC Poster Competition
Type: Topic Contributed
Date/Time: Monday, August 8, 2022, 2:00 PM to 3:50 PM EDT
Sponsor: Section on Statistical Learning and Data Science
Abstract #322136
Title: TNN: A Transfer Learning Classifier Based on Weighted Nearest Neighbors
Author(s): Haiyang Sheng* and Guan Yu
Companies: University at Buffalo and University of Pittsburgh
Keywords: Binary Classification; Minimax Optimal; Nearest Neighbor; Non-parametric Classification; Transfer Learning
Abstract:

Weighted nearest neighbor (WNN) classifiers are popular non-parametric classifiers. In many real applications, it can be difficult to obtain training samples from the distribution of interest. We propose a novel Transfer learning weighted Nearest Neighbors (TNN) classifier that can incorporate both training samples from the distribution of interest and supplementary training samples from a different but related distribution. As a WNN classifier, TNN determines the weights adaptively for each test sample by minimizing the worst-case upper bound on the conditional expectation of the estimation error of the regression function, assigning smaller weights to more distant neighbors. To accommodate the difference between the two distributions, TNN adds a non-negative offset distance to the training samples not drawn from the distribution of interest, which effectively down-weights them. Our theoretical studies show that, under certain conditions, TNN is consistent and minimax optimal (up to a logarithmic term) under covariate shift. Under posterior drift or the more general transfer learning setting, the excess risk of TNN depends on the maximum discrepancy between the two distributions.
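The abstract describes the TNN construction only at a high level. The sketch below is a rough illustration, not the authors' implementation: it shows how an offset-adjusted weighted nearest neighbor prediction could look for a single test point. The function name `tnn_predict`, the fixed neighborhood size `k`, the rank-based weight scheme, and the example `offset` value are all assumptions for illustration; in the paper the weights are chosen adaptively per test sample by minimizing a worst-case error bound, which is not reproduced here.

```python
import numpy as np

def tnn_predict(x, X_target, y_target, X_source, y_source, k=10, offset=0.5):
    """Illustrative TNN-style prediction for one test point x.

    X_target/y_target: samples from the distribution of interest.
    X_source/y_source: supplementary samples from a related distribution.
    offset: hypothetical non-negative offset added to source-sample
            distances, which down-weights the supplementary samples.
    Labels are assumed to be 0/1 (binary classification).
    """
    # Distances from x to target samples (no offset).
    d_target = np.linalg.norm(X_target - x, axis=1)
    # Distances to source samples, inflated by the non-negative offset.
    d_source = np.linalg.norm(X_source - x, axis=1) + offset

    dist = np.concatenate([d_target, d_source])
    labels = np.concatenate([y_target, y_source])

    # Take the k nearest neighbors under the offset-adjusted distances.
    order = np.argsort(dist)[:k]

    # Placeholder weight scheme: weights decrease with neighbor rank.
    # (The paper selects weights adaptively for each test sample by
    # minimizing a worst-case error bound; that step is omitted here.)
    w = np.arange(k, 0, -1, dtype=float)
    w /= w.sum()

    # Weighted estimate of the regression function, thresholded at 1/2.
    eta_hat = np.dot(w, labels[order])
    return int(eta_hat >= 0.5)
```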


Authors who are presenting talks have a * after their name.
