
All Times EDT

Abstract Details

Activity Number: 408 - Recent Advances in Statistical Machine Learning
Type: Topic Contributed
Date/Time: Wednesday, August 10, 2022 : 10:30 AM to 12:20 PM
Sponsor: Section for Statistical Programmers and Analysts
Abstract #320826
Title: Locally Weighted Nearest Neighbor Classifier
Author(s): Guan Yu* and Xingye Qiao
Companies: University of Pittsburgh and Binghamton University
Keywords: Binary Classification; Non-parametric Classification; Excess Risk; Minimax Rate; Weighted Nearest Neighbor classifier
Abstract:

Weighted nearest neighbor (WNN) classifiers are fundamental non-parametric classifiers. There is considerable flexibility in the choice of weights for the neighbors in a WNN classifier. In this talk, we will introduce a new locally weighted nearest neighbor (LWNN) classifier which adaptively assigns weights for different test data points. Given a training data set and a test data point x0, the weights for classifying x0 in LWNN are obtained by minimizing an upper bound on the estimation error of the regression function at x0. Like most other WNN classifiers, LWNN places larger weights on closer neighbors. However, in addition to the ranks of the neighbors' distances, the weights in LWNN also depend on the raw values of the distances. Our theoretical study shows that LWNN is minimax optimal when the marginal feature density is bounded away from zero. In the general case, with an additional tail assumption on the feature density, the upper bound on the excess risk of LWNN matches the minimax lower bound up to a logarithmic factor. Our numerical comparisons between LWNN and several competitors further demonstrate its effectiveness.
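To illustrate the general WNN framework the abstract builds on, the sketch below implements a weighted nearest neighbor prediction at a single test point, with weights that depend on the raw neighbor distances rather than only their ranks. The exponential-decay weighting and the `bandwidth` parameter are hypothetical choices for illustration; the actual LWNN weights, obtained by minimizing an error bound at x0, are not specified in this abstract.

```python
import numpy as np

def wnn_predict(X_train, y_train, x0, k=3, bandwidth=1.0):
    """Weighted nearest neighbor prediction for binary labels in {0, 1}.

    The weights below use a hypothetical exponential decay in the raw
    distances (an illustrative choice, not the optimized LWNN weights).
    """
    # Raw Euclidean distances from the test point to all training points
    dists = np.linalg.norm(X_train - x0, axis=1)
    # Indices of the k nearest neighbors
    idx = np.argsort(dists)[:k]
    # Weights depend on the raw distance values, not just the ranks
    w = np.exp(-dists[idx] / bandwidth)
    w /= w.sum()
    # Weighted estimate of P(Y = 1 | x0), thresholded at 1/2
    score = np.dot(w, y_train[idx])
    return int(score >= 0.5)
```

Because the weights use the distances themselves, two test points whose neighbors have the same ranks but different raw distances can receive different weight profiles, which is the local adaptivity the abstract emphasizes.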


Authors who are presenting talks have a * after their name.

Back to the full JSM 2022 program