JSM 2005 - Minneapolis

Abstract #302565

This is the preliminary program for the 2005 Joint Statistical Meetings in Minneapolis, Minnesota. Currently included in this program are the technical program (the schedule of invited, topic-contributed, regular contributed, and poster sessions); Continuing Education courses (August 7-10, 2005); and committee and business meetings. This online program will be updated frequently to reflect the most current revisions.




The views expressed here are those of the individual authors
and not necessarily those of the ASA or its board, officers, or staff.


The program labels each meeting room with a letter code preceding the room name, designating the facility in which the room is located:

Minneapolis Convention Center = "MCC"
Hilton Minneapolis Hotel = "H"
Hyatt Regency Minneapolis = "HY"




Activity Number: 301
Type: Invited
Date/Time: Tuesday, August 9, 2005 : 2:00 PM to 3:50 PM
Sponsor: Section on Statistics and the Environment
Abstract - #302565
Title: Exact Bagging k-NN Predictors of Continuous Variables
Author(s): David Patterson*+ and Brian Steele
Companies: University of Montana and University of Montana
Address: Department of Mathematical Sciences, Missoula, MT 59812
Keywords:
Abstract:

This paper discusses two bagging extensions of k-nearest neighbor (k-NN) estimators for continuous random variables. The setting is nearly the same as for classification problems: covariates are observed on all observations, but one or more additional variables of interest are unobserved, and the objective is to predict the unobserved, or target, values from the covariates. A training set is available in which all variables are observed, whereas only the covariates are observed on the population. The first of these new predictors corresponds to drawing infinitely many bootstrap samples from the training set, finding in each bootstrap sample the k training-set points nearest (in the covariate space) to a point of interest, averaging their target values, and then averaging the bootstrap means. The second estimator is a local regression estimator based on the theoretical bootstrap expectation of a least squares estimator constructed from the k nearest neighbors. These estimators are illustrated with an example in which remotely sensed covariates are used to predict forest variables.
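The first predictor described above can be sketched in code as a Monte Carlo approximation: a finite bootstrap loop rather than the exact, infinite-bootstrap form that is the paper's actual contribution. The function name, defaults, and use of Euclidean distance below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def bagged_knn_predict(X_train, y_train, x0, k=3, n_boot=2000, rng=None):
    """Monte Carlo approximation of the bagged k-NN prediction at x0.

    Draws bootstrap samples of the training set, finds the k nearest
    neighbors of x0 within each sample, averages their target values,
    and then averages those bootstrap means.
    """
    rng = np.random.default_rng(rng)
    n = len(y_train)
    # Distances from the query point to every training observation.
    dists = np.linalg.norm(X_train - x0, axis=1)
    means = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.integers(0, n, size=n)       # bootstrap sample (with replacement)
        order = np.argsort(dists[idx])[:k]     # k nearest neighbors within the sample
        means[b] = y_train[idx[order]].mean()  # bootstrap mean of neighbor targets
    return means.mean()                        # average over bootstrap samples
```

As n_boot grows, this average converges to the exact bagged predictor that the paper derives in closed form, without any resampling.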


  • The address information is for the authors who have a + after their name.
  • Authors who are presenting talks have a * after their name.


JSM 2005: For information, contact jsm@amstat.org or phone (888) 231-3473. If you have questions about the Continuing Education program, please contact the Education Department.
Revised March 2005