JSM 2014 Abstract Details

Activity Number: 251
Type: Contributed
Date/Time: Monday, August 4, 2014, 2:00 PM to 3:50 PM
Sponsor: Section on Statistical Learning and Data Mining
Abstract #312422

Title: Training a Classifier for Optimal Classification Error
Author(s): Frans H.J. Kanfer*+, Ryno Potgieter, and Sollie Millard
Companies: University of Pretoria (all authors)
Keywords: classification; prescribed misclassification rate; sequential training; limited training cases

Abstract:
Training a classifier to a predetermined level can be achieved by prescribing the misclassification rate and the certainty with which that rate is to be attained. A sequential training procedure has been introduced in the literature that trains a selected classifier to such requirements, with the additional advantage of limiting the number of training cases required. A shortfall of the procedure is that it does not account for infeasible specifications. This paper presents a sequential procedure that estimates the best feasible misclassification rate at a prescribed level of accuracy. It can be applied to any classification algorithm and limits the training cases as required by the prescribed specification. Simulation results are presented for LDA and KNN classification. A microarray data application is also discussed.
Authors who are presenting talks have a * after their name.
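The abstract gives no pseudocode, so the following is only a minimal sketch of the general idea: grow the training set sequentially, estimate the misclassification rate on held-out data, and stop once a normal-approximation confidence bound shows the prescribed rate is met (or report the best feasible rate otherwise). The classifier plug-ins, function names (`sequential_train`, `fit`, `predict`), the batch-growth scheme, and the stopping rule are all illustrative assumptions, not the authors' exact procedure.

```python
import math
import random
from statistics import NormalDist

def sequential_train(train_pool, val_set, fit, predict,
                     target_error=0.10, confidence=0.95, batch=25):
    """Grow the training set batch by batch until the estimated
    misclassification rate meets the prescribed target with the
    required certainty; otherwise report the best feasible rate."""
    z = NormalDist().inv_cdf((1 + confidence) / 2)
    used, best = [], 1.0
    for start in range(0, len(train_pool), batch):
        used.extend(train_pool[start:start + batch])
        model = fit(used)
        n = len(val_set)
        p = sum(predict(model, x) != y for x, y in val_set) / n
        half = z * math.sqrt(p * (1 - p) / n)  # normal-approximation CI half-width
        best = min(best, p)
        if p + half <= target_error:           # prescribed rate met with certainty
            return p, len(used), True
    return best, len(used), False              # specification infeasible on this pool

# Illustrative use with a 1-NN classifier on well-separated 1-D classes.
random.seed(0)
pool = [(random.gauss(0, 1), 0) for _ in range(100)] + \
       [(random.gauss(6, 1), 1) for _ in range(100)]
random.shuffle(pool)
val = [(random.gauss(0, 1), 0) for _ in range(100)] + \
      [(random.gauss(6, 1), 1) for _ in range(100)]

fit = lambda cases: cases                                    # 1-NN just stores cases
predict = lambda m, x: min(m, key=lambda c: abs(c[0] - x))[1]

rate, n_used, ok = sequential_train(pool, val, fit, predict)
```

Because training stops as soon as the confidence bound clears the target, easy problems consume only a fraction of the pool, which mirrors the abstract's point about limiting the number of training cases.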