JSM 2003 Abstract #301909
Activity Number: 159
Type: Invited
Date/Time: Monday, August 4, 2003, 2:00 PM to 3:50 PM
Sponsor: IMS
Abstract - #301909
Title: Basis Pursuit Support Vector Machines
Author(s): Hao Helen Zhang*+
Companies: North Carolina State University
Address: Dept. of Statistics, Raleigh, NC 27695-0001
Keywords: support vector machines ; basis pursuit ; variable selection ; smoothing spline ; ANOVA ; classification
Abstract:

In many classification problems, not all attributes are required for making predictions, and accuracy may improve when irrelevant or redundant inputs are removed. The basis pursuit support vector machine (BPSVM) is proposed to select important covariates while simultaneously training the classification rule, yielding a compact and accurate classifier based only on the important variables. In the setting of a tensor-product reproducing kernel Hilbert space (RKHS), the target function is decomposed into a sum of functional components, each represented by appropriate basis functions. BPSVM minimizes the generalized comparative Kullback-Leibler distance, regularized with the l_1 penalty on the coefficients of the basis terms. The BPSVM solution is sparse in both samples and variables; when the tuning parameters are chosen appropriately and the RKHS is rich enough, the BPSVM estimate approaches the Bayes classification rule. Numerically, BPSVM is easily implemented by solving a constrained linear programming problem. Simulations and real examples show that BPSVM is a promising tool for a wide range of applications.
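
The abstract does not include code; as a rough illustration of the linear-programming formulation, the following minimal Python sketch fits an l_1-penalized SVM over a fixed set of basis functions. It uses the hinge loss as a stand-in for the generalized comparative Kullback-Leibler distance, a plain basis matrix B in place of the tensor-product RKHS decomposition, and a hypothetical tuning parameter lam; it is not the authors' BPSVM implementation.

# Minimal sketch of an l_1-penalized SVM fit via linear programming.
# NOT the authors' BPSVM code: hinge loss stands in for the generalized
# comparative Kullback-Leibler distance, and a user-supplied basis
# matrix B replaces the tensor-product RKHS decomposition. The names
# fit_l1_svm and lam are hypothetical.
import numpy as np
from scipy.optimize import linprog

def fit_l1_svm(B, y, lam=0.1):
    """B: (n, p) basis-function evaluations; y: labels in {-1, +1}.

    Solves  min_{b, c}  (1/n) sum_i max(0, 1 - y_i (b + B_i c)) + lam ||c||_1
    as a linear program, splitting c = c_plus - c_minus and adding one
    nonnegative slack variable xi_i per hinge term.
    """
    B = np.asarray(B, dtype=float)
    y = np.asarray(y, dtype=float)
    n, p = B.shape
    # Variable layout: [b_plus, b_minus, c_plus (p), c_minus (p), xi (n)]
    n_var = 2 + 2 * p + n
    cost = np.concatenate([[0.0, 0.0],
                           lam * np.ones(p), lam * np.ones(p),
                           np.ones(n) / n])
    # Margin constraints y_i (b + B_i c) + xi_i >= 1,
    # rewritten as -y_i b - y_i B_i c - xi_i <= -1.
    A_ub = np.zeros((n, n_var))
    A_ub[:, 0] = -y
    A_ub[:, 1] = y
    A_ub[:, 2:2 + p] = -y[:, None] * B
    A_ub[:, 2 + p:2 + 2 * p] = y[:, None] * B
    A_ub[:, 2 + 2 * p:] = -np.eye(n)
    b_ub = -np.ones(n)
    res = linprog(cost, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(0, None)] * n_var, method="highs")
    z = res.x
    b = z[0] - z[1]
    c = z[2:2 + p] - z[2 + p:2 + 2 * p]
    return b, c  # many entries of c come back (near) zero

Because the penalty is the l_1 norm of the basis coefficients, the linear program typically returns exact zeros for basis terms tied to unhelpful inputs, which is the kind of variable sparsity the abstract describes.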


  • The address information is for authors who have a + after their name.
  • Authors who are presenting talks have a * after their name.

