JSM 2011 Online Program

The views expressed here are those of the individual authors and not necessarily those of the JSM sponsors, their officers, or their staff.

Abstract Details

Activity Number: 341
Type: Contributed
Date/Time: Tuesday, August 2, 2011: 10:30 AM to 12:20 PM
Sponsor: Section on Statistical Learning and Data Mining
Abstract - #301616
Title: Naturally Efficient Sparsity Tuner for Kernel Regression
Author(s): Ernest Fokoue*+
Companies: Rochester Institute of Technology
Address: 98 Lomb Memorial Drive, Rochester, NY, 14623, USA
Keywords: Regression; Information Matrix; Sparsity; Structured Prior Matrix; Kernel; Support Points
Abstract:

We propose a novel approach to achieving sparse representation in kernel regression through a straightforward algorithm that consists of a refinement of the maximum a posteriori (MAP) estimator of the weights of the kernel expansion. Our proposed method combines structured prior matrices with functions of the information matrix to zero in on a very sparse representation. We show computationally that our naturally efficient sparsity tuner (NEST) achieves a very sparse and predictively accurate estimator of the underlying function for a variety of choices of the covariance matrix of our Gaussian prior over the weights of the kernel expansion. Our computational comparisons on both artificial and real examples show that our method competes very well, usually favorably, with the Support Vector Machine, the Relevance Vector Machine, and Gaussian Process regressors.
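The abstract does not spell out the NEST algorithm itself, but the starting point it refines, the MAP estimator of the kernel-expansion weights under a Gaussian prior, has a standard closed form. The sketch below is an illustration only, assuming an RBF kernel, a model y = Kw + noise with noise variance sigma2, and a prior w ~ N(0, S), which gives w_MAP = (K^T K + sigma2 * S^{-1})^{-1} K^T y. The pruning step shown is a hypothetical stand-in for the prior/information-matrix refinement described in the abstract, whose details are not given here.

```python
import numpy as np

def rbf_kernel(X1, X2, gamma=1.0):
    # Gaussian (RBF) kernel matrix between two sets of points.
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def map_kernel_weights(X, y, gamma=1.0, sigma2=0.1, prior_cov=None):
    """MAP estimate of kernel-expansion weights under a Gaussian prior.

    Model: y = K w + eps, eps ~ N(0, sigma2 I), prior w ~ N(0, S).
    MAP solution: w = (K^T K + sigma2 * S^{-1})^{-1} K^T y.
    prior_cov=None defaults S to the identity (ridge-style shrinkage);
    a structured prior matrix would be supplied here instead.
    """
    n = len(y)
    K = rbf_kernel(X, X, gamma)
    S_inv = np.linalg.inv(prior_cov) if prior_cov is not None else np.eye(n)
    w = np.linalg.solve(K.T @ K + sigma2 * S_inv, K.T @ y)
    return w, K

def prune_small_weights(w, rel_tol=1e-2):
    # Hypothetical sparsification step: keep only weights whose magnitude
    # exceeds rel_tol times the largest weight; the retained training
    # points play the role of the "support points" mentioned in the keywords.
    return np.abs(w) > rel_tol * np.abs(w).max()

# Toy 1-D regression example.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(60, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(60)

w, K = map_kernel_weights(X, y, gamma=0.5, sigma2=0.1)
keep = prune_small_weights(w)
y_hat = K[:, keep] @ w[keep]  # sparse predictor using retained points only
print(f"support points retained: {keep.sum()} / {len(w)}")
```

Swapping the identity for a structured covariance S changes the shrinkage pattern on the weights, which is the knob the abstract's method exploits to induce sparsity.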


The address information is for the authors that have a + after their name.
Authors who are presenting talks have a * after their name.


