
JSM 2012 Online Program

The views expressed here are those of the individual authors and not necessarily those of the JSM sponsors, their officers, or their staff.


Abstract Details

Activity Number: 338
Type: Contributed
Date/Time: Tuesday, July 31, 2012, 10:30 AM to 12:20 PM
Sponsor: Section on Statistical Learning and Data Mining
Abstract - #305061
Title: Learning with Multiple Experts: Sparsity and Model Selection
Author(s): Rafael Izbicki*+ and Rafael Bassi Stern
Companies: Carnegie Mellon University and Carnegie Mellon University
Address: 2304 Murray Avenue, Pittsburgh, PA, 15217, United States
Keywords: Multiple Experts ; Crowdsourcing ; Sparsity ; Model Selection ; Identifiability ; Expectation Maximization

In many situations, only noisy labels are available for the data: for example, in systems such as Amazon Mechanical Turk, or in diagnostic tests that are cheaper but less accurate than a given gold standard. Recently, several models have been proposed to predict future labels in this setting. However, model selection techniques such as empirical risk minimization cannot be used, since the true labels are never observed. We propose a model selection method that can be applied to noisy labels, and we provide theoretical guarantees for it. We also introduce sparsity in a class of models and show its importance in real examples. We use our model selection tool to choose the tuning parameter that induces sparsity. Finally, this methodology is compared to others on several sets of simulated and real data.
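To make the "learning with noisy labels from multiple experts" setting concrete, the sketch below aggregates binary labels from several annotators with an EM algorithm in the style of Dawid and Skene. This is an illustrative stand-in, not the authors' model: the sparsity-inducing penalty and the model selection criterion described in the abstract are omitted, and all function names and the simulation parameters are hypothetical.

```python
import numpy as np

def em_noisy_labels(L, n_iter=50):
    """Minimal EM sketch for binary labels from multiple noisy experts.

    L: (n_items, n_experts) array of 0/1 labels.
    Returns the posterior probability that each true label is 1 and the
    estimated accuracy of each expert. Each expert j is assumed to report
    the true label with a single symmetric accuracy acc[j].
    """
    n, m = L.shape
    acc = np.full(m, 0.7)   # initial expert accuracies (> 0.5 avoids label flipping)
    prior = 0.5             # initial class prior P(y = 1)
    for _ in range(n_iter):
        # E-step: posterior over the hidden true label given current accuracies
        log1 = np.log(prior) + (L * np.log(acc) + (1 - L) * np.log(1 - acc)).sum(axis=1)
        log0 = np.log(1 - prior) + ((1 - L) * np.log(acc) + L * np.log(1 - acc)).sum(axis=1)
        p = 1.0 / (1.0 + np.exp(log0 - log1))
        # M-step: expected fraction of items each expert labeled correctly
        acc = (p[:, None] * L + (1 - p)[:, None] * (1 - L)).mean(axis=0)
        acc = np.clip(acc, 1e-3, 1 - 1e-3)
        prior = np.clip(p.mean(), 1e-3, 1 - 1e-3)
    return p, acc

# Simulated example: three experts with different (hidden) accuracies.
rng = np.random.default_rng(0)
y = rng.integers(0, 2, size=200)                 # hidden true labels
true_acc = np.array([0.9, 0.8, 0.6])             # assumed expert accuracies
L = np.array([[yi if rng.random() < a else 1 - yi for a in true_acc] for yi in y])
p, acc = em_noisy_labels(L)
pred = (p > 0.5).astype(int)
```

Because the true labels are never observed, EM estimates the experts' accuracies and the hidden labels jointly; this is also why, as the abstract notes, empirical-risk-based model selection is unavailable and a different criterion is needed.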

The address information is for the authors who have a + after their name.
Authors who are presenting talks have a * after their name.



For information, contact jsm@amstat.org or phone (888) 231-3473.

If you have questions about the Continuing Education program, please contact the Education Department.