
Abstract Details

Activity Number: 321 - Modern Statistical Learning for Ranking and Crowdsourcing
Type: Topic Contributed
Date/Time: Tuesday, August 1, 2017, 10:30 AM to 12:20 PM
Sponsor: Section on Statistical Learning and Data Science
Abstract #322608
Title: Optimal Stopping and Worker Selection in Crowdsourcing: An Adaptive Sequential Probability Ratio Test Framework
Author(s): Xi Chen and Xiaoou Li and Jingchen Liu and Zhiliang Ying* and Yunxiao Chen
Companies: NYU and University of Minnesota Twin Cities and Columbia University and Columbia University and Emory University
Keywords: crowdsourcing; sequential probability ratio test; adaptive control; empirical Bayes
Abstract:

In this talk, we propose an adaptive sequential probability ratio test (Ada-SPRT) that obtains the optimal experiment selection rule, stopping time, and final decision rule under a single Bayesian decision framework. Our motivating application comes from binary labeling tasks in crowdsourcing, where the requester must simultaneously decide which worker to ask for a label and when to stop collecting labels in order to save budget. We characterize the structure of the optimal adaptive sequential design that minimizes the Bayes risk through the log-likelihood ratio statistic and develop dynamic-programming-based algorithms for both non-truncated and truncated tests. We further propose an empirical Bayes approach for estimating the class prior and an EM algorithm for estimating workers' quality. This is joint work with Xiaoou Li, Yunxiao Chen, Jingchen Liu, and Zhiliang Ying.
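For readers unfamiliar with SPRT-style stopping, the following minimal Python sketch illustrates the general idea for a single binary labeling task. It is not the Ada-SPRT of the talk: it assumes known worker accuracies, a hypothetical get_label interface for querying a worker, Wald's classical thresholds in place of the Bayes-risk-optimal design, and a simple greedy worker-selection rule rather than the optimal adaptive selection described above.

    import math

    def sprt_crowd_label(worker_accuracies, get_label, alpha=0.05, beta=0.05, max_queries=50):
        """Illustrative SPRT-style stopping rule for one binary labeling task.

        worker_accuracies: assumed probabilities p_j that worker j labels correctly.
        get_label: callable(worker_index) -> 0/1 label (hypothetical interface).
        Stops when the log-likelihood ratio crosses Wald's thresholds
        or when the query budget (truncation point) is reached.
        """
        log_A = math.log((1.0 - beta) / alpha)   # upper boundary: decide label = 1
        log_B = math.log(beta / (1.0 - alpha))   # lower boundary: decide label = 0
        llr = 0.0
        # Greedy stand-in for adaptive selection: query the most accurate workers first.
        order = sorted(range(len(worker_accuracies)), key=lambda j: -worker_accuracies[j])
        for t, j in enumerate(order[:max_queries]):
            p = worker_accuracies[j]
            y = get_label(j)
            # Log-likelihood ratio contribution of worker j's label.
            llr += math.log(p / (1.0 - p)) if y == 1 else math.log((1.0 - p) / p)
            if llr >= log_A:
                return 1, t + 1
            if llr <= log_B:
                return 0, t + 1
        # Truncated test: forced decision when the budget is exhausted.
        return (1 if llr >= 0 else 0), min(len(order), max_queries)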


Authors who are presenting talks have a * after their name.

