Saturday, November 12
Pretesting Methods
Data Quality and Measurement Error
Sat, Nov 12, 9:00 AM - 10:25 AM
Regency Ballroom-Monroe
New Approaches to Questionnaire Design and Evaluation

Dynamic Question Ordering in Online Surveys (303117)

*Kirstin Early, Carnegie Mellon University 
Stephen Fienberg, Carnegie Mellon University 
Jennifer Mankoff, Carnegie Mellon University 

Keywords: online surveys, adaptive survey design, dynamic question ordering

Survey response rates have been falling, leading to results that do not represent the full population. Unlike paper surveys, online surveys can support adaptive questions, where later questions depend on earlier responses. Past work has taken a rule-based approach, applied uniformly across all respondents. We envision a richer interpretation of adaptive questions, where question order is dynamic and personalized to the individual, depending on their previous answers. Such a dynamic question-ordering approach could increase engagement, and therefore response rate, as well as imputation quality for missing values. We present a general framework for dynamically ordering questions based on previous responses to engage respondents, improving survey completion and imputation of unknown items. Our work considers two scenarios for data collection. In the first, we want to maximize survey completion (and imputation quality), so we focus on ordering questions to engage the respondent and ideally collect all the information we seek, or at least the information that most characterizes the respondent, so that imputed values will be accurate. In the second scenario, our goal is to give the respondent a personalized prediction based on the information they provide. Since it is possible to give a reasonable prediction with only a subset of questions, we are not concerned with motivating the user to answer all questions. Instead, we want to order questions so that the user provides the information that most reduces the uncertainty of our prediction, while not being too burdensome to answer. We illustrate applications of this framework with examples from the Survey of Income and Program Participation, where we want to maximize response rate and imputation quality for unanswered items, and from the Residential Energy Consumption Survey, where we provide personalized energy estimates to prospective tenants.
We also consider connections between our statistics-based question-ordering approach and cognitive survey methodology.
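The second scenario above — ordering questions by how much each one reduces prediction uncertainty, weighted against respondent burden — can be sketched as a greedy selection rule. The sketch below is illustrative only, not the authors' implementation: it assumes questions and the predicted quantity are jointly Gaussian with a known covariance, so that conditioning on answered questions shrinks the target's posterior variance in closed form. All function names, the toy covariance, and the per-question costs are hypothetical.

```python
import numpy as np

def target_variance(Sigma, answered, target):
    """Posterior variance of the target after conditioning on the
    answered questions, under a jointly Gaussian model (illustrative
    assumption, not from the abstract)."""
    if not answered:
        return Sigma[target, target]
    A = sorted(answered)
    S_aa = Sigma[np.ix_(A, A)]          # covariance among answered questions
    S_ta = Sigma[target, A]             # covariance of target with answers
    # Gaussian conditioning: var(y | x_A) = var(y) - S_ta S_aa^{-1} S_ta^T
    return Sigma[target, target] - S_ta @ np.linalg.solve(S_aa, S_ta)

def next_question(Sigma, answered, target, costs):
    """Greedily pick the unanswered question with the largest expected
    variance reduction per unit of respondent burden (cost)."""
    base = target_variance(Sigma, answered, target)
    best, best_score = None, -np.inf
    for q in range(Sigma.shape[0]):
        if q == target or q in answered:
            continue
        reduction = base - target_variance(Sigma, answered | {q}, target)
        score = reduction / costs[q]    # uncertainty reduction per burden
        if score > best_score:
            best, best_score = q, score
    return best

# Toy example: three questions (indices 0-2) and a target (index 3).
# Question 0 correlates most strongly with the target.
Sigma = np.array([
    [1.0, 0.0, 0.0, 0.9],
    [0.0, 1.0, 0.0, 0.3],
    [0.0, 0.0, 1.0, 0.1],
    [0.9, 0.3, 0.1, 1.0],
])
costs = [1.0, 1.0, 1.0]

first = next_question(Sigma, set(), target=3, costs=costs)  # picks question 0
```

Under equal costs this reduces to asking the most informative question first; unequal costs let the ordering trade information for burden, as the abstract describes.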