Learning from Mouse Movements: Improving Questionnaires and Respondents’ User Experience Through Passive Data Collection (303161)
Felix Henninger, University of Koblenz-Landau/University of Mannheim
*Rachel Horwitz, U.S. Census Bureau
Florian Keusch, University of Mannheim
Pascal J. Kieslich, University of Mannheim
Frauke Kreuter, University of Maryland/University of Mannheim/IAB
Malte Schierholz, Institute for Employment Research
Keywords: Web surveys, Mouse tracking, Responsive design, Data quality, Questionnaire design
Web surveys have become a standard, and often preferred, mode of data collection because of their lower costs compared to other modes and because the underlying technology is far more adaptable. This technology is often used to guide respondents through a survey, but it can also be used to target specific respondents who may need assistance. One source of data that has been used to identify such respondents is mouse movements. Mouse movements have been used to measure interest and uncertainty in e-learning and web design, and to measure data quality and respondent difficulty in surveys. Specifically, researchers have measured the total distance traveled, the cursor’s trajectory, and specific patterns of movement. However, in survey research, only the total distance traveled has been automated.

The current study aims to develop automated procedures for detecting and quantifying difficulty indicators in web studies, building on the distance traveled, the shape of the cursor’s trajectory, and the specific movement patterns identified in previous work. In addition, the study draws on recent methodological advances in psychology that propose mouse-tracking measures for assessing tentative commitments to, and conflict between, response alternatives.

In the study presented here, we collect and log participants’ mouse movements as they complete an online survey. The survey includes factual, opinion, and problem-solving questions with a variety of response formats, such as radio buttons and sliders. This variety of question types and formats allows us to assess the degree to which pre-defined movement patterns predict difficulty in each format and to determine whether specific movements are more relevant for particular question types. Additionally, we experimentally vary the difficulty and complexity of items between respondents to show how complexity affects behavior, and we draw on administrative data to measure how uncertainty affects response accuracy.
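As an illustration of the one measure the abstract notes is already automated, the sketch below computes total distance traveled as the summed Euclidean length of the cursor's logged path. This is a minimal example under assumed inputs: the function name `path_length` and the list-of-(x, y)-tuples logging format are illustrative, not the study's actual instrumentation.

```python
import math

def path_length(samples):
    """Total Euclidean distance traveled by the cursor.

    samples: list of (x, y) cursor positions in logging order.
    (Illustrative format; the study's actual logging schema is not specified.)
    """
    return sum(
        math.hypot(x2 - x1, y2 - y1)
        for (x1, y1), (x2, y2) in zip(samples, samples[1:])
    )

# A direct path to a response option versus a detour toward a
# non-chosen option: the detour yields a longer total distance,
# which prior work treats as a potential difficulty indicator.
direct = [(0, 0), (0, 100)]
detour = [(0, 0), (80, 50), (0, 100)]
print(path_length(direct))  # 100.0
print(path_length(detour))  # ~188.7
```

Trajectory-shape and pattern-based indicators, which the study aims to automate as well, would build on the same logged coordinate stream but require richer features than this scalar summary.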