A Comparison of Cognitive Testing Methods and Sources: In-Person Versus Online Nonprobability and Probability Methods (303312)
Jennifer Childs, U.S. Census Bureau
Aleia Clark Fobia, U.S. Census Bureau
*Jessica L. Holzberg, U.S. Census Bureau
Gerson Morales, U.S. Census Bureau
Keywords: crowdsourcing, cognitive interviewing, respondent demographics, unmoderated pretesting
Previous research has investigated whether survey researchers can obtain valuable cognitive testing feedback using unmoderated, online crowdsourcing services such as Amazon Mechanical Turk. Feedback from these crowdsourced cognitive interviews can be obtained quickly and inexpensively, but it may vary in usefulness and completeness. Without the ability to probe spontaneously, valuable information can be lost. Despite these limitations, crowdsourced cognitive interviews show promise as a component of a broader pretesting strategy.
Research comparing feedback received from different crowdsourcing services, such as Amazon Mechanical Turk, TryMyUI, and Facebook, has also been conducted. This paper furthers this work by examining results from four different data sources, including two that have not yet been explored: a nonprobability "affinity" panel sample of users who opted in to participate in research with the U.S. Census Bureau, and a probability sample of email addresses from the Census Bureau's contact frame. The contact frame comprises email addresses and phone numbers purchased from five commercial data vendors from 2010 to 2015 and matched to addresses. While the coverage of the contact frame is imperfect, using the email addresses on this frame allows us to reach people who would not otherwise participate in this type of research.
We will present similarities and differences in respondent demographics and cognitive interview findings on statements about respondent privacy, confidentiality, and protections from the following sources:
1. In-person, traditional cognitive interviews
2. Online, unmoderated cognitive testing
   2a. Nonprobability affinity panel sample (Census Bureau panel)
   2b. Nonprobability crowdsourcing sample (Amazon Mechanical Turk)
   2c. Probability sample (Census Bureau contact frame)