Thursday, November 10
Questionnaire Design
Thu, Nov 10, 1:30 PM - 2:55 PM
Regency Ballroom-Monroe
Tackling Response Burden: What Can Questionnaire Designers Do?

Salience of Survey Burden and Its Effects on Response Behavior (303137)

Stephanie Eckman, RTI International 
*Frauke Kreuter, University of Maryland/University of Mannheim/IAB 
Roger Tourangeau, Westat 

Keywords: Filter Questions, Response Burden, Motivated Underreporting, Measurement Error, Visual Display, Web Surveys

Survey questionnaires are often full of skip patterns, which allow respondents to skip over sections or follow-up questions that do not apply to them and thus proceed through the interview faster. As survey designers try to create short modularized questionnaires in response to survey-taking on mobile devices, features such as skip patterns and filter questions become more and more important. Once respondents learn how such questions work, however, they may give false answers to the filter questions to avoid the follow-ups. To combat this behavior, some surveys ask the filter questions in a block early in the questionnaire and administer the follow-up questions later on, rather than interweaving the filter and follow-up questions. Filter items are more likely to elicit answers that trigger follow-ups when they are asked in a block, before any follow-up questions, than when the filter and follow-up questions are interleafed. Eckman et al. showed that this effect is indeed due to a tendency by respondents to avoid response burden, a phenomenon they called motivated underreporting, and also demonstrated that answers to the filter questions tend to be more accurate when they are asked in a group.
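The contrast between the two designs can be made concrete with a minimal sketch (hypothetical questions and function names, not the authors' instrument): in the interleafed format, each "yes" to a filter immediately triggers its follow-ups, so respondents can learn that "no" is the faster path; in the grouped format, all filters are asked before any follow-up appears.

```python
def administer_interleafed(filters, followups, answers):
    """Ask each filter; on 'yes', immediately ask its follow-up questions."""
    asked = []
    for f in filters:
        asked.append(f)
        if answers[f] == "yes":
            asked.extend(followups[f])
    return asked


def administer_grouped(filters, followups, answers):
    """Ask all filters first; follow-ups for each 'yes' come only afterwards,
    so the cost of endorsing a filter is not visible while answering it."""
    asked = list(filters)
    for f in filters:
        if answers[f] == "yes":
            asked.extend(followups[f])
    return asked


# Illustration with two hypothetical filter questions:
filters = ["bought_car", "took_trip"]
followups = {"bought_car": ["car_cost"], "took_trip": ["trip_destination"]}
answers = {"bought_car": "yes", "took_trip": "no"}

print(administer_interleafed(filters, followups, answers))
# interleafed: the car follow-up interrupts the filter series
print(administer_grouped(filters, followups, answers))
# grouped: both filters are answered before any follow-up appears
```

Either ordering yields the same set of questions for the same answers; the experiments described here concern how the ordering itself changes the answers respondents give.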

However, survey designers often have good reasons to present filter questions in interleafed format; easier cognitive processing is one of them. When answering a filter question, respondents are already on that topic, and it is natural to follow up immediately. Thus the issue arises: How can series of filter and follow-up questions be presented so as to reduce the risk of motivated underreporting without asking all filter questions upfront? The answer will depend on how salient the design features are that signal increased burden as a consequence of certain answer choices. Couper and his colleagues demonstrated that increasing the salience of question contingency, by placing filter and follow-up items in a single grid rather than placing the follow-up items on multiple screens in a web survey, reduced filter endorsements. Kreuter et al. found that changing the topic reduced the difference between filter questions presented in a block and in interleafed format, leading them to suspect that a re-setting effect takes place in respondents’ minds. These findings led us to design experiments in which we varied the salience of the repetitive nature of filter and follow-up questions.

We manipulated the visible effects of answering yes or no to a filter question, varied the character of the follow-up questions, and varied the topics within the question series. We also replicated the single- vs. multiple-page design introduced by Couper et al. The results presented here come from a web-survey experiment in the U.S. in which half of the sample was drawn from Survey Sampling International’s opt-in online panel of more than 1.3 million persons who have signed up online to receive survey invitations. The study involved roughly 2,500 respondents. In addition, we fielded experiments within a mixed-mode survey in Germany, with respondents sampled from German employment and unemployment administrative records. A random half was invited to participate by web; the other half was recruited via telephone. Both surveys had approximately 1,500 participants.

The results confirmed the re-setting effect, in which changes in topic remove the salience of the filtering pattern. Using slightly varied follow-up questions and reducing the repetitiveness of the task increased endorsements of filter questions by about 20% and thus successfully mitigated motivated underreporting. Likewise, visualizing the filtering by greying out items that no longer need to be answered reduced endorsements, although not as strongly as the choice between displaying the follow-up items on a single page with the filter questions versus on multiple pages.

The paper proposed for the conference will summarize the existing literature on this topic, report the new experimental findings sketched out above, and conclude with suggestions for questionnaire construction, particularly for self-administered formats.