An Examination of Visual Design Effects in a Self-Administered Mail Survey
Catherine Billington
Westat
Sarah Hastedt
National Center for Education Statistics
Douglas Williams
Westat
In self-administered surveys, respondents call upon a cognitive tool kit of expectations about the question-and-answer process. In the context of mail surveys, employing design features that mimic these expectations (e.g., up is good) can help minimize response errors. This paper builds on previous research using data from the 2009 National Household Education Survey (NHES) Pilot Test, which found variation in omission and commission error rates according to the visual design of the skip instructions on the self-administered mail questionnaire. In this paper, we examine experiments in the design of skip instructions and the order of response categories using data from the 2011 NHES Field Test. The 2011 Field Test used a split-ballot questionnaire experiment, which allows for comparisons of item-level nonresponse and response distributions across forms. In our first analysis, we examine whether increasing the emphasis of skip instructions has any effect on skip errors. In our second analysis, we examine the effectiveness of skip pattern design changes that eliminated the most problematic type of skip instruction used in the 2009 Pilot Test, a large highlighted box containing a skip instruction, from one of the questionnaire forms. In our third analysis, we look at simple dichotomous responses to see whether reversing the order of the response categories affects distributions (e.g., yes/no versus no/yes). Specifically, we examine whether switching the order of yes/no responses violates respondent expectations about what should come first, and whether respondents rely more on their expectations than on the survey itself when answering these questions.
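The split-ballot design described above supports comparing skip error rates (or item nonresponse rates) between two questionnaire forms. As a minimal illustrative sketch, the error counts between forms can be compared with a standard two-proportion z-test; the function name and the counts in the usage example are hypothetical, not taken from the NHES data:

```python
import math

def two_proportion_z(err_a, n_a, err_b, n_b):
    """Two-proportion z-test for comparing error rates between forms.

    err_a, err_b: number of respondents committing a skip error on each form.
    n_a, n_b: number of respondents receiving each form.
    Returns (z statistic, two-sided p-value via the normal approximation).
    """
    p_a, p_b = err_a / n_a, err_b / n_b
    # Pooled proportion under the null hypothesis of equal error rates.
    p_pool = (err_a + err_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal tail, via erfc.
    p_value = math.erfc(abs(z) / math.sqrt(2))
    return z, p_value

# Hypothetical counts: 30 errors out of 300 on form A, 15 of 300 on form B.
z, p = two_proportion_z(30, 300, 15, 300)
```

With these made-up counts, the test yields z of about 2.32 and p below 0.05, i.e., the two forms would differ significantly at conventional levels.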