Saturday, November 12
Data Quality and Measurement Error
Sat, Nov 12, 1:45 PM - 3:10 PM
Regency Ballroom-Monroe
Assessing Alternative Ways of Collecting Data

Designing a Mail Questionnaire for Multiple Household Members (303116)

J. Michael Brick, Westat 
Pat Dean Brick, Westat 
W. Sherman Edwards, Westat 
Pamela Giambo, Westat 
Michael Planty, Bureau of Justice Statistics 
*Douglas Williams, Westat 

Keywords: household informant, burden, measurement focus

Selecting a probability sample of individuals in self-administered mail or web surveys is problematic. Within-household sampling procedures carried out by the respondent are generally unreliable, and the alternative of asking every eligible household member to respond increases burden and may lower response rates. Relying on a household informant avoids this problem, but the informant may not know how others would respond or may lack the motivation to report for them. This paper reviews an experiment designed to inform a key questionnaire design decision for collecting behavioral information about multiple household members from a household informant. The experiment compares a design that collects information on behavioral events (crime incidents) for all persons with one that collects each adult's experience individually.

The Bureau of Justice Statistics (BJS) and Westat developed a companion survey (CS) to the National Crime Victimization Survey (NCVS). The NCVS is a national household panel survey that collects incident-level detail about the criminal victimization of all household members aged 12 and older. The CS is a mail survey that was field tested in late 2015 with a sample of about 225,000 households in 40 large U.S. metropolitan areas. The field test included an experiment comparing a person-level and an incident-level questionnaire design. The person-level design asked a series of questions about the victimization experience of each adult; the incident-level design asked questions about victimization incidents and associated each reported incident with the victimized adult. Both alternatives have potential strengths and weaknesses. The two designs might show differential effects on unit response rates, completeness of reporting in households with more than one adult, item missing-data rates, and key outcome estimates, including the number of persons who experienced victimization. This paper describes these outcomes and their implications for questionnaire design.