JSM 2004 - Toronto

Abstract #301093


The views expressed here are those of the individual authors
and not necessarily those of the ASA or its board, officers, or staff.





Activity Number: 160
Type: Contributed
Date/Time: Monday, August 9, 2004 : 2:00 PM to 3:50 PM
Sponsor: Section on Survey Research Methods
Abstract - #301093
Title: Why Large Design Effects Can Occur in Complex Sample Designs: Examples from the NHANES 1999-2000 Survey
Author(s): David A. Lacher*+ and Lester R. Curtin and Jeffery P. Hughes
Companies: National Center for Health Statistics; Centers for Disease Control and Prevention; National Center for Health Statistics
Address: 3311 Toledo Rd., Hyattsville, MD, 20782
Keywords: design effect ; complex sample ; variance estimation ; NHANES ; laboratory test
Abstract:

The design effect is defined as the ratio of the "true" sampling variance to the hypothetical simple random sample variance for the same point estimate. Design effects are used to compare alternative designs, to determine the effective sample size for analysis, and to adjust confidence intervals for estimates based on complex designs. Design effects were examined for the National Health and Nutrition Examination Survey (NHANES) for 1999-2000. Although design effects for most estimates based on NHANES were less than three, design effects for means of laboratory tests were much higher, ranging up to 40. Design effects of means were larger for laboratory tests with small between-person coefficients of variation. Small systematic shifts in laboratory test values apparently led to a larger between-sampling-unit component of sampling error and thus increased the design effect. Simulations were done to compare the effects of analytical bias and of the precision of laboratory methods on the design effect of the mean. For the NHANES data, the bias of the laboratory method influenced the design effects of means more than the precision of the laboratory test did.
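The mechanism described in the abstract can be sketched with a small Monte Carlo simulation. This is an illustrative sketch only, not the authors' actual simulation, and all parameter values (cluster counts, standard deviations, the analyte mean of 100 with a ~5% between-person CV) are hypothetical. The point it demonstrates is the qualitative one: a systematic per-cluster shift in laboratory values (analytic bias drift) inflates the design effect of the mean far more than the same amount of variance added as independent measurement noise (imprecision).

```python
import numpy as np

rng = np.random.default_rng(0)

def deff_of_mean(cluster_shift_sd, noise_sd, n_clusters=30, m=50, reps=2000):
    """Monte Carlo design effect of the mean for a one-stage cluster sample.

    cluster_shift_sd: SD of a systematic per-cluster analytic shift
                      (e.g., lab drift between survey locations)
    noise_sd:         SD of independent per-measurement noise (imprecision)
    All values are hypothetical, chosen only to illustrate the mechanism.
    """
    mu, person_sd = 100.0, 5.0          # small between-person CV (~5%)
    n = n_clusters * m                  # total sample size
    means = np.empty(reps)
    for r in range(reps):
        # One shift per cluster, shared by everyone measured in that cluster
        shifts = rng.normal(0.0, cluster_shift_sd, n_clusters)
        y = (mu
             + rng.normal(0.0, person_sd, (n_clusters, m))   # person-to-person
             + shifts[:, None]                               # systematic shift
             + rng.normal(0.0, noise_sd, (n_clusters, m)))   # measurement noise
        means[r] = y.mean()
    var_cluster = means.var()           # sampling variance under clustering
    # SRS variance of the mean with the same total unit-level variance
    var_total = person_sd**2 + cluster_shift_sd**2 + noise_sd**2
    var_srs = var_total / n
    return var_cluster / var_srs        # design effect of the mean

# Same added variance (SD = 2), allocated as bias drift vs pure noise:
deff_bias = deff_of_mean(cluster_shift_sd=2.0, noise_sd=0.0)
deff_noise = deff_of_mean(cluster_shift_sd=0.0, noise_sd=2.0)
```

Because the shift is shared within a cluster, it does not average away across the many people in a cluster, so `deff_bias` lands well above 1, while independent noise behaves like extra person-level variance and leaves `deff_noise` near 1.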


  • The address information is for the authors who have a + after their name.
  • Authors who are presenting talks have a * after their name.


JSM 2004 For information, contact jsm@amstat.org or phone (888) 231-3473. If you have questions about the Continuing Education program, please contact the Education Department.
Revised March 2004