Activity Number: 547
Type: Contributed
Date/Time: Thursday, August 2, 2007, 10:30 AM to 12:20 PM
Sponsor: Social Statistics Section

Abstract - #309081

Title: Statistical Tests for Differential Functioning in Parametric Item Response Theory: A Monte Carlo Evaluation of Conventional DFIT Methods and a Bootstrap Alternative
Author(s): Gregory Petroski*+ and Steven J. Osterlind
Companies: University of Missouri-Columbia and University of Missouri-Columbia
Address: School of Medicine, Columbia, MO 65212
Keywords: Item Response Theory; Test Bias; DIF; Differential Item Functioning

Abstract:
Differential item and test functioning (DIF/DTF) occur when individuals with the same ability but from different segments of the examinee population have different probabilities of success on a test item or on the test as a whole. Raju et al. (1992) proposed methods for detecting differential functioning of items and tests (DFIT) in the context of parametric item response theory for binary-response items. The DFIT framework includes several significance tests for DIF and DTF. A Monte Carlo simulation study examined the Type I error rates and power characteristics of the conventional DFIT tests and several bootstrap alternatives. Results indicate that the conventional tests have unacceptably high Type I error rates. A bootstrap test for DIF and DTF exhibited good control of Type I error and reasonable power when DIF items were excluded from the parameter equating step.
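
As a rough illustration of the kind of quantity the DFIT framework tests, and not the authors' implementation, the sketch below computes an NCDIF-style index for a single 3PL item (the mean squared difference between the focal- and reference-group item characteristic curves, averaged over focal-group abilities) together with a naive nonparametric bootstrap of that index obtained by resampling focal-group abilities. The item parameters, ability distribution, and resampling scheme here are illustrative assumptions only; the study's bootstrap works from estimated and equated item parameters.

```python
import numpy as np

def icc_3pl(theta, a, b, c):
    """3PL item characteristic curve: P(correct | theta)."""
    return c + (1.0 - c) / (1.0 + np.exp(-1.7 * a * (theta - b)))

def ncdif(theta_focal, params_focal, params_ref):
    """NCDIF-style index: mean squared difference between the focal- and
    reference-group ICCs, averaged over the focal group's abilities."""
    p_f = icc_3pl(theta_focal, *params_focal)
    p_r = icc_3pl(theta_focal, *params_ref)
    return float(np.mean((p_f - p_r) ** 2))

rng = np.random.default_rng(0)
theta_focal = rng.standard_normal(1000)   # hypothetical focal-group abilities
item_ref = (1.2, 0.0, 0.2)                # (a, b, c) for the reference group
item_focal = (1.2, 0.3, 0.2)              # shifted difficulty simulates DIF

observed = ncdif(theta_focal, item_focal, item_ref)

# Naive nonparametric bootstrap: resample focal abilities with replacement and
# recompute the index. The paper's bootstrap test, which resamples and
# re-equates estimated item parameters, is more involved than this sketch.
boot = np.array([
    ncdif(rng.choice(theta_focal, size=theta_focal.size, replace=True),
          item_focal, item_ref)
    for _ in range(2000)
])

print(f"NCDIF-style index: {observed:.4f}")
print("bootstrap 2.5/97.5 percentiles:", np.percentile(boot, [2.5, 97.5]))
```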