Abstract:
|
Not-reached (dropout) and omitted (intermittent missingness) items are often inevitable in timed tests where answers are not required. The missingness of an item response may be related to the subject's latent characteristics, the item's difficulty, or even its unobserved response. To fully understand the results of such testing, we must handle the missing data appropriately. In this article, we propose a new missing data mechanism that jointly models not-reached and omitted behaviors within a multilevel item response theory (IRT) model. The proposed methodology is illustrated using real data from the Program for International Student Assessment (PISA) 2015 study. A modified deviance information criterion (DIC) is developed to assess model fit. Extensive simulations further illustrate the generality of the proposed model and show that it compares favorably with a competing model.
|