JSM 2012 Online Program

Abstract Details

Activity Number: 187
Type: Contributed
Date/Time: Monday, July 30, 2012, 10:30 AM to 12:20 PM
Sponsor: Social Statistics Section
Abstract - #305817
Title: Using an Experimental Evaluation of Charter Schools to Test Whether Nonexperimental Comparison Group Methods Can Replicate Experimental Impact Estimates
Author(s): Natalya Verbitsky-Savitz*+, Kenneth Fortson, Emma Kopa, and Philip Gleason
Companies: Mathematica Policy Research (all authors)
Address: 1100 1st Street, NE, 12th Floor, Washington, DC, 20002-4221, United States
Keywords: causal inference; quasi-experimental designs; education; charter schools; program evaluation
Abstract:

Randomized controlled trials (RCTs) are widely considered the gold standard for evaluating the impact of a social program. When an RCT is not feasible, quasi-experimental designs (QEDs) are often used. A popular class of QEDs uses a non-randomly selected comparison group to represent what would have happened to the treatment group had its members not participated in the program. Under certain assumptions, QEDs can produce unbiased impact estimates; however, these assumptions are generally untestable in practice. We test the validity of four comparison group approaches (OLS regression modeling, exact matching, propensity score matching, and fixed effects modeling) by comparing QED impact estimates from these methods with an experimental benchmark. The analysis uses data from an experimental evaluation of charter schools and comparison data for other students in the same school districts. We find that the use of pre-intervention baseline data considerably reduces, but might not completely eliminate, bias. Although the matching and regression-based estimates do not differ greatly, the matching estimators perform slightly better than estimators that rely on parametric assumptions.
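
The benchmarking design described in the abstract can be illustrated with a small simulation. The Python sketch below (using numpy and scikit-learn) is not the authors' code; the data-generating process, parameter values, and variable names are assumptions made for illustration only. It estimates a treatment effect in a nonexperimental sample where selection depends on a baseline score, using a naive difference in means, OLS regression on the baseline covariate, and one-to-one propensity score matching, and reports each estimate's bias relative to an experimental benchmark computed under random assignment.

    import numpy as np
    from sklearn.linear_model import LinearRegression, LogisticRegression
    from sklearn.neighbors import NearestNeighbors

    rng = np.random.default_rng(0)
    n = 5000
    true_effect = 0.25

    # Pre-intervention baseline covariate (e.g., a prior test score).
    baseline = rng.normal(size=n)

    # Experimental sample: random assignment, so the difference in means is unbiased.
    z_rct = rng.binomial(1, 0.5, size=n)
    y_rct = true_effect * z_rct + 0.8 * baseline + rng.normal(scale=0.5, size=n)
    benchmark = y_rct[z_rct == 1].mean() - y_rct[z_rct == 0].mean()

    # Nonexperimental sample: selection into treatment depends on the baseline score.
    z_obs = rng.binomial(1, 1.0 / (1.0 + np.exp(-1.5 * baseline)))
    y_obs = true_effect * z_obs + 0.8 * baseline + rng.normal(scale=0.5, size=n)

    # Naive comparison of treated and untreated students is biased by selection.
    naive = y_obs[z_obs == 1].mean() - y_obs[z_obs == 0].mean()

    # OLS regression adjusting for the baseline covariate.
    ols = LinearRegression().fit(np.column_stack([z_obs, baseline]), y_obs)
    ols_effect = ols.coef_[0]

    # Propensity score matching: each treated student is matched to the nearest
    # comparison student on the estimated propensity score.
    X_base = baseline.reshape(-1, 1)
    pscore = LogisticRegression().fit(X_base, z_obs).predict_proba(X_base)[:, 1]
    treated_ps = pscore[z_obs == 1].reshape(-1, 1)
    control_ps = pscore[z_obs == 0].reshape(-1, 1)
    _, idx = NearestNeighbors(n_neighbors=1).fit(control_ps).kneighbors(treated_ps)
    psm_effect = (y_obs[z_obs == 1] - y_obs[z_obs == 0][idx.ravel()]).mean()

    print(f"experimental benchmark: {benchmark:.3f}")
    print(f"bias of naive estimate: {naive - benchmark:+.3f}")
    print(f"bias of OLS estimate:   {ols_effect - benchmark:+.3f}")
    print(f"bias of PSM estimate:   {psm_effect - benchmark:+.3f}")

In this stylized setup selection depends on a single observed covariate, so adjusting for the baseline score removes essentially all of the bias in the naive comparison; with real data, as the abstract notes, the analogous assumptions are generally untestable, which is why comparison against an experimental benchmark is informative.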


The address information is for the authors who have a + after their name.
Authors who are presenting talks have a * after their name.
