|Title:||Using an Experimental Evaluation of Charter Schools to Test Whether Nonexperimental Comparison Group Methods Can Replicate Experimental Impact Estimates|
This NCEE Technical Methods Paper compares the estimated impacts of the offer of charter school enrollment under an experimental design and under non-experimental comparison group designs. The study examined four approaches to creating non-experimental comparison groups: ordinary least squares regression modeling, exact matching, propensity score matching, and fixed effects modeling. The data for the study come from students in the districts and grades represented in an experimental evaluation of charter schools conducted by the U.S. Department of Education in 2010 (for more information, see http://ies.ed.gov/ncee/pubs/20104029/index.asp).
The study found that none of the comparison group designs reliably replicated the impact estimates from the experimental design study. However, the use of pre-intervention baseline data that are strongly predictive of the key outcome measures considerably reduced, but did not eliminate, the estimated bias in the non-experimental impact estimates. Estimated impacts based on matched comparison groups were more similar to the experimental estimates than were estimates based on regression adjustment alone; the differences were moderate in size, although not statistically significant.
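To illustrate one of the four approaches the study examined, the sketch below shows nearest-neighbor propensity score matching on simulated data. This is not the authors' code: the data-generating process, variable names, and matching rule are invented for illustration, and the "propensity score" here is taken as known rather than estimated, as it would be in practice.

```python
import random

random.seed(0)

def simulate(n=200):
    """Simulate students with selection bias: higher-baseline students
    are more likely to enroll. All values are hypothetical."""
    students = []
    for _ in range(n):
        baseline = random.gauss(50, 10)
        # Enrollment probability rises with the baseline score,
        # so a naive treated-vs-untreated comparison is biased.
        p_enroll = 1 / (1 + 2.718281828 ** (-(baseline - 50) / 10))
        treated = random.random() < p_enroll
        # True enrollment effect of +3 points, plus noise.
        outcome = baseline + (3 if treated else 0) + random.gauss(0, 5)
        students.append({"treated": treated, "outcome": outcome,
                         "pscore": p_enroll})
    return students

def matched_impact(students):
    """Match each treated student (with replacement) to the comparison
    student with the closest propensity score, then average the
    treated-minus-matched outcome differences."""
    treated = [s for s in students if s["treated"]]
    comparison = [s for s in students if not s["treated"]]
    diffs = []
    for t in treated:
        m = min(comparison, key=lambda c: abs(c["pscore"] - t["pscore"]))
        diffs.append(t["outcome"] - m["outcome"])
    return sum(diffs) / len(diffs)

print(round(matched_impact(simulate()), 2))
```

Because matching compares each treated student only with observably similar comparison students, it can remove bias from the measured covariate; bias from unmeasured differences, the focus of the study's experimental benchmark, remains.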
|Cover Date:||April 2012|
|Web Release:||April 26, 2012|
|Publication #:||NCEE 20124019|
|Authors:||Kenneth Fortson, Natalya Verbitsky-Savitz, Emma Kopa, and Philip Gleason|
|Type of Product:||Technical Methods Report|
For questions about the content of this Technical Methods Report, please contact: