Title: An Empirical Analysis of Two Methodological Issues for Education Research Caused by Variation in Program Impacts
Principal Investigator: Weiss, Michael
Awardee: MDRC
Program: Statistical and Research Methodology in Education
Award Period: 3 years (9/1/14 – 8/31/17)
Award Amount: $898,875
Goal: Methodological Innovation
Award Number: R305D140012
Co-Principal Investigator: Howard Bloom
The purpose of this project is to investigate two issues related to variation in treatment effects across sites in a multisite trial. The first issue is the effect of cross-site variation on a study's statistical power to detect the mean treatment effect across all sites. Although the mathematics of accounting for such variation is well established, there is little empirical information to help researchers gauge how large the variation is likely to be, and thus how much they need to compensate for it through other aspects of a design. The project team will use data from ten multisite randomized controlled trials of education interventions to estimate the cross-site variability in treatment effects. The findings will provide an empirical starting point for researchers who need to account for cross-site variability.
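To illustrate why this matters for study planning, the sketch below uses a standard minimum-detectable-effect-size (MDES) calculation for a multisite trial. The function name, inputs, and parameter values are all hypothetical; the variance of the estimated mean effect is taken to have two components, one from individual-level noise and one from cross-site impact variation (tau2), and the multiplier 2.8 is the conventional value for 80% power with a two-tailed test at alpha = .05.

```python
import math

def mdes_multisite(n_sites, n_per_site, p_treat, sigma2, tau2, multiplier=2.8):
    """Approximate MDES for the cross-site mean impact.

    sigma2: individual-level outcome variance
    tau2:   variance of true treatment effects across sites
    """
    # Two variance components: cross-site impact variation and
    # individual-level sampling error.
    var = tau2 / n_sites + sigma2 / (n_sites * n_per_site * p_treat * (1 - p_treat))
    return multiplier * math.sqrt(var)

# Ignoring cross-site variation (tau2 = 0) understates the MDES,
# i.e., it makes the study look better powered than it is.
naive = mdes_multisite(n_sites=20, n_per_site=100, p_treat=0.5,
                       sigma2=1.0, tau2=0.0)
real = mdes_multisite(n_sites=20, n_per_site=100, p_treat=0.5,
                      sigma2=1.0, tau2=0.05)
```

In this hypothetical configuration, the cross-site variance term pushes the MDES up by roughly half, which is exactly the compensation the abstract refers to: a researcher who can anchor tau2 in empirical estimates from past trials can size the study accordingly.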
The second issue the researchers will address is individual-level variation in program effects. Regression discontinuity designs have strong internal validity at the cutoff, but their estimates apply, strictly speaking, only to individuals at that point. Rather than assume the estimates have no validity anywhere else, the research team will work to develop a way to gauge their validity at other points on the assignment variable. Such generalization is complicated to the extent that treatment effects vary across individuals, so any new approach will need to take this individual-level variation into account.
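A small simulation can make the generalization problem concrete. The data-generating process below is entirely hypothetical: the true treatment effect varies with the assignment score, and a simple difference-in-means in a narrow band around the cutoff (a crude stand-in for formal local estimation) recovers the effect at the cutoff but not the effect for individuals farther away.

```python
import random

random.seed(1)

# Hypothetical setup: assignment score r in [-1, 1], treatment D = 1
# if r >= 0, and an individual treatment effect that grows with r.
def true_effect(r):
    return 0.4 + 0.6 * r

data = []
for _ in range(5000):
    r = random.uniform(-1, 1)
    d = 1 if r >= 0 else 0
    y = 1 + 0.5 * r + true_effect(r) * d + random.gauss(0, 0.1)
    data.append((r, y))

# Difference in mean outcomes within a narrow band at the cutoff.
h = 0.05
left = [y for r, y in data if -h <= r < 0]
right = [y for r, y in data if 0 <= r <= h]
rd_est = sum(right) / len(right) - sum(left) / len(left)

# rd_est is close to true_effect(0) = 0.4, the effect at the cutoff,
# but far from true_effect(0.8) = 0.88, the effect for individuals
# well above it.
```

Because the effect varies across individuals, the estimate at the cutoff is not automatically informative about other points, which is precisely why an approach to gauging validity away from the cutoff must model this variation rather than assume a constant effect.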
Journal article, monograph, or newsletter
Bloom, H.S., Raudenbush, S.W., Weiss, M.J., and Porter, K. (2017). Using Multisite Experiments to Study Cross-Site Variation in Treatment Effects: A Hybrid Approach With Fixed Intercepts and a Random Treatment Coefficient. Journal of Research on Educational Effectiveness, 10(4), 817–842.
Weiss, M.J., Bloom, H.S., Verbitsky-Savitz, N., Gupta, H., Vigil, A.E., and Cullinan, D.N. (2017). How Much do the Effects of Education and Training Programs Vary Across Sites? Evidence From Past Multisite Randomized Trials. Journal of Research on Educational Effectiveness, 10(4), 843–876.