How Does Implementation Mediate Program Impacts? Improving Current Evaluation Research Methods
Abt Associates, Inc.
Statistical and Research Methodology in Education–Early Career
1 1/2 years (7/1/15–12/31/16)
Co-Principal Investigator: Moulton, Shawn
Purpose: The purpose of this project was to advance evaluation research methods that address questions about the mediational influences of education initiatives. Investigating the relationship between implementation and impact is particularly challenging in education because interventions frequently work through classroom teachers.
Project Activities: This research extends the Analysis of Symmetrically Predicted Endogenous Subgroups (ASPES) method to be more broadly applicable in education research and compares this and other analytic approaches—including instrumental variables and propensity score matching approaches used in the context of experiments—to determine the conditions under which each is most appropriate. ASPES uses baseline characteristics to predict the post-randomization experiences (the mediator) of sample members, then establishes symmetric predicted subgroups within the treatment and control groups such that differences between their mean outcomes are free of selection and other sources of bias. A final step converts these unbiased impacts on predicted subgroups into impacts on actual subgroup members, under an identifying assumption.
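The ASPES steps described above can be sketched on simulated data. This is a minimal illustration, not the project's implementation: it assumes a binary mediator observed in both experimental arms, a simple median split on one baseline covariate in place of a fitted prediction model, and a hypothetical data-generating process in which the true impact is 2.0 for mediator-takers and 0.5 for others.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20_000

# Hypothetical simulated RCT: one baseline covariate, random assignment
x = rng.normal(size=n)
z = rng.integers(0, 2, size=n)                 # 1 = treatment

# Post-randomization mediator (e.g., high program engagement),
# partly predictable from the baseline covariate
m = (0.8 * x + rng.normal(size=n) > 0).astype(int)

# Outcome: the treatment impact differs by actual mediator status
y = x + z * np.where(m == 1, 2.0, 0.5) + rng.normal(size=n)

# Step 1: predict the mediator from BASELINE data only.
# (A real application would fit a model; a median split on x stands in here.)
pred_hi = x >= np.median(x)

# Step 2: because the same baseline rule is applied to treatment and
# control alike, the predicted subgroups are symmetric, so the
# treatment-control contrast within each is an unbiased experimental impact.
def impact(mask):
    return y[mask & (z == 1)].mean() - y[mask & (z == 0)].mean()

d_pred = np.array([impact(pred_hi), impact(~pred_hi)])

# Step 3: convert predicted-subgroup impacts into actual-subgroup impacts.
# Each predicted subgroup mixes actual mediator statuses; assuming impacts
# depend only on actual status, solve the 2x2 mixing system W @ d = d_pred.
w_hi = m[pred_hi].mean()        # share with actual m = 1 among predicted-high
w_lo = m[~pred_hi].mean()
W = np.array([[w_hi, 1 - w_hi],
              [w_lo, 1 - w_lo]])
d_actual = np.linalg.solve(W, d_pred)  # [impact if m = 1, impact if m = 0]
```

The final solve recovers (up to sampling error) the impacts built into the simulated outcomes, because the mixing weights make the predicted-subgroup impacts weighted averages of the actual-subgroup impacts.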
Through Monte Carlo simulation, the researchers will investigate how accurately multiple analytic approaches estimate mediation effects under ideal modeling conditions in which all assumptions are met, under conditions in which key assumptions of the approaches are violated, and under conditions with nonlinear functional forms. Researchers will use all of the approaches to address applied questions using the Evaluation of Comprehensive Teacher Induction (CTI) data set. This will provide an applied demonstration of the ASPES method investigated in the study, an illustration of how results differ when the various approaches are applied to the same data, and ways to diagnose potential violations of key assumptions in real data. The research team will disseminate the results of the research and recommendations for practice via peer-reviewed conference presentations and journal articles, as well as an online short course and in-person workshops to teach applied researchers how to use ASPES.
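A simulation comparison of the kind described above can be sketched as follows. This is an assumed, simplified setup: an unobserved trait drives both mediator take-up (realized only under treatment, as when mentoring is offered only to the treatment group) and the outcome, so a naive per-protocol contrast is biased while a symmetric-predicted-subgroup estimate is not. All parameters and variable names are hypothetical; the true impact is 2.0 for takers and 0.5 for non-takers.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Hypothetical DGP with selection: an UNOBSERVED trait u drives both
# mediator take-up and the outcome.
x = rng.normal(size=n)                     # observed baseline covariate
u = rng.normal(size=n)                     # unobserved confounder
z = rng.integers(0, 2, size=n)             # 1 = treatment
m = (0.8 * x + u > 0).astype(int)          # would-be take-up; observed only if z = 1
y = x + u + z * np.where(m == 1, 2.0, 0.5) + rng.normal(size=n)

# Naive "per-protocol" contrast: treated takers vs. all controls.
# Selection on u makes this badly biased for the true taker effect (2.0).
naive = y[(z == 1) & (m == 1)].mean() - y[z == 0].mean()

# Symmetric-predicted-subgroup estimate: split BOTH arms by baseline x...
pred_hi = x >= np.median(x)

def impact(mask):
    return y[mask & (z == 1)].mean() - y[mask & (z == 0)].mean()

d_pred = np.array([impact(pred_hi), impact(~pred_hi)])

# ...then convert via take-up shares, observable in the treatment group only.
w_hi = m[pred_hi & (z == 1)].mean()
w_lo = m[~pred_hi & (z == 1)].mean()
W = np.array([[w_hi, 1 - w_hi],
              [w_lo, 1 - w_lo]])
d_actual = np.linalg.solve(W, d_pred)      # [effect if m = 1, effect if m = 0]
```

Repeating such a draw many times while varying how strongly baseline data predict take-up (and how nonlinear the outcome model is) is one way to stress the approaches' key assumptions, in the spirit of the planned Monte Carlo study.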