
2006 Research Conference | June 15–16

This conference highlighted the work of invited speakers, independent researchers who have received grant funds from the Institute of Education Sciences, and trainees supported through predoctoral training grants and postdoctoral fellowships. The presentations are those of the authors and do not necessarily represent the views of the U.S. Department of Education or the Institute of Education Sciences.
Hyatt Regency Washington on Capitol Hill
400 New Jersey Avenue, N.W.

The Cause . . . or the 'What' of What Works?: Abstract

David S. Cordray, Vanderbilt University

Within an experimental framework, the cause of an effect is best characterized as the difference between all causal variables in the treatment group and all causal variables in the counterfactual group. Just as the "effect" is the average difference between groups on the outcomes of interest, the "what" of what-works claims is likewise a difference: the relative strength of the intervention conditions. So a 12-step professional development program is not, by itself, the cause of increased student achievement; the cause is the difference in professional development available in the treatment and control conditions, just as the effect is the average difference in student performance. Moreover, in applied settings, interventions are often not implemented with full fidelity, and control conditions can be "upgraded" by the inadvertent leakage of treatment components and other processes; this second form of infidelity further weakens the treatment-control contrast. An illustration shows that an observed effect size of 0.50 underestimates the true effect size (d = 0.83) because of multiple sources of infidelity. Fidelity must be examined empirically if we are to properly understand the "what" of what-works claims. The presentation defines fidelity, examines its role in various types of IES projects (Goals), and enumerates several challenges for the future.
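The arithmetic behind this kind of attenuation can be sketched as follows. This is an illustrative model, not the one used in the presentation: it assumes the observed effect size shrinks in proportion to the achieved treatment-control contrast, and the 80%/20% fidelity figures are hypothetical values chosen so the numbers match the abstract's example.

```python
def observed_effect_size(true_d, achieved_contrast):
    """Simple linear attenuation model (an assumption, for illustration):
    the observed standardized effect equals the true effect scaled by the
    fraction of the intended treatment-control contrast actually achieved."""
    return true_d * achieved_contrast

true_d = 0.83               # true effect size from the abstract's example
treatment_fidelity = 0.80   # hypothetical: treatment delivered at 80% strength
control_upgrade = 0.20      # hypothetical: control "upgraded" by 20% leakage

# Achieved contrast is the difference in relative strength between conditions
achieved = treatment_fidelity - control_upgrade
print(round(observed_effect_size(true_d, achieved), 2))  # 0.5
```

Under these assumed fidelity values, a true effect of d = 0.83 would surface as an observed effect of about 0.50, which is why the contrast between conditions, not the nominal program, is the operative "what."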