Translating the Statistical Representation of the Effects of Education Interventions into More Readily Interpretable Forms
The primary purpose of this paper is to provide suggestions to researchers about ways to present statistical findings about the effects of educational interventions that might make the nature and magnitude of those effects easier to understand. These suggestions and the related discussion are framed within the context of studies that use experimental designs to compare measured outcomes for two groups of participants, one in an intervention condition and the other in a control condition.
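One widely used translation of this kind converts a standardized mean difference (an effect size such as Cohen's d) into a percentile statement about the average intervention-group member relative to the control distribution. As a minimal illustrative sketch (the function names are ours, and it assumes normally distributed outcomes with equal variances in the two groups):

```python
from statistics import NormalDist

def u3_from_d(d: float) -> float:
    """Cohen's U3: the proportion of the control distribution that falls
    below the mean of the intervention group, assuming both outcome
    distributions are normal with equal variance."""
    return NormalDist().cdf(d)

def improvement_index(d: float) -> float:
    """Percentile-point gain of the average intervention-group member
    over the control-group median (U3 expressed as percentile minus 50)."""
    return 100 * (u3_from_d(d) - 0.5)
```

For example, an effect size of d = 0.25 corresponds to U3 of about 0.60: the average intervention-group member scores higher than roughly 60 percent of the control group, an improvement of about 10 percentile points over the control median.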
Designing Cluster-Randomized Trials
The National Center for Education Research (NCER) in the Institute of Education Sciences (IES) of the U.S. Department of Education hosts an annual Summer Research Training Institute on Cluster-Randomized Trials to increase the national capacity of researchers to develop and conduct rigorous evaluations of the effectiveness of education interventions. Video transcripts and presentation files from the 2008 Institute, as well as presentations from the 2007 Institute, are available online.

CONSORT, which stands for Consolidated Standards of Reporting Trials, was developed to provide guidance on the tracking and reporting of critical aspects of randomized controlled trials (RCTs). The main initiative of the CONSORT group was the development of a set of recommendations for reporting RCTs, called the CONSORT Statement. The Statement includes a 22-item checklist, which focuses on study design, analysis, and interpretation of the results, and a flow diagram, which provides a structure for tracking participants at each study stage. IES encourages researchers to use these tools in their Goal 3 and Goal 4 research projects.
This paper, entitled "Statistical Power Analysis in Education Research" and coauthored by Larry Hedges and Christopher Rhoads, provides a guide to calculating statistical power for the complex multilevel designs that are used in most field studies in education research. The paper is available as a PDF file.
Goal 3 and Goal 4 applications submitted to IES will typically require a detailed power analysis. The Optimal Design Software for Multi-Level and Longitudinal Research is useful for statistical power analysis of group-level interventions. The Optimal Design software is free and available for download.
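The core complication in powering a cluster-randomized trial is that students are nested within schools or classrooms, so the intraclass correlation (ICC) and the number of clusters, not just the total sample size, drive precision. The sketch below illustrates the standard minimum-detectable-effect-size (MDES) approximation for a two-level design; it is an illustrative simplification, not a substitute for Optimal Design, and the function name, defaults, and the 2.8 multiplier (appropriate for 80 percent power at a two-tailed alpha of .05 when the number of clusters is large) are our assumptions:

```python
import math

def mdes_cluster(J: int, n: int, icc: float, P: float = 0.5,
                 multiplier: float = 2.8) -> float:
    """Approximate minimum detectable effect size (in standard deviation
    units) for a two-level cluster-randomized trial.

    J          -- total number of clusters (e.g., schools)
    n          -- individuals per cluster (e.g., students per school)
    icc        -- intraclass correlation (share of outcome variance
                  lying between clusters)
    P          -- proportion of clusters assigned to treatment
    multiplier -- ~2.8 for 80% power at alpha = .05 (two-tailed),
                  valid when J is large enough that t is close to z
    """
    # Variance of the treatment-control difference in cluster means,
    # in effect-size units: a between-cluster term plus a within-cluster term.
    var_term = (icc / (P * (1 - P) * J)
                + (1 - icc) / (P * (1 - P) * J * n))
    return multiplier * math.sqrt(var_term)
```

With 40 schools of 60 students each and an ICC of 0.15, `mdes_cluster(40, 60, 0.15)` gives an MDES of roughly 0.36 standard deviations; note that adding students per school does little here, because the between-cluster term dominates.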
Presentations on Methodological Issues
Selected presentations, meeting agendas, and videos from previous IES Research Conferences are available online.
Technical Methods Resources
NCEE has formed a technical methods group to work on issues and strategies that ensure evaluations of education interventions provide unbiased and causally valid assessments. The technical methods working group aims to advance and provide guidance for those specialists who are embarking on evaluations in education. Recent Technical Methods Reports are available online.
Effect Sizes in Research on Children and Families
This special issue of Child Development Perspectives (December 2008, Volume 2 Issue 3) contains a number of articles on the application of effect sizes in research on children and families.
A list of the articles on effect sizes included in the special issue of Child Development Perspectives is available as a PDF file. (142 KB)
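The effect size most often reported in this literature is the standardized mean difference between the intervention and control groups. As a minimal sketch of the computation (the function name is ours; the small-sample adjustment is Hedges' g correction applied to Cohen's d):

```python
import math

def hedges_g(mean_t: float, mean_c: float,
             sd_t: float, sd_c: float,
             n_t: int, n_c: int) -> float:
    """Standardized mean difference with Hedges' small-sample correction.

    Pools the two group standard deviations, computes Cohen's d, then
    applies the approximate correction factor 1 - 3 / (4*df - 1)."""
    df = n_t + n_c - 2
    sd_pooled = math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2) / df)
    d = (mean_t - mean_c) / sd_pooled
    return d * (1 - 3 / (4 * df - 1))
```

For instance, with group means of 105 and 100, a common standard deviation of 15, and 50 participants per group, the uncorrected d is 5/15 = 0.33, and the correction leaves g essentially unchanged at about 0.33; the correction matters mainly for the very small samples common in single-case and pilot studies.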
WWC Standards for Single-Case Design Research
In an effort to expand the pool of scientific evidence available for review, the What Works Clearinghouse (WWC) assembled a panel of national experts in single-case design (SCD) and analysis to draft SCD Standards. In this paper, the panel provides an overview of SCDs, specifies the types of questions that SCDs are designed to answer, and discusses the internal validity of SCDs. The panel then proposes SCD Standards to be implemented by the WWC.
The paper is available as a PDF file. (446 KB)