
IES Grant

Title: A User-Friendly Tool for Designing Cluster Randomized Trials with Power and Relevance
Center: NCER Year: 2017
Principal Investigator: Tipton, Elizabeth Awardee: Northwestern University
Program: Statistical and Research Methodology in Education
Award Period: 3 years (07/01/17–06/30/20) Award Amount: $893,737
Type: Methodological Innovation Award Number: R305D180002

Previous Award Number: R305D170024
Previous Awardee: Teachers College, Columbia University

Co-Principal Investigator: Jessaca Spybrook (Western Michigan University)

The purpose of this project is to develop a user-friendly webtool for planning cluster randomized trials (CRTs) based on generalizability and on statistical power for testing moderator effects, not just average treatment effects. Funding agencies and recent statistical developments have begun shifting the focus of causal impact studies beyond detecting average treatment effects toward understanding moderators of those effects and estimating effects for specific inference populations, yet the tools available to applied researchers for designing studies around these factors remain limited. The findings and products from this study will help improve the capacity of education researchers to design CRTs with both high internal and external validity, thereby increasing the relevance of applied education research results.
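To illustrate the kind of power computation such a planning tool automates, the following is a minimal sketch of the standard minimum detectable effect size (MDES) formula for a two-level CRT with treatment assigned at the cluster level. It assumes equal cluster sizes and uses a normal approximation to the degrees-of-freedom multiplier; the function name, defaults, and parameterization are illustrative, not taken from the project's software.

```python
from statistics import NormalDist

def crt_mdes(n_clusters, cluster_size, icc, p_treat=0.5,
             alpha=0.05, power=0.80, r2_l2=0.0, r2_l1=0.0):
    """Approximate MDES (standardized) for a two-level cluster randomized
    trial, treatment assigned at level 2.

    icc    : intraclass correlation (share of variance between clusters)
    p_treat: proportion of clusters assigned to treatment
    r2_l2/r2_l1: variance explained by covariates at each level
    Normal approximation; a t-based multiplier is more accurate for few
    clusters.
    """
    nd = NormalDist()
    multiplier = nd.inv_cdf(1 - alpha / 2) + nd.inv_cdf(power)
    j, n, rho, p = n_clusters, cluster_size, icc, p_treat
    var = (rho * (1 - r2_l2) / (p * (1 - p) * j)
           + (1 - rho) * (1 - r2_l1) / (p * (1 - p) * j * n))
    return multiplier * var ** 0.5

# e.g. 40 clusters of 20 students with ICC = 0.15 yields an MDES near 0.39
effect = crt_mdes(n_clusters=40, cluster_size=20, icc=0.15)
```

The example makes visible why the intraclass correlation dominates CRT planning: with a nonzero ICC, adding students within clusters shrinks only the second variance term, so detectable effect sizes are driven mainly by the number of clusters.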

The research team will first gather data on design practices used in Efficacy, Replication, and Effectiveness studies funded through the Education Research Grants Program at the Institute of Education Sciences. These data will come from interviews with 54 principal investigators of Efficacy, Replication, and Effectiveness grants funded from 2005 to 2015. New methods will then be developed to bridge gaps between what are typically treated as three separate design considerations: generalizability, power to detect the average treatment effect, and power to detect moderator effects. As these new methods are developed, so too will software that applied researchers can use to address these considerations. The software will go through multiple stages of usability testing, and the final product will be free and publicly available. In addition to the software, the researchers plan to disseminate their findings through peer-reviewed journal articles, conference presentations, and workshops at major conferences.