IES Grant
Title: A User-Friendly Tool for Designing Cluster Randomized Trials with Power and Relevance
Center: NCER
Year: 2017
Principal Investigator: Tipton, Elizabeth
Awardee: Northwestern University
Program: Statistical and Research Methodology in Education
Award Period: 3 years (07/01/17 – 06/30/20)
Award Amount: $893,737
Type: Methodological Innovation
Award Number: R305D180002
Description:

Previous Award Number: R305D170024
Co-Principal Investigator: Spybrook, Jessaca

Purpose: The purpose of this project was to develop a user-friendly webtool for planning cluster randomized trials (CRTs) based on generalizability and on statistical power for testing moderator effects, not just average treatment effects. Funding institutions and statistical developments have begun to shift the focus of causal impact studies beyond detecting average treatment effects toward understanding moderators of those effects and detecting effects for specific inference populations, yet applied researchers have had few tools for designing studies around these factors. The findings and products from this study will help education researchers design CRTs with both high internal and high external validity, thereby increasing the relevance of applied education research results. (A minimal sketch of the underlying power calculation appears after this description.)

Project Activities: The research team gathered data on design practices in Efficacy, Replication, and Effectiveness studies funded through the Education Research Grants Program at the Institute of Education Sciences (IES), drawing on interviews with 37 principal investigators of grants funded between 2010 and 2015. The researchers developed new methods to bridge what are typically treated as three separate design considerations: generalizability, power to detect the average treatment effect, and power to detect moderator effects. They also developed software so that applied researchers could address all three considerations together. The software went through multiple rounds of usability testing, and the final product is free and publicly available. The researchers disseminated their findings through peer-reviewed journal articles, conference presentations, and workshops.

Key Outcomes: The main findings of this project are reported in the publications listed under Products and Publications below.
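To make the design quantities discussed above concrete, the following is a minimal sketch of the textbook power calculation for detecting an average treatment effect in a two-level CRT with treatment assigned at the cluster level (the standard formula of, e.g., Raudenbush, 1997). It is an illustration only, not the project's webtool; the function name crt_power and the example design numbers are hypothetical.

    from scipy import stats
    import numpy as np

    def crt_power(J, n, icc, delta, P=0.5, alpha=0.05):
        """Approximate two-sided power to detect a standardized average
        treatment effect `delta` in a two-level CRT with J clusters of
        n individuals each, a proportion P of clusters treated, and
        intraclass correlation `icc` (no covariates)."""
        # Variance of the effect-size estimate: standard two-level CRT formula.
        var = (icc + (1 - icc) / n) / (P * (1 - P) * J)
        lam = delta / np.sqrt(var)   # noncentrality parameter
        df = J - 2                   # cluster-level degrees of freedom
        t_crit = stats.t.ppf(1 - alpha / 2, df)
        # Power from the noncentral t distribution (both rejection tails).
        return (1 - stats.nct.cdf(t_crit, df, lam)
                + stats.nct.cdf(-t_crit, df, lam))

    # Hypothetical design: 40 clusters of 25 students, ICC = 0.15,
    # minimum effect of interest 0.25 standard deviations.
    print(round(crt_power(J=40, n=25, icc=0.15, delta=0.25), 3))

Detecting a moderator effect of a given size generally requires considerably more power than detecting an average effect of the same size, which is one motivation for the design methods and webtool developed in this project.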
Products and Publications

ERIC Citations: Citations for this award are available in ERIC.
Select Publications:

Dong, N., Kelcey, B., & Spybrook, J. (2018). Power analyses for moderator effects in three-level cluster randomized trials. The Journal of Experimental Education, 86(3), 489–514.

Spybrook, J., Zhang, Q., Kelcey, B., & Dong, N. (2020). Learning from cluster randomized trials in education: An assessment of the capacity of studies to determine what works, for whom, and under what conditions. Educational Evaluation and Policy Analysis, 42(3), 354–374.

Tipton, E. (2021). Beyond generalization of the ATE: Designing randomized trials to understand treatment effect heterogeneity. Journal of the Royal Statistical Society Series A, 184(2), 504–521.

Tipton, E., Spybrook, J., Fitzgerald, K. G., Wang, Q., & Davidson, C. (2021). Toward a system of evidence for all: Current practices and future opportunities in 37 randomized trials. Educational Researcher, 50(3), 145–156.

Tipton, E. (2022). Sample selection in randomized trials with multiple target populations. American Journal of Evaluation, 43(1), 70–89.

Zhang, Q., Spybrook, J., & Unlu, F. (2020). Examining design and statistical power for planning cluster randomized trials aimed at improving student science achievement and science teacher outcomes. AERA Open, 6(3).