Grant Closed

A User-Friendly Tool for Designing Cluster Randomized Trials with Power and Relevance

NCER
Program: Statistical and Research Methodology in Education
Program topic(s): Core
Award amount: $893,737
Principal investigator: Elizabeth Tipton
Awardee: Northwestern University
Year: 2017
Award period: 3 years (07/01/2017 - 06/30/2020)
Project type: Methodological Innovation
Award number: R305D180002

Purpose

The purpose of this project was to develop a user-friendly webtool for planning cluster randomized trials (CRTs) based on generalizability and on statistical power for testing moderator effects, not just average treatment effects. Funding institutions and methodological advances have begun to shift the focus of causal impact studies beyond detecting average treatment impacts toward understanding the moderators of these impacts and detecting effects for specific inference populations, but applied researchers have had few tools for designing studies around these factors. The findings and products from this study will help education researchers design CRTs with both high internal and external validity, thereby increasing the relevance of applied education research results.
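The kind of power planning the webtool supports can be illustrated with a standard normal-approximation calculation for a balanced two-level CRT. This sketch is not part of the project's Generalizer software; the function name and the simplified formula (no covariate adjustment, no degrees-of-freedom correction) are illustrative assumptions.

```python
import math
from statistics import NormalDist

def crt_power(delta, n_clusters, cluster_size, icc, alpha=0.05):
    """Approximate two-sided power of a balanced two-level cluster
    randomized trial (half the clusters treated), using a normal
    approximation and no covariate adjustment.

    delta        -- hypothesized standardized effect size
    n_clusters   -- total number of clusters (e.g., schools)
    cluster_size -- individuals per cluster (e.g., students)
    icc          -- intraclass correlation (share of variance between clusters)
    """
    # Standard error of the treatment-effect estimate in effect-size units;
    # clustering inflates the variance through the ICC (the "design effect").
    se = math.sqrt(4 * (icc + (1 - icc) / cluster_size) / n_clusters)
    z_crit = NormalDist().inv_cdf(1 - alpha / 2)
    return NormalDist().cdf(delta / se - z_crit)

# With a nontrivial ICC, adding clusters raises power more than
# enlarging the clusters you already have.
p_base = crt_power(0.25, n_clusters=40, cluster_size=20, icc=0.20)
p_more_clusters = crt_power(0.25, n_clusters=80, cluster_size=20, icc=0.20)
p_bigger_clusters = crt_power(0.25, n_clusters=40, cluster_size=40, icc=0.20)
```

Power for a school-level moderator contrast degrades even faster with few clusters, which is one reason the project emphasized planning for moderator tests explicitly rather than assuming the main-effect sample size suffices.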

Project Activities

The research team gathered data on the design practices used in Efficacy, Replication, and Effectiveness studies funded through the Education Research Grants Program at the Institute of Education Sciences (IES). These data came from interviews with 37 principal investigators on grants funded by IES between 2010 and 2015. The researchers developed new methods to bridge gaps among what are typically treated as three separate design considerations: generalizability, power to detect the average treatment effect, and power to detect moderator effects. They also developed software so that applied researchers could address these considerations together. The software went through multiple stages of usability testing, and the final product is free and publicly available. The researchers disseminated their findings through peer-reviewed journal articles, conference presentations, and workshops.

Key outcomes

The main findings of this project are as follows:

  • The project team developed new methods for designing studies, including methods for ensuring adequate power for tests of hypotheses about moderators and methods for planning studies with multiple target estimands, such as multiple populations or moderator effects (Dong, Kelcey, & Spybrook, 2018; Zhang, Spybrook, & Unlu, 2020; Tipton, 2022).
  • From the information collected on common practices regarding recruitment, research designs, and moderator analyses in IES-funded efficacy studies in education between 2010 and 2015, the study team found the following (Spybrook et al., 2020; Tipton et al., 2021):
    • Studies did not clearly report the population they were trying to represent or to whom their results would generalize.
    • Most study samples did not represent populations of schools beyond the districts in which they were located.
    • Schools from large school districts were overrepresented relative to the population of public schools nationwide.
    • The samples were more homogeneous than the population.
    • The research designs had adequate power for tests of individual-level moderators but not school-level moderators.

People and institutions involved

IES program contact(s)

Allen Ruby

Project contributors

Jessaca Spybrook (co-principal investigator)

Products and publications

ERIC Citations: Find available citations in ERIC for this award.

Publicly Available Data:

  • The project team developed an R package, generalizeR, which is available from Northwestern University and will be submitted to CRAN.

Additional online resources and information:

  • The project team developed a user-friendly webtool for planning randomized trials in education that integrates planning for generalizability and planning for statistical power to support the design of RCTs with both high internal and external validity. This free webtool is available at The Generalizer (www.thegeneralizer.org).
  • Papers and resources related to the project can be found on the STEPP Center website at https://steppcenter.northwestern.edu/.

Select Publications:

Dong, N., Kelcey, B., & Spybrook, J. (2018). Power analyses for moderator effects in three-level cluster randomized trials. The Journal of Experimental Education, 86(3), 489-514.

Spybrook, J., Zhang, Q., Kelcey, B., & Dong, N. (2020). Learning from cluster randomized trials in education: An assessment of the capacity of studies to determine what works, for whom, and under what conditions. Educational Evaluation and Policy Analysis, 42(3), 354-374.

Tipton, E. (2021). Beyond generalization of the ATE: Designing randomized trials to understand treatment effect heterogeneity. Journal of the Royal Statistical Society: Series A, 184(2), 504-521.

Tipton, E., Spybrook, J., Fitzgerald, K. G., Wang, Q., & Davidson, C. (2021). Toward a system of evidence for all: Current practices and future opportunities in 37 randomized trials. Educational Researcher, 50(3), 145-156.

Tipton, E. (2022). Sample selection in randomized trials with multiple target populations. American Journal of Evaluation, 43(1), 70-89.

Zhang, Q., Spybrook, J., & Unlu, F. (2020). Examining design and statistical power for planning cluster randomized trials aimed at improving student science achievement and science teacher outcomes. AERA Open, 6(3).

Additional project information

Previous award details:

Previous award number: R305D170024
Previous awardee: Teachers College, Columbia University

Questions about this project?

To answer additional questions about this project or provide feedback, please contact the program officer.


Tags

Data and Assessments, Mathematics


