Grant Closed

An Empirical Analysis of Two Methodological Issues for Education Research Caused by Variation in Program Impacts

NCER
Program: Statistical and Research Methodology in Education
Program topic(s): Core
Award amount: $898,875
Principal investigator: Michael Weiss
Awardee: MDRC
Year: 2014
Award period: 3 years (09/01/2014 - 08/31/2017)
Project type: Methodological Innovation
Award number: R305D140012

Purpose

The purpose of this project is to investigate two issues pertinent to variation in treatment effects across sites in a multisite trial. The first issue is the impact of cross-site variation on a study's statistical power to detect the mean treatment effect across all sites. Although the mathematics of accounting for such variation is well established, little empirical evidence exists to help researchers gauge its likely magnitude, and thus how much they need to compensate for it through other aspects of a design. The project team will use data from ten multisite randomized controlled trials of education interventions to estimate the cross-site variability in treatment effects. The findings will provide an empirical starting point for researchers who need to account for cross-site variability when planning studies.
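The adjustment the paragraph alludes to can be sketched as follows. Under the standard random-coefficient model for a multisite trial, cross-site variance in treatment effects (often written tau-squared) adds directly to the variance of the estimated average effect. The function name and the illustrative numbers below are hypothetical, not drawn from this project's data:

```python
import math

def se_avg_effect(n_sites, site_n, p_treat, tau2, sigma2):
    """Standard error of the estimated average treatment effect in a
    multisite trial under a random-coefficient model.

    tau2   -- cross-site variance of treatment effects
    sigma2 -- individual-level outcome variance
    p_treat -- proportion of each site assigned to treatment
    """
    # Variance of a single site's impact estimate, from individual-level noise.
    within_site = sigma2 / (site_n * p_treat * (1 - p_treat))
    # Cross-site effect variance adds to that before averaging over sites.
    return math.sqrt((tau2 + within_site) / n_sites)

# Hypothetical illustration in effect-size units (sigma2 = 1):
# 20 sites, 60 students per site, 50/50 random assignment.
se_homogeneous = se_avg_effect(20, 60, 0.5, tau2=0.0, sigma2=1.0)
se_heterogeneous = se_avg_effect(20, 60, 0.5, tau2=0.05, sigma2=1.0)
```

Ignoring cross-site variation (setting tau2 to zero when it is not) understates the standard error, which is why empirical benchmarks for the plausible size of tau2 matter for power calculations.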

The second issue the researchers will address is individual-level variation in program effects. Regression discontinuity designs have strong internal validity at the cutpoint, but their estimates are conventionally assumed to say nothing about effects elsewhere. Rather than accept that assumption outright, the research team will work to develop a way to gauge how valid the estimates may be at other points. Such generalization is complicated to the extent that treatment effects vary across individuals, so any new approach will need to account for that individual-level variation.

People and institutions involved

IES program contact(s)

Allen Ruby

Project contributors

Howard Bloom

Co-principal investigator
Chief Social Scientist
MDRC

Products and publications

Journal article, monograph, or newsletter

Bloom, H.S., Raudenbush, S.W., Weiss, M.J., and Porter, K. (2017). Using Multisite Experiments to Study Cross-Site Variation in Treatment Effects: A Hybrid Approach With Fixed Intercepts and a Random Treatment Coefficient. Journal of Research on Educational Effectiveness, 10(4), 817-842.

Weiss, M.J., Bloom, H.S., Verbitsky-Savitz, N., Gupta, H., Vigil, A.E., and Cullinan, D.N. (2017). How Much do the Effects of Education and Training Programs Vary Across Sites? Evidence From Past Multisite Randomized Trials. Journal of Research on Educational Effectiveness, 10(4), 843-876.

Questions about this project?

If you have additional questions about this project or would like to provide feedback, please contact the program officer.


Tags

Data and Assessments, Mathematics
