Grant Closed

Site Selection When Participation is Voluntary: Improving the External Validity of Randomized Trials

NCER
Program: Statistical and Research Methodology in Education
Program topic(s): Core
Award amount: $899,034
Principal investigator: Robert Olsen
Awardee: Westat
Year: 2019
Project type: Methodological Innovation
Award number: R305D190020

Purpose

Randomized controlled trials (RCTs) in education are typically conducted in districts and schools that were not selected to represent broader populations of policy interest (e.g., all schools nationwide that could have implemented the education intervention). As a result, findings from these studies may not generalize to these populations. In this project, the research team developed products to help education researchers decide whether to use random or balanced sampling in selecting sites for RCTs in education and help education researchers implement their preferred site selection methods.

Project Activities

The project team carried out two sets of activities. First, the team compared how well different methods for selecting sites, and for replacing sites that refuse to participate, control external validity bias in randomized trials of educational interventions. Second, the team developed and disseminated computer code and other resources that other researchers can use to implement different sampling and analysis methods for improving the generalizability of randomized trials and other impact studies.

People and institutions involved

IES program contact(s)

Allen Ruby

Associate Commissioner for Policy and Systems
NCER

Products and publications

ERIC Citations: Available citations for this award can be found in ERIC.

Project Website: https://www.generalizability.org/

Additional Online Resources and Information:

  • https://cran.r-project.org/web/packages/sitepickR/index.html - an R package for selecting districts and schools using the cube method
  • https://github.com/select-sites/exval - a SAS macro for stratifying a population frame of schools using cluster analysis and a SAS macro for selecting stratified random and stratified balanced samples of districts and schools

Select Publications:

Litwok, D., Nichols, A., Shivji, A., & Olsen, R. B. (2023). Selecting districts and schools for impact studies in education: A simulation study of different strategies. Journal of Research on Educational Effectiveness, 16(3), 501-531.


Related projects

Testing Different Methods of Improving the External Validity of Impact Evaluations in Education

R305D100041

Supplemental information

Co-Principal Investigator: Stephen Bell

  • In a peer-reviewed article, the research team published details on the external validity bias that results from different approaches to selecting sites (e.g., districts and schools) and replacing sites that decline to participate. The study found evidence that both random and balanced site selection yield impact estimates with less external validity bias and lower mean squared error than purposive site selection, even when a large share of sites decline to participate (Litwok, Nichols, Shivji, & Olsen, 2023).
  • Software to implement random and balanced approaches to selecting and replacing sites (an R package and two SAS macros) was developed and released (https://www.generalizability.org/).

Statistical/Methodological Product: The research team provided results from testing the performance of different methods for selecting more representative samples of districts and schools for impact studies with a focus on (1) random site selection and (2) balanced site selection designed to minimize differences in characteristics between the sample and the population. They also developed and released software to support each type of site selection.
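To illustrate the distinction between the two approaches, the sketch below uses plain Python with a made-up frame of schools and two hypothetical characteristics (enrollment and free/reduced-price-lunch percentage). The "balanced" selection here is a simple rerandomization-style stand-in (keep the best of many candidate random draws), not the cube method that the project's released software actually implements:

```python
import random
import statistics

random.seed(0)

# Hypothetical frame of 500 schools with two characteristics.
frame = [{"enrollment": random.gauss(400, 120),
          "frl_pct": random.uniform(0.1, 0.9)} for _ in range(500)]

def imbalance(sample, pop):
    """Sum of absolute standardized mean differences, sample vs. frame."""
    total = 0.0
    for key in ("enrollment", "frl_pct"):
        pop_vals = [s[key] for s in pop]
        samp_vals = [s[key] for s in sample]
        diff = statistics.mean(samp_vals) - statistics.mean(pop_vals)
        total += abs(diff) / statistics.stdev(pop_vals)
    return total

# Random selection: a single simple random sample of 40 schools.
random_sample = random.sample(frame, 40)

# Balanced selection (rerandomization-style sketch): draw many candidate
# random samples and keep the one that best matches the frame on the
# measured characteristics.
candidates = [random_sample] + [random.sample(frame, 40) for _ in range(500)]
balanced_sample = min(candidates, key=lambda s: imbalance(s, frame))

print(f"random sample imbalance:   {imbalance(random_sample, frame):.3f}")
print(f"balanced sample imbalance: {imbalance(balanced_sample, frame):.3f}")
```

Both samples are drawn at random, so both are unbiased in expectation; the balanced sample additionally guarantees that the realized sample resembles the frame on the characteristics used for balancing.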

Development/Refinement Process: The study involved simulation research using data from the Common Core of Data and the Head Start Impact Study. From these data, repeated samples were selected using various site selection and replacement methods. The external validity bias of each site selection method was calculated by subtracting the population average treatment effect from the mean sample average treatment effect across the simulated samples.
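A stripped-down version of that bias calculation can be sketched in Python. The population below is invented for illustration (site-level treatment effects are made to depend on site size), so purposive selection of the largest sites is biased while repeated simple random sampling is not:

```python
import random
import statistics

random.seed(42)

# Hypothetical population of 1,000 sites; each site's treatment effect
# depends on a site characteristic ("size"), so unrepresentative samples
# produce external validity bias.
population = [{"size": size, "effect": 0.10 + 0.002 * size}
              for size in (random.gauss(50, 15) for _ in range(1000))]
pate = statistics.mean(s["effect"] for s in population)  # population ATE

# Draw repeated simple random samples of 60 sites; record each sample ATE.
sample_ates = [
    statistics.mean(s["effect"] for s in random.sample(population, 60))
    for _ in range(2000)
]

# External validity bias = mean sample ATE minus population ATE.
random_bias = statistics.mean(sample_ates) - pate

# Purposive selection (always taking the largest sites) is biased upward here,
# because effect size was constructed to grow with site size.
largest = sorted(population, key=lambda s: s["size"], reverse=True)[:60]
purposive_bias = statistics.mean(s["effect"] for s in largest) - pate

print(f"random-selection bias:    {random_bias:+.4f}")
print(f"purposive-selection bias: {purposive_bias:+.4f}")
```

The actual study computed these quantities from real frames (the Common Core of Data and the Head Start Impact Study) rather than a synthetic population, but the bias formula is the same subtraction shown above.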

Questions about this project?

To answer additional questions about this project or provide feedback, please contact the program officer.

 

Tags

Mathematics; Data and Assessments
