Project Activities
People and institutions involved
IES program contact(s)
Products and publications
ERIC Citations: Citations associated with this award are available in ERIC.
Project Website: https://www.generalizability.org/
Additional Online Resources and Information:
- https://cran.r-project.org/web/packages/sitepickR/index.html - an R package for selecting districts and schools using the cube method (a brief illustrative sketch follows this list)
- https://github.com/select-sites/exval - a SAS macro for stratifying a population frame of schools using cluster analysis and a SAS macro for selecting stratified random and stratified balanced samples of districts and schools
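For readers who want to experiment with cube-method balanced selection, a minimal R sketch follows. It uses the general-purpose CRAN package sampling rather than the project's own sitepickR package, and the district frame, variable names, and target sample size are hypothetical.

    # Minimal sketch (assumptions: hypothetical district frame and variables;
    # uses the CRAN 'sampling' package, not the project's sitepickR package).
    library(sampling)
    set.seed(1)

    # Hypothetical population frame of districts with auxiliary variables
    frame <- data.frame(
      district_id = 1:500,
      enrollment  = rpois(500, 5000),
      pct_frpl    = runif(500, 0.1, 0.9),  # share eligible for free/reduced-price lunch
      urban       = rbinom(500, 1, 0.3)
    )

    n   <- 40                                 # target number of districts
    pik <- rep(n / nrow(frame), nrow(frame))  # equal inclusion probabilities

    # Balance on the inclusion probabilities (which fixes the sample size)
    # and on district characteristics
    X <- cbind(pik, frame$enrollment, frame$pct_frpl, frame$urban)

    flag <- samplecube(X, pik, comment = FALSE)  # cube-method selection indicator
    sample_districts <- frame[round(flag) == 1, ]

    # Balance check: sample means should sit close to population means
    colMeans(sample_districts[, c("enrollment", "pct_frpl", "urban")])
    colMeans(frame[, c("enrollment", "pct_frpl", "urban")])

The same kind of balance check can be applied to samples drawn with the project's released tools.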
Select Publications:
Litwok, D., Nichols, A., Shivji, A., & Olsen, R. B. (2023). Selecting districts and schools for impact studies in education: A simulation study of different strategies. Journal of Research on Educational Effectiveness, 16(3), 501-531.
Related projects
Supplemental information
Co-Principal Investigator: Bell, Stephen
- In a peer-reviewed article, the research team published details on the external validity bias that results from different approaches to selecting sites (e.g., districts and schools) and to replacing sites that decline to participate. Their study found evidence that both random and balanced site selection yield impact estimates with less external validity bias and lower mean squared error than purposive site selection, even when a large share of sites decline to participate (Litwok, Nichols, Shivji, & Olsen, 2023).
- Software (an R package and two SAS macros) implementing different approaches to selecting and replacing sites (random and balanced site selection) was developed and released (https://www.generalizability.org/).
Statistical/Methodological Product: The research team provided results from testing the performance of different methods for selecting more representative samples of districts and schools for impact studies with a focus on (1) random site selection and (2) balanced site selection designed to minimize differences in characteristics between the sample and the population. They also developed and released software to support each type of site selection.
Development/Refinement Process: The study involved simulation research using data from the Common Core of Data and the Head Start Impact Study. From these data, repeated samples were selected using various site selection and replacement methods. The external validity bias of each site selection method was calculated by subtracting the population average treatment effect from the mean sample average treatment effect across the simulated samples.
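As an illustration only, the following R sketch mirrors that bias calculation on fully simulated data rather than the Common Core of Data or Head Start Impact Study files; the site characteristic, treatment-effect model, sample sizes, and purposive-selection rule are hypothetical assumptions.

    # Minimal sketch (assumptions: simulated sites and effects, not the study's data)
    set.seed(2023)

    n_sites <- 1000
    x       <- rnorm(n_sites)        # site characteristic (e.g., size or poverty rate)
    tau     <- 0.20 + 0.10 * x       # site-level treatment effect depends on x
    pop_ate <- mean(tau)             # population average treatment effect

    n_sample <- 60                   # sites per simulated study
    n_sims   <- 2000                 # number of simulated samples

    # Random site selection: simple random samples of sites
    ate_random <- replicate(n_sims, mean(tau[sample(n_sites, n_sample)]))

    # Purposive-style selection (illustrative): sites with larger x more likely to join
    p_join        <- plogis(1.5 * x)
    ate_purposive <- replicate(n_sims,
                               mean(tau[sample(n_sites, n_sample, prob = p_join)]))

    # External validity bias: mean sample ATE across simulations minus population ATE
    bias_random    <- mean(ate_random)    - pop_ate
    bias_purposive <- mean(ate_purposive) - pop_ate

    # Mean squared error combines bias and sampling variability
    mse_random    <- mean((ate_random    - pop_ate)^2)
    mse_purposive <- mean((ate_purposive - pop_ate)^2)

In this toy setup the purposive rule over-represents high-x sites and therefore produces a larger bias than random selection; the published simulation study examines this kind of comparison with real data and with balanced as well as random selection.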
Questions about this project?
To answer additional questions about this project or provide feedback, please contact the program officer.