Products and publications
Select Publications:
Bush-Mecenas, S., Schweig, J. D., Kuhfeld, M., Mariano, L., & Diliberti, M. (2023). Research, interrupted: Addressing practical and methodological challenges under turbulent conditions. Santa Monica, CA: RAND Corporation. WRA1037-1.
Kuhfeld, M., Diliberti, M., McEachin, A., Schweig, J. D., & Mariano, L. T. (2023). Typical learning for whom? Guidelines for selecting benchmarks to calculate months of learning. NWEA Research.
Schweig, J., Kuhfeld, M., Diliberti, M., McEachin, A., & Mariano, L. (2022b). Changes in school composition during the COVID-19 pandemic: Implications for school-average interim test score use. RAND Research Report. RR-A1037-2.
Schweig, J., McEachin, A., & Kuhfeld, M. (2020, December 16). Addressing COVID-19's disruption of student assessment. Inside IES Research Blog.
Schweig, J., McEachin, A., Kuhfeld, M., Mariano, L., & Diliberti, M. (2021). Adapting course placement processes in response to COVID-19 disruptions: Guidance for schools and districts. RAND Research Report. RR-A1037-1.
Schweig, J., McEachin, A., Kuhfeld, M., Mariano, L., & Diliberti, M. (2022a). Allocating resources for COVID-19 recovery: A comparison of three indicators of school need. Educational Assessment, 27(2), 152-169.
Supplemental information
Co-Principal Investigators: Kuhfeld, Megan; McEachin, Andrew; Mariano, Louis
The project produced findings in three areas:
- Day-to-day instructional and other student-related decisions
- School and teacher accountability systems
- Applied education research and program evaluation
Regarding day-to-day instructional and other student-related decisions, including course placement (Schweig et al. 2021):
- Missing assessment data complicates course placement processes. Schools and systems used a variety of approaches to replace missing assessment data (e.g., simple replacement, multiple imputation, and regression-based score prediction), and the choice of approach plays a key role in course placement decisions.
- The three replacement strategies could yield consistent course placement decisions, although much depends on the district context.
- Due to variation in school quality, assuming average school quality when using regression-based methods can either overestimate or underestimate some students' future achievement, and this misestimation is problematic for course placement decisions.
- There is evidence of differential method performance by student race and ethnicity and by school poverty, particularly for regression-based methods.
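The three replacement strategies named above can be sketched on synthetic data. This is a hypothetical illustration, not the project's code: the scores, the cutoff of 210, and the helper names (`predict`, `impute`) are all assumptions made for the example.

```python
# Hypothetical sketch: three ways to replace a missing test score before
# applying a course placement cutoff (simple replacement, multiple
# imputation, regression-based prediction). All numbers are synthetic.
import random
import statistics

random.seed(0)

# Synthetic students with both a prior-year and a current-year score.
prior = [random.gauss(200, 15) for _ in range(200)]
current = [0.8 * p + 40 + random.gauss(0, 8) for p in prior]

# Strategy 1: simple replacement -- substitute the observed mean score.
mean_current = statistics.fmean(current)

# Strategy 3: regression-based prediction from the prior-year score
# (ordinary least squares fit computed by hand).
mp, mc = statistics.fmean(prior), statistics.fmean(current)
slope = (sum((p - mp) * (c - mc) for p, c in zip(prior, current))
         / sum((p - mp) ** 2 for p in prior))
intercept = mc - slope * mp

def predict(prior_score):
    """Regression-predicted current-year score."""
    return intercept + slope * prior_score

# Strategy 2: multiple imputation -- draw several plausible scores by
# adding residual noise to the regression prediction, then pool them.
resid_sd = statistics.pstdev(c - predict(p) for p, c in zip(prior, current))

def impute(prior_score, m=20):
    draws = [predict(prior_score) + random.gauss(0, resid_sd) for _ in range(m)]
    return statistics.fmean(draws)

cutoff = 210  # hypothetical cutoff for placement into an advanced course
student_prior = 215
print("simple replacement:", mean_current >= cutoff)
print("regression:", predict(student_prior) >= cutoff)
print("multiple imputation:", impute(student_prior) >= cutoff)
```

For a student near the cutoff, the three strategies can disagree: simple replacement ignores the student's own history, while regression and multiple imputation condition on it, which is consistent with the finding above that the choice of method matters for placement decisions.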
Regarding school and teacher accountability systems, including the incorporation of COVID-19 vulnerability information into resource allocation decisions and the influence of pandemic-era changes in school composition on school-level accountability measures (Schweig et al. 2022a, 2022b):
- School poverty more strongly predicts performance and progress during the pandemic than pre-COVID-19 academic measures.
- In elementary schools, pandemic vulnerability independently predicts performance and progress even when conditioning on poverty and pre-pandemic achievement.
- Of the indicators of poverty, the percentage of free and reduced-price lunch-eligible students is the strongest predictor of performance and progress during the pandemic.
- Within and among districts, there was wide variability in the percentage of students who attended the same schools and participated in MAP Growth assessments over 2 academic years.
- Participation in MAP Growth assessments was uneven in 2020–2021. In particular, students of color were less likely to have attended the same schools and participated in MAP Growth assessments over 2 academic years than were White students.
- Historically higher achieving students who participated in assessments in a given year were generally more likely than their peers to have attended the same schools and participated in MAP Growth assessments over 2 academic years.
- Schools serving high-poverty communities and communities vulnerable to COVID-19 had systematically fewer students attend the same school and participate in MAP Growth assessments over 2 academic years than other schools.
Regarding applied education research and program evaluation (Bush-Mecenas et al. 2023):
- Researchers described three practical challenges in conducting efficacy research during the pandemic period: (1) issues with intervention feasibility caused by situational complexity, (2) difficulty with study recruitment, and (3) issues with data availability and concerns about data quality.
- Researchers needed to modify study timelines.
- Researchers addressed recruitment challenges by focusing on partnerships and allocating funding to support staffing and incentives.
- Researchers struggled to strike a balance between the evaluations that were intended and those that could realistically be accomplished.
- Researchers often made pivots and adaptations that addressed threats to the internal validity of their studies. These pivots and adaptations also had the unintended consequence of raising other threats to validity.
- Concerns about generalizability and extrapolation were less of a priority for researchers during the pandemic.
Issue Examined: The COVID-19 pandemic disrupted state assessment programs, creating methodological and decision-making challenges for applied education researchers, policymakers, and school administrators, who typically rely on assessment data to monitor student, school, and program performance and progress. The project team examined the analytical methods used to address gaps in assessment data when testing was not carried out.