
Improving School Accountability Measures in the Wake of COVID-19: An Opportunity Borne of Necessity

REL Mid-Atlantic
June 21, 2022
By: Brian Gill

School accountability requirements under the Every Student Succeeds Act (ESSA) are back in force in 2022 after a two-year hiatus caused by the pandemic. State assessments were canceled everywhere when COVID-19 hit in 2020. Although the tests were (mostly) back in place by 2021, participation varied widely, and states were not required to hold schools accountable for the results. Federal waivers for both testing and accountability have now ended, and states across the country must identify low-performing schools this fall based on test results and other data from the 2021-2022 school year.

But even with test administration back to normal in spring 2022, reconsidering the details of accountability systems in the wake of the pandemic may be wise--and in many states may be unavoidable. For one thing, nearly all these systems include measures of student growth or value added that track student progress from one year to the next--which isn't possible if 2021 assessments can't serve as a baseline. More broadly, the pandemic's testing disruptions could create additional instability in school performance indicators, making them difficult to interpret.


To address these and other challenges, a group of state agencies has joined a community of practice on accountability measures hosted by REL Mid-Atlantic. One participating state--Pennsylvania--astutely recognized that these challenges offered an opportunity to improve the validity and reliability of the indicators used in its accountability systems over the long term. We've started working with the Pennsylvania Department of Education (PDE) on the challenge of stabilizing the indicators of subgroup performance that are used in identifying schools for targeted support and improvement (TSI) or additional targeted support and improvement (ATSI).

Pennsylvania wants to make sure it does not identify schools for improvement based on data that are unreliable because of random variation. Unstable performance measures can be especially problematic in identifying low-performing schools, which might only appear to be low performing based on bad luck rather than true performance. This issue is exacerbated when the number of students included is small--such as for student subgroups, which are critical in identifying schools for targeted support and improvement under ESSA, not only in Pennsylvania but in every state. Recognizing this problem, PDE asked the REL to help find ways to stabilize some of its performance measures and reduce the risk of erroneously putting a school in improvement status. We are partnering with PDE to explore the possible use of state-of-the-art Bayesian statistical methods to reduce random error and stabilize performance measures for subgroups, which could help ensure that a school is not placed in improvement status based on a statistical fluke.
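The intuition behind this kind of stabilization can be illustrated with a simple empirical Bayes "shrinkage" estimator: a subgroup's observed proficiency rate is pulled toward a statewide rate, and the smaller the subgroup, the stronger the pull. To be clear, this is only an illustrative sketch--it is not PDE's or the REL's actual method, and the function name, prior strength, and all numbers below are hypothetical.

```python
def shrunken_rate(subgroup_proficient, subgroup_n, state_rate, prior_strength):
    """Shrink a subgroup's observed proficiency rate toward the statewide rate.

    This is the posterior mean of a beta-binomial model whose prior is
    centered on state_rate and carries prior_strength "pseudo-students"
    of weight. Small subgroups (noisy estimates) are shrunk heavily;
    large subgroups (precise estimates) are barely moved.
    """
    return (subgroup_proficient + prior_strength * state_rate) / (
        subgroup_n + prior_strength
    )


# A subgroup of 10 students with 2 proficient (20%) in a state where 50%
# of students are proficient: the stabilized estimate lands in between,
# reflecting how little evidence 10 students provide.
small = shrunken_rate(2, 10, 0.50, 20)      # 12/30 = 0.40

# A subgroup of 500 students with 100 proficient (20%): the data dominate
# the prior, so the estimate stays close to the observed 20%.
large = shrunken_rate(100, 500, 0.50, 20)   # 110/520, about 0.21
```

The design choice that matters for accountability is visible in the two calls: the same observed rate (20 percent) yields different stabilized estimates depending on how many students stand behind it, which is exactly what protects a small subgroup from triggering improvement status on a statistical fluke.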

This effort to reduce random error in school performance measures is groundbreaking and could be a powerful way for states across the country to improve accuracy in accountability judgments in 2022 and beyond. Moreover, increasing the reliability of school performance indicators not only produces immediate improvement in the accuracy of accountability determinations, but also might increase the credibility of the accountability system over the long term, as educators come to see that the measures are not subject to large random swings from one year to the next.

As you might guess, we are very excited about the pathbreaking work that Pennsylvania is undertaking. We look forward to telling you more about the results in the months to come. In the meantime, feel free to get in touch if your state (or district or charter authorizer) would like to join our accountability community of practice.

Tags

Academic Achievement, COVID-19, Data and Assessments

Meet the Author

Brian Gill

Director for REL Mid-Atlantic


