
Groundhog Day? Different test. Same story.

Mark Schneider, Director of IES | November 16, 2022

Amid all the attention garnered by the steep declines in the newly released NAEP scores, another set of indicators of the condition of American education went far less remarked upon: the data contained in the ACT national profile report for the graduating class of 2022, released on October 12, 2022.

The ACT focuses on students at the end of their high school careers and uses test scores as an indicator of both academic achievement and college readiness. This, of course, provides insights NAEP cannot, for two reasons. First, NAEP does not test students in the 12th grade with the same frequency or the same level of detail as it tests students in grades 4 and 8: 12th grade NAEP is currently administered every four years and reports results only at the national level. This minimal coverage persists despite complaints from many that this testing regime deprives the nation of needed information about what our graduating high school students know and can do. Second, despite decades of work, the National Assessment Governing Board (NAGB) has never been able to turn the information we get from the 12th grade NAEP into a widely accepted indicator of college readiness.

Given this lacuna, ACT scores provide a valuable window into a critical transition point for our nation's students. Many of the most disturbing trends we have seen in NAEP are also evident in the ACT results. The data in Table 1 come from Figure 1 of ACT's national profile report.

Table 1: Five-year declines are evident in selected ACT measures

Year                                                | 2017–18 | 2018–19 | 2019–20 | 2020–21 | 2021–22
Average composite score*                            | 20.8    | 20.7    | 20.6    | 20.3    | 19.8
Percent meeting 3 or 4 college readiness benchmarks | 38%     | 37%     | 37%     | 36%     | 32%
Percent meeting STEM benchmarks                     | 20%     | 20%     | 20%     | 19%     | 16%
Percent taking core curriculum**                    | 63%     | 60%     | 56%     | 46%     | 47%

*The maximum on the ACT is 36.
**Four or more years of English and three or more years each of math, social studies, and natural science.

Over the last five years, we see a gradual erosion of average ACT composite scores, accelerating during the pandemic. This pattern is also clear in the percentage of students meeting ACT's overall college readiness benchmarks and in the percentage meeting ACT's STEM benchmarks. Equally disturbing is the sharp decline in students taking what the ACT defines as a "core curriculum": in each of the last two years, fewer than half of our high school students pursued a rigorous high school curriculum. Some of these declines might be attributed to COVID, but year-over-year drops dating back to 2017 are evident in several indicators.

There are strong race/ethnicity differences. Table 2 reports the average 2021-22 ACT scores for the nation's three largest student race/ethnic groups (White, Black/African American, and Hispanic/Latino) for the key subjects of math and science as well as the composite ACT score and STEM score. (The report includes data for English and Reading subject areas. It also reports data for more student groups.)

Table 2: Average ACT Scores 2021–22 by Race/Ethnicity

Subject     | White | Black/African American | Hispanic/Latino
Mathematics | 20.6  | 16.0                   | 17.5
Science     | 21.3  | 16.5                   | 18.0
Composite   | 21.3  | 16.1                   | 17.7
STEM        | 21.2  | 16.5                   | 18.0

These results reinforce what we have seen in NAEP's science and math assessments. Such racial/ethnic gaps in STEM subjects will handicap the nation's ability to produce a large and diverse STEM workforce, which in turn will likely affect the fate of our economy and society.

ACT scores are strongly associated with course-taking patterns—students who take a rigorous core curriculum do better on average than students who don't. For example, the average ACT composite score for students who took a core curriculum was 22 versus 19 for students who did not. Over half (57%) of White students reported having taken a core curriculum versus only 42% of Black/African American students and 39% of Hispanic/Latino students.

But there are large differences in academic performance even after considering completion of a core curriculum. For example, 52% of White students who pursued a core curriculum met 3 or 4 college readiness benchmarks. This proportion falls to just 15% of Black/African American students and 32% of Hispanic/Latino students who pursued a core curriculum. Clearly, Black/African American and Hispanic/Latino students are (a) less likely to have completed a rigorous curriculum, and (b) even when they have, their education is not producing the same level of achievement as it does for White students. Researchers who have analyzed the Department's Civil Rights Data Collection have noted disparities in access to rigorous course offerings in ways that systematically disadvantage low-income students and students of color. We have seen similar course-taking patterns in NAEP's high school transcript studies, which have also documented that the curriculum delivered in schools is all too often a weak version of the rigorous titles assigned to the courses students take.

The ACT differs from NAEP in some crucial ways: while NAEP is designed specifically for identifying what students know and can do (and how that compares to what students historically have known and could do), the ACT is designed to help a subset of colleges make admissions decisions about college-aspiring students at a specific moment in time. As a result, compared to NAEP, the ACT is less likely to inform national dialogue and policy.

Further, while NAEP is administered to a nationally representative sample of students, the ACT is not. In some states, every high school student takes the ACT; in others, students opt in. Some selective colleges require the ACT or SAT; others have temporarily or permanently suspended the requirement, meaning that not all college-aspiring students are signing up. And the ACT itself has changed over the years to better meet the needs of colleges and students: beginning just a few years ago, for example, students have been allowed to retake individual sections of the ACT to try to improve their scores. NAEP, on the other hand, has remained steady to allow for easier comparisons over time.

But despite the differences, the story is dismal.

As IES observes its 20th anniversary, we are taking stock of all that has been accomplished in those years to identify interventions, and to support states and districts in using data and evidence, to improve student outcomes. But the ACT data, the NAEP data, the NWEA data, and many other indicators show that far too many American students were already falling behind even before the pandemic. IES' mission is to find what works for whom under what conditions. We will continue the vigorous pursuit of that mission to help ensure that better news will accumulate over the coming years.

As always, I welcome your comments: