NCES Blog

National Center for Education Statistics

New International Data Show Large and Widening Gaps Between High- and Low-Performing U.S. 4th- and 8th-Graders in Mathematics and Science

NCES recently released results from the 2019 Trends in International Mathematics and Science Study (TIMSS). TIMSS tests students in grades 4 and 8 in mathematics and science every 4 years. The results show that

  • Across both subjects and grades, the United States scored, on average, in the top quarter of the education systems that took part in TIMSS 2019.
    • Among the 64 education systems that participated at grade 4, the United States ranked 15th and 8th in average mathematics and science scores, respectively.
    • Among the 46 education systems that participated at grade 8, the United States ranked 11th in average scores for both subjects.
  • On average, U.S. scores did not change significantly between the 2011 and 2019 rounds of TIMSS.

Average scores are one measure of achievement in national and international studies, but they offer only a narrow view of student performance. One way to look more broadly is to examine differences in scores (or “score gaps”) between high-performing and low-performing students. Score gaps between high and low performers can be one indication of equity within an education system. Here, high performers are those who scored at or above the 90th percentile (the top 10 percent) within their education system, and low performers are those who scored at or below the 10th percentile (the bottom 10 percent) within their education system.
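As a rough illustration of the measure, a score gap is simply the difference between the 90th and 10th percentile scores within an education system. The sketch below computes such a gap for a small set of made-up scores (these are not TIMSS data, and TIMSS's own plausible-value methodology is more involved):

```python
# Minimal sketch: a 90th-to-10th percentile "score gap".
# The scores below are illustrative, not actual TIMSS data.
def percentile(scores, p):
    """Linear-interpolation percentile (0 <= p <= 100)."""
    s = sorted(scores)
    k = (len(s) - 1) * p / 100
    lo, hi = int(k), min(int(k) + 1, len(s) - 1)
    return s[lo] + (s[hi] - s[lo]) * (k - lo)

def score_gap(scores):
    """Difference between the 90th and 10th percentile scores."""
    return percentile(scores, 90) - percentile(scores, 10)

scores = [385, 410, 450, 480, 500, 520, 545, 575, 610, 642]
print(round(score_gap(scores)))  # prints 206
```

A wider gap means the top and bottom of the distribution are further apart, even if the average is unchanged.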

In 2019, while some education systems had a higher average TIMSS score than the United States, none of these education systems had a wider score gap between their high and low performers than the United States. This was true across both subjects and grades.

Figure 1 shows an example of these findings using the grade 8 mathematics data. The figure shows that 17 education systems had average scores that were higher than or not statistically different from the U.S. average score.

  • Of these 17 education systems, 13 had smaller score gaps between their high and low performers than the United States. The score gaps in 4 education systems (Singapore, Chinese Taipei, the Republic of Korea, and Israel) were not statistically different from the score gap in the United States.
  • The score gaps between the high and low performers in these 17 education systems ranged from 170 points in Quebec, Canada, to 259 points in Israel. The U.S. score gap was 256 points.
  • If you are interested in the range in the score gaps for all 46 education systems in the TIMSS 2019 grade 8 mathematics assessment, see Figure M2b of the TIMSS 2019 U.S. Highlights Web Report, released in December 2020. This report also includes these results for grade 8 science and both subjects at the grade 4 level.

Figure 1. Average scores and 90th to 10th percentile score gaps of grade 8 students on the TIMSS mathematics scale, by education system: 2019

NOTE: This figure presents only those education systems whose average scores were similar to or higher than the U.S. average score. Scores are reported on a scale of 0 to 1,000 with a TIMSS centerpoint of 500 and standard deviation of 100.

SOURCE: International Association for the Evaluation of Educational Achievement (IEA), Trends in International Mathematics and Science Study (TIMSS), 2019.


From 2011 to 2019, U.S. average scores did not change significantly. However, the scores of low performers decreased, and score gaps between low and high performers grew wider in both subjects and grades. In addition, at grade 8, there was an increase in the scores of high performers in mathematics and science over the same period. These two changes contributed to the widening gaps at grade 8.

Figure 2 shows these results for the U.S. grade 8 mathematics data. Average scores in 2011 and 2019 were not significantly different. However, the score of high performers increased from 607 to 642 points between 2011 and 2019, while the score of low performers decreased from 409 to 385 points. As a result, the score gap widened from 198 to 256 points between 2011 and 2019. In addition, the 2019 score gap for grade 8 mathematics was significantly wider than the gaps for all previous administrations of TIMSS.
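Comparisons like these are judged against sampling error, not just the raw point difference. A common form of the test divides the difference between two estimates by the square root of the sum of their squared standard errors. The sketch below illustrates that logic; the standard errors are hypothetical, not the published TIMSS values:

```python
import math

# Sketch of the kind of significance test used to compare two
# assessment estimates: z = (a - b) / sqrt(se_a^2 + se_b^2).
# The standard errors below are hypothetical, for illustration only.
def significantly_different(est_a, se_a, est_b, se_b, z_crit=1.96):
    z = (est_a - est_b) / math.sqrt(se_a ** 2 + se_b ** 2)
    # True -> the difference is unlikely to be due to sampling error alone
    return abs(z) > z_crit

# Hypothetical comparison of two 90th percentile scores:
print(significantly_different(642, 4.8, 607, 4.1))  # prints True
```

With small standard errors, even modest score differences can be statistically significant; with large ones, an apparently sizable difference may not be.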


Figure 2. Trends in average scores and selected percentile scores of U.S. grade 8 students on the TIMSS mathematics scale: Selected years, 1995 to 2019

* p < .05. Significantly different from the 2019 estimate at the .05 level of statistical significance.

NOTE: Scores are reported on a scale of 0 to 1,000 with a TIMSS centerpoint of 500 and standard deviation of 100.

SOURCE: International Association for the Evaluation of Educational Achievement (IEA), Trends in International Mathematics and Science Study (TIMSS), 1995, 1999, 2003, 2007, 2011, 2015, 2019.


These TIMSS findings provide insights regarding equity within the U.S. and other education systems. Similar results from the National Assessment of Educational Progress (NAEP) show that mathematics scores at both grades 4 and 8 decreased or did not change significantly between 2009 and 2019 for lower performing students, while scores increased for higher performing students. More national and international research on the gap between high- and low-performing students could help inform important education policy decisions that aim to address these growing performance gaps.

To learn more about TIMSS and the 2019 U.S. and international results, check out the TIMSS 2019 U.S. Highlights Web Report and the TIMSS 2019 International Results in Mathematics and Science. Registration is also open for an upcoming RISE Webinar on February 24, 2021 (What Do TIMSS and NAEP Tell Us About Gaps Between High- and Low-Performing 4th and 8th Graders?) that explores these topics further.  

 

By Katie Herz, AIR; Marissa Hall, AIR; and Lydia Malley, NCES

Due to COVID Pandemic, NCES to Delay National Assessment of Educational Progress (NAEP) Assessment

Due to the impact of the COVID pandemic on school operations, it will not be possible for NCES to conduct the National Assessment of Educational Progress (NAEP) assessments in accordance with the statutory requirements of the Education Sciences Reform Act (ESRA), which requires NAEP to be conducted in a valid and reliable manner every 2 years (20 U.S.C. 9622(b)(2)(B)).

NCES has been carefully monitoring physical attendance patterns in schools across the country. I have determined that NCES cannot at this time conduct a national-level assessment (20 U.S.C. 9622(b)(2)(A)) with sufficient validity and reliability to meet the mandate of the law. Too many students are receiving their education through distance learning or are physically attending schools in locations where outside visitors are being kept at a minimum due to COVID levels. The NAEP assessments are a key indicator of educational progress in the United States, with trends going back decades. The change in operations and the lack of access to students to be assessed mean that NAEP would not be able to produce estimates of what students know and can do that would be comparable to either past or future national or state estimates.




As Commissioner for Education Statistics, I feel it would be in the best interests of the country, and in keeping with the intent of ESRA (20 U.S.C. 9622(b)(2)(B)), to postpone the next NAEP collection to 2022. By postponing the collection, we allow time for conditions on the ground to stabilize before attempting a large-scale national assessment. Further, if we attempted to move forward with a collection in 2021 and failed to produce estimates of student performance, we would not only have spent tens of millions of dollars but would also be unable, by law, to conduct the next grade 4 and grade 8 reading and mathematics assessments until 2023. By postponing to 2022, we are more likely to get reliable national and state NAEP results close to the statutorily prescribed timeline than if we attempt and fail to collect the data in 2021.

Additionally, delaying the next NAEP assessment to early 2022 will reduce the burden on schools this year, allowing time for states to conduct their own assessments this spring. To produce comparable results, NAEP is administered during the same time window across the country each time it is given. This would be impractical in 2021, as COVID infection rates differ greatly from state to state at any given time. NAEP also uses shared equipment and outside proctors who go into schools to ensure a consistent assessment experience across the nation, and I was understandably concerned about sending outsiders into schools and possibly increasing the risk of COVID transmission.

State assessments, however, generally use existing school staff and equipment, thus eliminating the additional risk associated with NAEP. While it would be ideal to have nationally comparable NAEP data to estimate the impact of the COVID pandemic on educational progress, that is not possible this year; there is still an opportunity to get solid state-by-state data on the impact of COVID on student outcomes. These state-level data can serve as a bridge until spring 2022, when NCES will likely be able to conduct the national NAEP assessment with sufficient validity and reliability.

 

By James L. Woodworth, NCES Commissioner

NCES Releases First-Ever Response Process Dataset—A Rich New Resource for Researchers

The NCES data file National Assessment of Educational Progress (NAEP) Response Process Data From the 2017 Grade 8 Mathematics Assessment (NCES 2020-102; documentation NCES 2020-134) introduces a new type of data—response process data—which was made possible by NAEP’s transition from paper to digitally based assessments in mathematics and reading in 2017. These new datasets allow researchers to go beyond analyzing students’ answers to questions as simply right or wrong; instead, researchers can examine the amount of time students spend on questions, the pathways they take through the assessment sections, and the tools they use while solving problems. 

NAEP reporting has previously hinted at the promise of response process data. With the release of the 2017 mathematics assessment results, NCES included a feature on The Nation’s Report Card website to show the different steps students took while responding to a question that assessed their multiplication skills. The short video below shows that students used a total of 397 different sequences to group four digits into two factors that yield a given product. The most popular correct and incorrect answer paths are shown in the video. Response process data, such as those summarized in this example item, can open new avenues for understanding how students work through math problems and identifying more detailed elements of response processes that could lead to common math errors.
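To give a feel for the underlying math task, the sketch below enumerates the ways four given digits can be grouped into two factors that yield a target product. The digits and target here are made up for illustration; they are not the actual NAEP item, and this enumerates only the valid groupings, not the 397 student response sequences:

```python
from itertools import permutations

# Illustrative sketch (not the actual NAEP item): find every way to
# group four given digits into two factors with a given product.
# Digits and target product are invented for this example.
def factor_groupings(digits, target):
    found = set()
    for perm in permutations(digits):
        for split in (1, 2, 3):  # split the 4 digits into two numbers
            a = int("".join(perm[:split]))
            b = int("".join(perm[split:]))
            if a * b == target:
                found.add(tuple(sorted((a, b))))  # order-independent pair
    return found

print(factor_groupings(("1", "2", "3", "4"), 408))  # prints {(12, 34)}
```

Even a small item like this has many possible interaction paths, which is what makes the recorded response sequences so rich.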



In the newly released data, researchers can access student response process data from two 30-minute blocks of grade 8 mathematics assessment questions (or a total of 29 test items) and a 15-minute survey questionnaire where students responded to questions about their demographic characteristics, opportunities to learn in and outside of school, and educational experiences. Researchers can explore logs of the response process data collected from each student along with a file containing students’ raw responses and scored responses, time stamps, and demographics. In addition, researchers can explore a file that summarizes defined features of students’ interactions with the assessment, such as the number of seconds spent on specific questions or the number of times the calculator was opened across all students.
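The released files follow NCES's own documented formats (NCES 2020-134), but the flavor of working with such process logs can be sketched with a made-up event stream. Every event name, field, and value below is hypothetical, not the actual NAEP log schema:

```python
from collections import defaultdict

# Hypothetical response-process log: (timestamp_seconds, item_id, event).
# The event names and structure are invented for illustration; the real
# NAEP files have their own schema, documented in NCES 2020-134.
events = [
    (0.0,  "item1", "enter_item"),
    (12.5, "item1", "open_calculator"),
    (40.0, "item1", "exit_item"),
    (40.0, "item2", "enter_item"),
    (55.0, "item2", "open_calculator"),
    (70.0, "item2", "open_calculator"),
    (95.5, "item2", "exit_item"),
]

def summarize(events):
    """Derive per-item time spent and calculator-open counts from the log."""
    time_on_item = defaultdict(float)
    calc_opens = defaultdict(int)
    entered = {}
    for t, item, event in events:
        if event == "enter_item":
            entered[item] = t
        elif event == "exit_item":
            time_on_item[item] += t - entered.pop(item)
        elif event == "open_calculator":
            calc_opens[item] += 1
    return dict(time_on_item), dict(calc_opens)

times, opens = summarize(events)
print(times)  # seconds spent per item
print(opens)  # calculator opens per item
```

Summary features like these (seconds per question, tool-use counts) are exactly the kind of derived variables the released feature file provides, saving researchers from reprocessing the raw logs themselves.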

To explore this response process dataset, interested researchers should apply for a restricted-use license and request access to the files through the NCES website. By providing this dataset to a wide variety of researchers, NCES hopes to encourage and enable a new domain of research on developing best practices for the use and interpretation of student response process data.

 

By Jan Marie Alegre and Robert Finnegan, Educational Testing Service

What National and International Assessments Can Tell Us About Technology in Students’ Learning: Technology Instruction, Use, and Resources in U.S. Schools

As schools and school districts plan instruction amid the current coronavirus pandemic, the use of technology and digital resources for student instruction is a key consideration.

In this post, the final in a three-part series, we present results from the NAEP TEL and ICILS educator questionnaires (see the first post for information about the results of the two assessments and the second post for the results of the student questionnaires). The questionnaires ask about the focus of technology instruction in schools, school resources to support technology instruction, and the use of technology in teaching practices.

It is important to note that NAEP TEL surveys the principals of U.S. eighth-grade students, while ICILS surveys a nationally representative sample of U.S. eighth-grade teachers.

Emphasis in technology instruction

According to the 2018 NAEP TEL principal questionnaire results, principals1 of 61 percent of U.S. eighth-grade students reported that, prior to or in eighth grade, a lot of emphasis in information and communication technologies (ICT) instruction was placed on teaching students how to collaborate with others. In addition, principals of 51 percent of eighth-grade students reported that a lot of emphasis was placed on teaching students how to find information or data to solve a problem. In comparison, principals of only 10 percent of eighth-grade students reported that a lot of emphasis was placed on teaching students how to run simulations (figure 1).



According to the 2018 ICILS teacher questionnaire results, 40 percent of U.S. eighth-grade teachers reported a strong emphasis on the use of ICT instruction to develop students’ capacities to use computer software to construct digital work products (e.g., presentations). In addition, 35 percent of eighth-grade teachers reported a strong emphasis on building students’ capacities to access online information efficiently. In comparison, 17 percent reported a strong emphasis on developing students’ capacities to provide digital feedback on the work of others (figure 2).  



Resources at school

NAEP TEL and ICILS used different approaches to collect information about technology-related school resources. NAEP TEL asked about hindrances that limited schools’ capabilities to provide instruction in technology or engineering concepts. According to NAEP TEL, principals of 5 percent of U.S. eighth-grade students indicated that a lack or inadequacy of internet connectivity was a “moderate” or “large” hindrance in their schools. However, principals of 61 percent of eighth-grade students indicated that a lack of time due to curriculum content demands was a “moderate” or “large” hindrance. Principals of 44 percent of eighth-grade students indicated that a lack of qualified teachers was a “moderate” or “large” hindrance (figure 3).



ICILS asked about the adequacy of school resources to support ICT use in teaching. Eighty-six percent of U.S. teachers “agreed” or “strongly agreed” that technology was considered a priority for use in teaching. Nearly three-quarters of teachers “agreed” or “strongly agreed” that their schools had access to sufficient digital learning resources and had good internet connectivity (74 and 73 percent, respectively) (figure 4).



Use of technology in teaching

Teachers of U.S. eighth-grade students reported that they often used technology in their teaching practices. ICILS found that 64 percent of U.S. teachers regularly (i.e., “often” or “always”) used technology to present class instruction. Fifty-four percent of teachers regularly used technology to communicate with parents or guardians about students’ learning. In addition, 45 percent of teachers regularly used technology to provide remedial or enrichment support to individual or small groups of students, and a similar percentage (44 percent) regularly used technology to reinforce skills through repetition of examples (figure 5).



ICILS also reported results from U.S. eighth-grade teachers about how they collaborated on technology use. About three-quarters “agreed” or “strongly agreed” that they talked to other teachers about how to use technology in their teaching. Similarly, about three-quarters “agreed” or “strongly agreed” that they shared technology resources with other teachers in the school. More than half of the teachers “agreed” or “strongly agreed” that they collaborated with colleagues on the development of technology-based lessons.

Overall, the responses of teachers and principals suggested that technology instruction for eighth-grade students emphasized some skills much more than others. Most schools had sufficient digital resources and adequate internet access, but technology was used to varying degrees across different teaching practices.

It should be noted that the data presented here were collected in 2018; any changes since then due to the coronavirus pandemic are not reflected in the results reported here. The NAEP TEL and ICILS samples both include public and private schools. The 2018 ICILS also included a principal questionnaire, but the questions are not directly related to the topics included in this blog. Data reported in the text and figures are rounded to the nearest integer.

 

Resources for more information:

 

By Yan Wang, AIR, and Taslima Rahman, NCES


[1] The unit of analysis for TEL principal responses is the student.

What National and International Assessments Can Tell Us About Technology in Students’ Learning: Eighth-Graders’ Experience with Technology

The use of technology has become an integral part of life at work, at school, and at home throughout the 21st century and, in particular, during the coronavirus pandemic.

In this post, the second in a three-part series, we present results from the NAEP TEL and ICILS student questionnaires about students’ experience and confidence using technology (see the first post for more information about these assessments and their results). These results can help to inform education systems that are implementing remote learning activities this school year.

Uses of information and communication technologies (ICT) for school

Both NAEP TEL and ICILS collected data in 2018 on U.S. eighth-grade students’ uses of ICT in school or for school-related purposes.

According to the NAEP TEL questionnaire results, about one-third of U.S. eighth-grade students reported that they used ICT regularly (i.e., at least once a week) to create, edit, or organize digital media (figure 1). About a quarter used ICT regularly to create presentations, and 18 percent used ICT regularly to create spreadsheets.



According to the ICILS questionnaire results, 72 percent of U.S. eighth-grade students reported that they regularly used the Internet to do research, and 56 percent regularly used ICT to complete worksheets or exercises (figure 2). Forty percent of eighth-grade students regularly used ICT to organize their time and work. One-third regularly used software or applications to learn skills or a subject, and 30 percent regularly used ICT to work online with other students.



Confidence in using ICT

Both the 2018 NAEP TEL and ICILS questionnaires asked U.S. eighth-grade students about their confidence in their ICT skills. NAEP TEL found that about three-quarters of eighth-grade students reported that they were confident that they could—that is, they reported that they “probably can” or “definitely can”—compare products using the Internet or create presentations with sound, pictures, or video (figure 3). Seventy percent were confident that they could organize information into a chart, graph, or spreadsheet.



ICILS found that 86 percent of U.S. eighth-grade students reported that they knew how to search for and find relevant information for a school project on the Internet (figure 4). Eighty-three percent knew how to both upload text, images, or video to an online profile and install a program or app. About three-quarters of eighth-grade students knew how to change the settings on their devices, and 65 percent knew how to edit digital photographs or other graphic images.



Years of experience using computers

In the 2018 ICILS questionnaire, U.S. eighth-grade students were also asked how many years they had been using desktop or laptop computers. One-third of eighth-grade students reported using computers for 7 years or more—that is, they had been using computers since first grade (figure 5). This finding was similar to results from the Computer Access and Familiarity Study (CAFS), which was conducted as part of the 2015 NAEP. The CAFS found that in 2015, about 35 percent of eighth-grade public school students reported first using a laptop or desktop computer in kindergarten or before kindergarten.

Nineteen percent of eighth-grade students reported that they had used computers for at least 5 but less than 7 years. However, 9 percent of eighth-grade students had used computers for less than 1 year or had never used them at all; that is, they had not begun using computers until they reached eighth grade, if ever.



Overall, responses of eighth-grade students showed that some had more years of experience using computers than others. Although there were differences in students’ use of ICT for school-related purposes, most students felt confident using ICT.

It should be noted that the data presented here were collected in 2018; any changes since then due to the coronavirus pandemic or other factors are not reflected in the results reported here. The NAEP TEL and ICILS samples both include public and private schools. Data reported in the text and figures are rounded to the nearest integer.

 

Resources for more information:

 

By Yan Wang, AIR, and Taslima Rahman, NCES