NCES Blog

National Center for Education Statistics

NAEP Opens a Pathway to Exploring Student Problem Solving in Assessments

A newly released NCES First Look report, Response Process Data From the NAEP 2017 Grade 8 Mathematics Assessment (NCES 2020-068), introduces a new type of data—response process data—which was made available after the NAEP reading and math assessments transitioned from paper to digitally based assessments in 2017. These new datasets will allow researchers to go beyond analyzing students’ answers to questions as simply right or wrong; instead, researchers will be able to examine the amount of time students spend on questions, the pathways they take through the assessment sections, and the tools they use while solving problems. The new First Look report provides an overview of data that will be available when the restricted-use data files for the 2017 grade 8 mathematics response process data are released later this summer.

NAEP reporting has previously hinted at the promise of response process data. With the release of the 2017 mathematics assessment results, NCES included a feature on The Nation’s Report Card website showing the different steps students took while responding to a question that assessed their multiplication skills. A short video in that feature shows that students used a total of 397 different sequences to group four digits into two factors that yield a given product, with the most popular correct and incorrect answer paths highlighted. Response process data such as those summarized in this example item can open new avenues for understanding how students work through math problems and for identifying the detailed elements of response processes that could lead to common math errors.
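To get a feel for the structure of such an item, the groupings can be enumerated directly. The sketch below is a hypothetical illustration in Python; the digits and target product are invented, not the actual NAEP item or its 397 observed response sequences.

```python
from itertools import permutations

def factor_groupings(digits, target):
    """Return the unordered pairs of factors, formed by splitting an
    arrangement of `digits` into two numbers, whose product is `target`.

    Hypothetical illustration only -- not the actual NAEP item.
    """
    pairs = set()
    for perm in permutations(digits):
        for cut in range(1, len(perm)):  # split the arrangement into two factors
            a = int("".join(map(str, perm[:cut])))
            b = int("".join(map(str, perm[cut:])))
            if a * b == target:
                pairs.add(tuple(sorted((a, b))))
    return sorted(pairs)

# With invented digits 1, 2, 5, 6 and target 672, only 12 x 56 works:
print(factor_groupings([1, 2, 5, 6], 672))  # [(12, 56)]
```

An analysis of response process data would go further, counting not just the valid groupings but every drag-and-drop sequence students actually tried.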



The First Look report describes the forthcoming data files, which will give researchers access to student response process data from two 30-minute blocks of grade 8 mathematics assessment questions (29 test items in total) and a 15-minute survey questionnaire in which students answered questions about their demographic characteristics, opportunities to learn in and outside of school, and educational experiences. Researchers will be able to explore logs of the response process data collected from each student, along with a file containing students’ raw and scored responses, time stamps, and demographics. In addition, researchers can explore a file that summarizes defined features of each student’s interactions with the assessment, such as the number of seconds spent on specific questions or the number of times the calculator was opened.
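As a rough illustration of how summary features can be derived from event logs, the sketch below parses a tiny invented log and computes seconds spent per item and calculator openings. The column names, event names, and values are hypothetical, not the schema of the actual restricted-use files.

```python
import io
import csv
from collections import defaultdict

# Invented log format: one row per logged event. The actual NAEP
# restricted-use files use their own schemas.
SAMPLE_LOG = """student_id,item_id,event,timestamp_ms
S1,M01,enter_item,0
S1,M01,open_calculator,4000
S1,M01,exit_item,52000
S1,M02,enter_item,52000
S1,M02,exit_item,90000
"""

def summarize(log_text):
    """Derive per-item features (seconds on item, calculator opens) from raw events."""
    enter_times = {}
    features = defaultdict(lambda: {"seconds": 0.0, "calc_opens": 0})
    for row in csv.DictReader(io.StringIO(log_text)):
        key = (row["student_id"], row["item_id"])
        t = int(row["timestamp_ms"])
        if row["event"] == "enter_item":
            enter_times[key] = t
        elif row["event"] == "exit_item":
            features[key]["seconds"] += (t - enter_times.pop(key)) / 1000
        elif row["event"] == "open_calculator":
            features[key]["calc_opens"] += 1
    return dict(features)

print(summarize(SAMPLE_LOG))
```

Feature files like the one the report describes would contain the output of this kind of aggregation, precomputed across all students.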

To explore this response process dataset, interested researchers should apply for a restricted-use license and request access to the files through the NCES website. By providing this dataset to a wide variety of researchers, NCES hopes to encourage and enable a new domain of research on developing best practices for the use and interpretation of student response process data.

 

By Jan Marie Alegre and Robert Finnegan, Educational Testing Service

New International Comparisons of Reading, Mathematics, and Science Literacy Assessments

The Program for International Student Assessment (PISA) is a study of 15-year-old students’ performance in reading, mathematics, and science literacy that is conducted every 3 years. The PISA 2018 results provide us with a global view of U.S. students’ performance compared with their peers in nearly 80 countries and education systems. In PISA 2018, the major domain was reading literacy, although mathematics and science literacy were also assessed.

In 2018, the U.S. average score of 15-year-olds in reading literacy (505) was higher than the average score of the Organization for Economic Cooperation and Development (OECD) countries (487). Compared with the 76 other education systems with PISA 2018 reading literacy data, including both OECD and non-OECD countries, the U.S. average reading literacy score was lower than in 8 education systems, higher than in 57 education systems, and not measurably different in 11 education systems. The U.S. percentage of top performers in reading was larger than in 63 education systems, smaller than in 2 education systems, and not measurably different in 11 education systems. The average reading literacy score in 2018 (505) was not measurably different from the average score in 2000 (504), the first year PISA was administered. Among the 36 education systems that participated in both years, 10 education systems reported higher average reading literacy scores in 2018 compared with 2000, and 11 education systems reported lower scores.

The U.S. average score of 15-year-olds in mathematics literacy in 2018 (478) was lower than the OECD average score (489). Compared with the 77 other education systems with PISA 2018 mathematics literacy data, the U.S. average mathematics literacy score was lower than in 30 education systems, higher than in 39 education systems, and not measurably different in 8 education systems. The average mathematics literacy score in 2018 (478) was not measurably different from the average score in 2003 (483), the earliest year with comparable data. Among the 36 education systems that participated in both years, 10 systems reported higher mathematics literacy scores in 2018 compared with 2003, 13 education systems reported lower scores, and 13 education systems reported no measurable changes in scores.  

The U.S. average score of 15-year-olds in science literacy (502) was higher than the OECD average score (489). Compared with the 77 other education systems with PISA 2018 science literacy data, the U.S. average science literacy score was lower than in 11 education systems, higher than in 55 education systems, and not measurably different in 11 education systems. The average science literacy score in 2018 (502) was higher than the average score in 2006 (489), the earliest year with comparable data. Among the 52 education systems that participated in both years, 7 education systems reported higher average science literacy scores in 2018 compared with 2006, 22 education systems reported lower scores, and 23 education systems reported no measurable changes in scores.

PISA is conducted in the United States by NCES and is coordinated by OECD, an intergovernmental organization of industrialized countries. Further information about PISA can be found in the technical notes, questionnaires, list of participating OECD and non-OECD countries, released assessment items, and FAQs.

 

By Thomas Snyder

New Study on U.S. Eighth-Grade Students’ Computer Literacy

In the 21st-century global economy, computer literacy and skills are an important part of an education that prepares students to compete in the workplace. The results of a recent assessment show us how U.S. students compare to some of their international peers in the areas of computer information literacy and computational thinking.

In 2018, the U.S. participated for the first time in the International Computer and Information Literacy Study (ICILS), along with 13 other education systems around the globe. The ICILS is a computer-based international assessment of eighth-grade students that measures outcomes in two domains: computer and information literacy (CIL)[1] and computational thinking (CT).[2] It compares U.S. students’ skills and experiences using technology to those of students in other education systems and provides information on teachers’ experiences, school resources, and other factors that may influence students’ CIL and CT skills.

ICILS is sponsored by the International Association for the Evaluation of Educational Achievement (IEA) and is conducted in the United States by the National Center for Education Statistics (NCES).

The newly released U.S. Results from the 2018 International Computer and Information Literacy Study (ICILS) web report provides information on how U.S. students performed on the assessment compared with students in other education systems and describes students’ and teachers’ experiences with computers.


U.S. Students’ Performance

In 2018, U.S. eighth-grade students’ average score in CIL was higher than the average of participating education systems[3] (figure 1), while the U.S. average score in CT was not measurably different from the average of participating education systems.
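Comparisons like these hinge on whether the difference between two estimated means is large relative to its standard error. A minimal sketch of that test follows; the means and standard errors below are invented, not actual ICILS estimates.

```python
import math

def significantly_different(mean_a, se_a, mean_b, se_b, z_crit=1.96):
    """Two-sided test of a difference between independent means at the
    .05 level: flag the difference when |diff / SE(diff)| >= 1.96.

    Sketch only -- official comparisons use standard errors derived from
    the studies' replicate weights, but the same form of test statistic.
    """
    z = (mean_a - mean_b) / math.sqrt(se_a ** 2 + se_b ** 2)
    return abs(z) >= z_crit

# Invented numbers: a system mean of 531 (SE 2.2) vs. a mean of 519 (SE 1.9)
print(significantly_different(531, 2.2, 519, 1.9))  # True
print(significantly_different(521, 2.2, 519, 1.9))  # False
```

This is why the figures describe some gaps as "not measurably different": the observed difference is smaller than its sampling uncertainty.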

 


Figure 1. Average computer and information literacy (CIL) scores of eighth-grade students, by education system: 2018

p < .05. Significantly different from the U.S. estimate at the .05 level of statistical significance.

¹ Met guidelines for sample participation rates only after replacement schools were included.

² National Defined Population covers 90 to 95 percent of National Target Population.

³ Did not meet the guidelines for a sample participation rate of 85 percent and not included in the international average.

⁴ Nearly met guidelines for sample participation rates after replacement schools were included.

⁵ Data collected at the beginning of the school year.

NOTE: The ICILS computer and information literacy (CIL) scale ranges from 100 to 700. The ICILS 2018 average is the average of all participating education systems meeting international technical standards, with each education system weighted equally. Education systems are ordered by their average CIL scores, from largest to smallest. Italics indicate the benchmarking participants.

SOURCE: International Association for the Evaluation of Educational Achievement (IEA), the International Computer and Information Literacy Study (ICILS), 2018.


 

Given the importance of students’ home environments in developing CIL and CT skills (Fraillon et al. 2019), students were asked how many computers (desktop or laptop) they had at home. In the United States, eighth-grade students with two or more computers at home performed better in both CIL and CT than their U.S. peers with fewer computers (figure 2). The same pattern was observed in all participating countries and education systems.

 


Figure 2. Average computational thinking (CT) scores of eighth-grade students, by student-reported number of computers at home and education system: 2018

p < .05. Significantly different from the U.S. estimate at the .05 level of statistical significance.

¹ Met guidelines for sample participation rates only after replacement schools were included.

² National Defined Population covers 90 to 95 percent of National Target Population.

³ Did not meet the guidelines for a sample participation rate of 85 percent and not included in the international average.

⁴ Nearly met guidelines for sample participation rates after replacement schools were included.

NOTE: The ICILS computational thinking (CT) scale ranges from 100 to 700. The number of computers at home includes desktop and laptop computers. Students with fewer than two computers include students reporting having “none” or “one” computer. Students with two or more computers include students reporting having “two” or “three or more” computers. The ICILS 2018 average is the average of all participating education systems meeting international technical standards, with each education system weighted equally. Education systems are ordered by their average scores of students with two or more computers at home, from largest to smallest. Italics indicate the benchmarking participants.

SOURCE: International Association for the Evaluation of Educational Achievement (IEA), the International Computer and Information Literacy Study (ICILS), 2018.


 

U.S. Students’ Technology Experiences

Among U.S. eighth-grade students, 72 percent reported using the Internet to do research in 2018, and 56 percent reported completing worksheets or exercises using information and communications technology (ICT)[4] every school day or at least once a week. Both of these percentages were higher than the respective ICILS averages (figure 3). The learning activities least frequently reported by U.S. eighth-grade students were using coding software to complete assignments (15 percent) and making video or audio productions (13 percent).

 


Figure 3. Percentage of eighth-grade students who reported using information and communications technology (ICT) every school day or at least once a week, by activity: 2018

p < .05. Significantly different from the U.S. estimate at the .05 level of statistical significance.

¹ Did not meet the guidelines for a sample participation rate of 85 percent and not included in the international average.

NOTE: The ICILS 2018 average is the average of all participating education systems meeting international technical standards, with each education system weighted equally. Activities are ordered by the percentages of U.S. students reporting using information and communications technology (ICT) for the activities, from largest to smallest.

SOURCE: International Association for the Evaluation of Educational Achievement (IEA), the International Computer and Information Literacy Study (ICILS), 2018.


 

Browse the full U.S. Results from the 2018 International Computer and Information Literacy Study (ICILS) web report to learn more about how U.S. students compare with their international peers in their computer literacy skills and experiences.

 

By Yan Wang, AIR, and Linda Hamilton, NCES

 

[1] CIL refers to “an individual's ability to use computers to investigate, create, and communicate in order to participate effectively at home, at school, in the workplace, and in society” (Fraillon et al. 2019).

[2] CT refers to “an individual’s ability to recognize aspects of real-world problems which are appropriate for computational formulation and to evaluate and develop algorithmic solutions to those problems so that the solutions could be operationalized with a computer” (Fraillon et al. 2019). CT was an optional component in 2018. Nine out of 14 ICILS countries participated in CT in 2018.

[3] U.S. results are not included in the ICILS international average because the U.S. school-level response rate of 77 percent was below the international requirement of an 85 percent participation rate.

[4] Information and communications technology (ICT) can refer to desktop computers, notebook or laptop computers, netbook computers, tablet devices, or smartphones (except when being used for talking and texting).

 

Reference

Fraillon, J., Ainley, J., Schulz, W., Duckworth, D., and Friedman, T. (2019). IEA International Computer and Information Literacy Study 2018: Assessment Framework. Cham, Switzerland: Springer. Retrieved October 7, 2019, from https://link.springer.com/book/10.1007%2F978-3-030-19389-8.