IES Blog

Institute of Education Sciences

What National and International Assessments Can Tell Us About Technology in Students’ Learning: Eighth-Graders’ Experience with Technology

The use of technology has become an integral part of life at work, at school, and at home throughout the 21st century and, in particular, during the coronavirus pandemic.

In this post, the second in a three-part series, we present results from the NAEP TEL and ICILS student questionnaires about students’ experience and confidence using technology (see the first post for more information about these assessments and their results). These results can help to inform education systems that are implementing remote learning activities this school year.

Uses of information and communication technologies (ICT) for school

Both NAEP TEL and ICILS collected data in 2018 on U.S. eighth-grade students’ uses of ICT in school or for school-related purposes.

According to the NAEP TEL questionnaire results, about one-third of U.S. eighth-grade students reported that they used ICT regularly (i.e., at least once a week) to create, edit, or organize digital media (figure 1). About a quarter used ICT regularly to create presentations, and 18 percent used ICT regularly to create spreadsheets.



According to the ICILS questionnaire results, 72 percent of U.S. eighth-grade students reported that they regularly used the Internet to do research, and 56 percent regularly used ICT to complete worksheets or exercises (figure 2). Forty percent of eighth-grade students regularly used ICT to organize their time and work. One-third regularly used software or applications to learn skills or a subject, and 30 percent regularly used ICT to work online with other students.



Confidence in using ICT

Both the 2018 NAEP TEL and ICILS questionnaires asked U.S. eighth-grade students about their confidence in their ICT skills. NAEP TEL found that about three-quarters of eighth-grade students were confident—that is, reported that they “probably can” or “definitely can”—that they could compare products using the Internet or create presentations with sound, pictures, or video (figure 3). Seventy percent were confident that they could organize information into a chart, graph, or spreadsheet.



ICILS found that 86 percent of U.S. eighth-grade students reported that they knew how to search for and find relevant information for a school project on the Internet (figure 4). Eighty-three percent knew how to both upload text, images, or video to an online profile and install a program or app. About three-quarters of eighth-grade students knew how to change the settings on their devices, and 65 percent knew how to edit digital photographs or other graphic images.



Years of experience using computers

In the 2018 ICILS questionnaire, U.S. eighth-grade students were also asked how many years they had been using desktop or laptop computers. One-third of eighth-grade students reported using computers for 7 years or more—that is, since first grade or earlier (figure 5). This finding was similar to results from the Computer Access and Familiarity Study (CAFS), which was conducted as part of the 2015 NAEP. The CAFS found that in 2015, about 35 percent of eighth-grade public school students reported first using a laptop or desktop computer in kindergarten or before kindergarten.

Nineteen percent of eighth-grade students reported that they had used computers for at least 5 but less than 7 years. However, 9 percent of eighth-grade students had never used computers or had used them for less than one year, meaning that they either had no computer experience or had begun using computers only within the previous year.



Overall, eighth-grade students’ responses showed wide variation in years of computer experience and in how they used ICT for school-related purposes. Despite these differences, most students reported confidence in their ICT skills.

It should be noted that the data presented here were collected in 2018; any changes since then due to the coronavirus pandemic or other factors are not reflected in the results reported here. The NAEP TEL and ICILS samples both include public and private schools. Data reported in the text and figures are rounded to the nearest integer.


By Yan Wang, AIR, and Taslima Rahman, NCES

What National and International Assessments Can Tell Us About Technology in Students’ Learning: Eighth-Graders’ Readiness to Use Technology

Across the country in 2020, students, teachers, and parents have had to adapt to changes in the delivery of instruction due to the coronavirus pandemic and turn to information and communication technologies (ICT) to learn, interact, and assess progress. Now more than ever, it is important to be able to assess students’ abilities to understand and use technology. This post, the first in a three-part series, discusses the results of two technology-focused assessments, NAEP TEL and ICILS, which NCES administered in 2018 (see textbox for more information about these assessments).

Students’ performance on NAEP TEL and ICILS

According to the 2018 NAEP TEL, 46 percent of U.S. eighth-grade students scored at or above the NAEP Proficient level, meaning that they were able to demonstrate the selection and use of an appropriate range of tools and media (figure 1). According to the 2018 ICILS, 25 percent of eighth-grade students scored at or above proficiency level 3 for computer and information literacy—that is, they demonstrated the capacity to work independently when using computers as information-gathering and management tools. In addition, 20 percent of eighth-grade students scored in the upper region for computational thinking, meaning that they demonstrated an understanding of computation as a problem-solving framework.


Figure 1. Percentage of eighth-grade students identified as at or above proficient, by assessment: 2018


Many factors may affect performance on these assessments, such as access to technology (devices, hardware, and software), access to learning opportunities that use technology, amount of experience using technology, and attitudes toward technology.

Students’ participation in technology-related classes

For NAEP TEL, students were asked whether they were currently taking a technology-related class or had taken one in the past. In 2018, about 57 percent of U.S. eighth-grade students were either currently enrolled in or had taken at least one technology-related class, such as industrial technology, engineering, or a class that involved learning to use, program, or build computers (table 1). In addition, a higher percentage of students who had completed a technology-related class before or during eighth grade scored at or above the NAEP Proficient level, compared with students who had not completed such a class (table 2).




Overall, both assessments show that fewer than half of students demonstrated the knowledge and skills needed to meet each assessment’s defined proficiency benchmark, even though more than half reported taking technology-related classes before or during eighth grade.

It should be noted that the data presented here were collected in 2018; any changes since then due to the coronavirus pandemic or other factors are not reflected in the results reported here. The NAEP TEL and ICILS samples both include public and private schools, but the two assessments use different methods for reporting student performance. Data from these assessments do not support causal inferences because they are not experimental studies. Data reported in the text and figures are rounded to the nearest integer.


By Mary Ann Fox, AIR, and Taslima Rahman, NCES

Teaching with Technology: U.S. Teachers’ Perceptions and Use of Digital Technology in an International Context

The coronavirus pandemic forced teachers across the world to immediately transition instruction to a virtual setting in early 2020. To understand U.S. teachers’ level of preparedness for this shift in an international context, this blog examines recent international data from U.S. teachers’ responses to questions on the following topics:

  • Their perceptions of information and communications technologies (ICT) resources
  • Their use of ICT for instruction prior to the pandemic

In general, the results suggest that U.S. teachers are better resourced with ICT than their international peers and use ICT when teaching at school about as frequently as the international average.

 

Teachers’ perceptions of ICT resources at their school

The quantity and quality of ICT resources available in school systems before the coronavirus pandemic may affect teachers’ access to such resources for instructional purposes while classrooms function in a virtual format. The United States participated in the 2018 International Computer and Information Literacy Study (ICILS), which asked nationally representative samples of eighth-grade teachers in 14 education systems about ICT resources.

The results from this study show that 86 percent of eighth-grade teachers both in the United States and across ICILS 2018 education systems “strongly agreed” or “agreed” that ICT is considered a priority for use in teaching (figure 1). Compared with the ICILS 2018 averages,[1] higher percentages of U.S. eighth-grade teachers “strongly agreed” or “agreed” with various statements about the use of ICT.

While 86 percent of U.S. eighth-grade teachers “strongly agreed” or “agreed” that “ICT is considered a priority for use in teaching,” only 61 percent “strongly agreed” or “agreed” that “there is sufficient opportunity for me to develop expertise in ICT” (figure 1). Additionally, only 62 percent of U.S. eighth-grade teachers “strongly agreed” or “agreed” that “there is enough time to prepare lessons that incorporate ICT.” These gaps may have affected teacher capacity during the sudden shift to fully online learning brought on by the coronavirus pandemic, a question that would be a good topic for future research and analysis.


Figure 1. Percentage of eighth-grade teachers who reported that they “strongly agree” or “agree” with statements about using ICT in teaching at school, by statement: 2018

*p < .05. Significantly different from the U.S. estimate at the .05 level of statistical significance.

¹ Did not meet the guidelines for a sample participation rate of 85 percent and not included in the international average.
² National Defined Population covers 90 to 95 percent of National Target Population.
NOTE: ICT = information and communications technologies. The ICILS 2018 average is the average of all participating education systems meeting international technical standards, with each education system weighted equally. Statements are ordered by the percentages of U.S. teachers reporting “strongly agree” or “agree” from largest to smallest.
SOURCE: International Association for the Evaluation of Educational Achievement (IEA), The International Computer and Information Literacy Study (ICILS), 2018. Modified reproduction of figure 17 from U.S. Results from the 2018 ICILS Web Report.


Teachers’ perceptions of the use of ICT for instruction

Teachers’ views on the role of ICT in virtual instruction during the coronavirus pandemic are not yet clear. However, in 2018, when instruction was conducted in physical classrooms, most U.S. eighth-grade teachers participating in ICILS expressed positive perceptions about “using ICT in teaching and learning at school,” as did many teachers internationally.

Among eighth-grade teachers in the United States, 95 percent agreed that ICT “enables students to access better sources of information,” 92 percent agreed that ICT “helps students develop greater interest in learning,” and 92 percent agreed that ICT “helps students work at a level appropriate to their learning needs.” On average across other education systems participating in ICILS, at least 85 percent of teachers agreed with each of these statements (Fraillon et al. 2019).

Seventy-five percent of U.S. eighth-grade teachers in 2018 agreed that ICT “improves academic performance of students,” which was higher than the ICILS international average of 71 percent. The percentages of teachers who agreed with this statement varied across education systems, from three-quarters or more of teachers in Chile, Denmark, Kazakhstan, and Portugal to less than half of teachers in Finland and North Rhine-Westphalia (Germany).

 

Frequency of teachers’ use of ICT

Teachers’ reported use of ICT for instruction in physical classroom settings may provide insight into their level of experience as they transition to virtual settings during the coronavirus pandemic.

In 2018, half of U.S. eighth-grade teachers reported “using ICT at school when teaching” every day, which was not significantly different from the ICILS average of 48 percent. However, the U.S. percentage was lower than the percentages of teachers in Moscow (76 percent), Denmark (72 percent), and Finland (57 percent) (figure 2).


Figure 2. Percentage of eighth-grade teachers who reported using ICT at school every day when teaching, by education system: 2018

*p < .05. Significantly different from the U.S. estimate at the .05 level of statistical significance.
¹ Met guidelines for sample participation rates only after replacement schools were included.
² National Defined Population covers 90 to 95 percent of National Target Population.
³ Did not meet the guidelines for a sample participation rate of 85 percent and not included in the international average.
⁴ Data collected at the beginning of the school year.
NOTE: ICT = information and communications technologies. The ICILS 2018 average is the average of all participating education systems meeting international technical standards, with each education system weighted equally. Education systems are ordered by their percentages of teachers reporting using ICT at school when teaching from largest to smallest. Italics indicate the benchmarking participants.
SOURCE: International Association for the Evaluation of Educational Achievement (IEA), The International Computer and Information Literacy Study (ICILS), 2018. Modified reproduction of figure 15 from U.S. Results from the 2018 ICILS Web Report.
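The significance notes in these figures reflect comparisons of each education system’s estimate with the U.S. estimate. As a rough, simplified illustration of that kind of comparison, the sketch below tests the difference between two percentage estimates, given their standard errors, with a two-sided z-test. The numbers are placeholders, and actual ICILS comparisons derive standard errors from the complex sample design (for example, jackknife replication), so this should not be read as the official procedure.

```python
# Simplified sketch: is the difference between two survey estimates
# statistically significant? Placeholder numbers, not ICILS values;
# real comparisons use design-based standard errors.
from math import sqrt, erf

def z_test(est_a, se_a, est_b, se_b):
    """Return the z statistic and two-sided p-value for est_a - est_b."""
    z = (est_a - est_b) / sqrt(se_a**2 + se_b**2)
    p = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided normal tail
    return z, p

# Hypothetical: U.S. estimate of 50 percent (SE 2.0) vs. another
# system's estimate of 72 percent (SE 1.5).
z, p = z_test(50.0, 2.0, 72.0, 1.5)
print(f"z = {z:.2f}, p = {p:.4f}")  # p < .05 means significantly different
```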


For more information on teachers and technology, check out NCES’s ICILS 2018 website, the international ICILS website, and the earlier NCES blog “New Study on U.S. Eighth-Grade Students’ Computer Literacy.”

 

By Amy Rathbun, AIR, and Stephen Provasnik, NCES

 


[1] The ICILS average is the average of all participating education systems meeting international technical standards, with each education system weighted equally. The United States did not meet the guidelines for a sample participation rate of 85 percent, so it is not included in the international average.
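To make “weighted equally” concrete, here is a minimal sketch of how such an average is formed: each qualifying education system contributes a single estimate, and the estimates are averaged without regard to enrollment size. The system names below are real ICILS participants, but the percentages are placeholders for illustration, not actual ICILS estimates.

```python
# Minimal sketch of an equally weighted cross-system average.
# Percentages are placeholders, not actual ICILS estimates.
system_estimates = {
    "Chile": 76.0,
    "Denmark": 80.0,
    "Finland": 47.0,
    "Portugal": 78.0,
}

# Each system counts once, regardless of its enrollment; systems that
# did not meet technical standards (e.g., the United States in 2018)
# are simply excluded from the dictionary.
equally_weighted_average = sum(system_estimates.values()) / len(system_estimates)
print(f"Equally weighted average: {equally_weighted_average:.0f} percent")
```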

 

Reference

Fraillon, J., Ainley, J., Schulz, W., Friedman, T., and Duckworth, D. (2019). Preparing for Life in a Digital World: IEA International Computer and Information Literacy Study 2018 International Report. Amsterdam: International Association for the Evaluation of Educational Achievement.

New Education Data from the Household Pulse Survey

Recognizing the extraordinary information needs of policymakers during the coronavirus pandemic, NCES joined a partnership with the Census Bureau and four other federal statistical agencies to quickly develop a survey to gather key indicators of our nation’s response to the global pandemic. Development of the experimental 2020 Household Pulse Survey began on March 23, 2020, and data collection began on April 23, 2020. The survey provides weekly national and state estimates, which are released to the public in tabular formats one week after the end of data collection.

The Household Pulse Survey gathers information from adults about employment status, spending patterns, food security, housing, physical and mental health, access to health care, and educational disruption. The education component includes questions about the following:

  • The weekly time spent on educational activities by students in public and private elementary and secondary schools
  • The availability of computer equipment and the Internet for instructional purposes
  • The extent to which computer equipment and the Internet for students were provided or subsidized

Because this survey is designed to represent adults 18 years old and over, responses to the education questions concern students within those adults’ households; the estimates are percentages of adults, not percentages of students.

In the Household Pulse Survey during the weeks of April 23 through May 5, adults reported spending an average of 13.1 hours per week on teaching activities with elementary and secondary students in their household. These results differed by educational attainment: adults who had not completed high school reported a weekly average of 9.9 hours of teaching activities with children, whereas adults with a bachelor’s or higher degree reported 13.9 hours (figure 1). Adults reported a lower average, 4.1 hours per week, of live virtual contact between students in their household and their teachers.



Adults’ reports about the school instruction model need to be interpreted carefully because respondents could choose multiple types of approaches. A higher percentage of adults with a bachelor’s or higher degree (84 percent) reported that classes for elementary and secondary students in their household had moved to a format using online resources than did adults who had completed some college or an associate’s degree (74 percent), adults who had completed only high school (64 percent), or adults who had not completed high school (57 percent).

Adults with higher levels of education were more likely than adults with lower levels of education to report that computers and the Internet were always available for educational purposes for elementary and secondary students in their households (figure 2).



A higher percentage of adults who had not completed high school (44 percent) than of adults with a bachelor’s or higher degree (33 percent) reported that the school district provided a computer or digital device for children in their households to use at home for educational purposes. Similarly, a higher percentage of adults who had not completed high school than of adults with higher levels of educational attainment reported receiving financial assistance for student Internet access.

It is important to note that the speed of survey development and the pace of data collection have led to policies and procedures for the experimental Household Pulse Survey that are not always consistent with traditional federal survey operations. Data should be interpreted with appropriate caution.

More information on the Household Pulse Survey, detailed statistical tables, and microdata sets are available at https://www.census.gov/householdpulsedata. The Household Pulse Survey site includes breakouts of the data by other characteristics, such as race/ethnicity. In addition to participating in the development of this new survey, NCES has also generated new analyses based on existing data that respond to new needs for policy information, such as the availability of the Internet for student learning.
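For readers who want to work with the microdata directly, below is a hedged sketch of computing a person-weighted average from a Household Pulse Survey public-use file. The file name and the column names (PWEIGHT, EEDUC, TCH_HRS) are assumptions for illustration only; the actual variable names and missing-value codes are documented in the data dictionary released with each week’s file.

```python
# Hedged sketch: person-weighted mean of weekly teaching hours by
# educational attainment from Household Pulse Survey microdata.
# File and column names are illustrative assumptions; check the data
# dictionary for the actual variable names and missing-value codes.
import pandas as pd

df = pd.read_csv("pulse2020_puf_week1.csv")  # hypothetical file name

# Drop records with negative sentinel values (often used for missing data).
valid = df[df["TCH_HRS"] >= 0]

# Weighted mean of teaching hours within each educational-attainment group.
weighted_hours = valid.groupby("EEDUC").apply(
    lambda g: (g["TCH_HRS"] * g["PWEIGHT"]).sum() / g["PWEIGHT"].sum()
)
print(weighted_hours)
```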

 

By Xiaolei Wang, AIR

An Evidence-Based Response to COVID-19: What We’re Learning

Several weeks ago, I announced the What Works Clearinghouse’s™ first-ever rapid evidence synthesis project: a quick look at “what works” in distance education. I asked families and educators to send us their questions about how to adapt to learning at home, from early childhood to adult basic education. I posed a different challenge to researchers and technologists, asking them to nominate high-quality studies of distance and online learning that could begin to answer those questions.

Between public nominations and our own databases, we’ve now surfaced more than 900 studies. I was happy to see that the full text of about 300 studies was already available in ERIC, our own bibliographic database—and that many submitters whose work isn’t yet found there pledged to submit to ERIC, making sure it will be freely available to the public in the future. I was a little less happy to learn that only a few dozen of those 900 had already been reviewed by the WWC. This could mean either that (1) there is not a lot of rigorous research on distance learning, or (2) rigorous research exists, but we are systematically missing it. The truth is probably “both-and,” not “either-or.” Rigorous research exists, but more is needed … and the WWC needs to be more planful in capturing it.

The next step for the WWC team is to screen nominated studies to see which are likely to meet our evidence standards. As I’ve said elsewhere, we’ll be lucky if a small fraction—maybe 50—do. Full WWC reviews of the most actionable studies among them will be posted to the WWC website by June 1st, and at that time it is my hope that meta-analysts and technical assistance providers from across the country will pitch in to create the products teachers and families desperately need. (Are you a researcher or content producer who wants to join that effort? If so, email me at matthew.soldner@ed.gov.)

Whether this approach actually works is an open question. Will it reduce the time it takes to create products that are both useful and used? All told, our time on the effort will amount to about two months. I had begun this process hoping for something even quicker. My early thinking was that IES would only put out a call for studies, leaving study reviews and product development to individual research teams. My team was convinced, however, that the value of a full WWC review for studies outweighed the potential benefit of quicker products. They were, of course, correct: IES’ comparative advantage stems from our commitment to quality and rigor.

I am willing to stipulate that these are unusual times: the WWC’s evidence synthesis infrastructure hasn’t typically needed to turn on a dime, and I hope that continues to be the case. That said, there may be lessons to be learned from this moment, about both how the WWC does its own work and how it supports the work of the field. To that end, I’d offer a few thoughts.

The WWC could support partners in research and content creation who can act nimbly, maintaining pressure for rigorous work.

Educators have questions that span every facet of their work, every subject, and every age band. And there’s a lot of education research out there, from complex, multi-site RCTs to small, qualitative case studies. The WWC doesn’t have the capacity to either answer every question that deserves answering or synthesize every study we’re interested in synthesizing. (Not to mention the many types of studies we don’t have good methods for synthesizing today.)

This suggests to me there is a potential market for researchers and technical assistance providers who can quickly identify high-quality evidence, accurately synthesize it, and create educator-facing materials that can make a difference in classroom practice. Some folks have begun to fill the gap, including both familiar faces and not-so-familiar ones. Opportunities for collaboration abound, and partners like these can be sources of inspiration and innovation for one another and for the WWC. Where there are gaps in our understanding of how to do this work well that can be filled through systematic inquiry, IES can offer financial support via our Statistical and Research Methodology in Education grant program.   

The WWC could consider adding new products to its mix, including rigorous rapid evidence syntheses.

Anyone who has visited us at whatworks.ed.gov recently knows the WWC offers two types of syntheses: Intervention Reports and Practice Guides. Neither is meant to be a quick-turnaround product.

As their name implies, Intervention Reports are systematic reviews of a single, typically brand-name, intervention. They are fairly short, no longer than 15 pages. And they don’t take too long to produce, since they’re focused on a single product. Despite having done nearly 600 of them, we often hear we haven’t reviewed the specific product a stakeholder reports needing information on. Similarly, we often hear from stakeholders that they aren’t in a position to buy a product. Instead, they’re looking for the “secret sauce” they could use in their state, district, building, or classroom.

Practice Guides are our effort to identify generalizable practices across programs and products that can make a difference in student outcomes. Educators download our most popular Guides tens of thousands of times a year, and they are easily the best thing we create. But it is fair to say they are labors of love. Each Guide is the product of the hard work of researchers, practitioners, and other subject matter experts over about 18 months.  

Something seems to be missing from our product mix. What could the WWC produce that is as useful as a Practice Guide but as lean as an Intervention Report? 

Our very wise colleagues at the UK’s Education Endowment Foundation have a model that is potentially promising: Rapid Evidence Assessments based on pre-existing meta-analyses. I am particularly excited about their work because—despite not coordinating our efforts—they are also focusing on Distance Learning and released a rapid assessment on the topic on April 22nd. There are pluses and minuses to their approach, and they do not share our requirement for rigorous peer review. But there is certainly something to be learned from how they do their work.

The WWC could expand its “what works” remit to include “what’s innovative,” adding forward-looking horizon scanning to here-and-now (and sometimes yesterday) meta-analysis.

Meta-analyses play a critical role in efforts to bring evidence to persistent problems of practice, helping to sort through multiple, sometimes conflicting studies to yield a robust estimate of whether an intervention works. The inputs to any meta-analysis are what is already known—or at least what has already been published—about programs, practices, and policies. They are therefore backward-looking by design. Given how slowly most things change in education, that is typically fine.

But what help is meta-analysis when a problem is novel, or when the best solution isn’t a well-studied intervention but instead a new innovation? In these cases, practitioners are craving evidence before it has been synthesized and, sometimes, before it has even been generated. Present experience demonstrates that any of us can be made to grasp for anything that even smacks of evidence, if the circumstances are precarious enough. The challenge to an organization like the WWC, which relies on traditional conceptions of rigorous evidence of efficacy and effectiveness, is a serious one.

How might the WWC become aware of potentially promising solutions to today’s problems before much if anything is known about their efficacy, and how might we surface those problems that are nascent today but could explode across the landscape tomorrow? 

One model I’m intensely interested in is the Health Care Horizon Scanning System at PCORI. In their words, it “provides a systematic process to identify healthcare interventions that have a high potential to alter the standard of care.” Adapted to the WWC use case, this sort of system would alert us to novel solutions: practices that merited monitoring and might cause us to build and/or share early evidence broadly to relevant stakeholders. This same approach could surface innovations designed to solve novel problems that weren’t already the subject of multiple research efforts and well-represented in the literature. We’d be ahead of—or at least tracking alongside—the curve, not behind.  

Wrapping Up

The WWC’s current Rapid Evidence Synthesis focused on distance learning is an experiment of sorts. It represents a new way of interacting with our key stakeholders, a new way to gather evidence, and a new way to see our reviews synthesized into products that can improve practice. To the extent that it has pushed us to try new models and has identified hundreds of “new” (or “new to us”) studies, it is already a success. Of course, we still hope for more.

As I hope you can see from this blog, it has also spurred us to consider other ways we can further strengthen an already strong program. I welcome your thoughts and feedback – just email me at matthew.soldner@ed.gov.