IES Blog

Institute of Education Sciences

Teaching with Technology: U.S. Teachers’ Perceptions and Use of Digital Technology in an International Context

The coronavirus pandemic forced teachers across the world to immediately transition instruction to a virtual setting in early 2020. To understand U.S. teachers’ level of preparedness for this shift in an international context, this blog examines recent international data from U.S. teachers’ responses to questions on the following topics:

  • Their perceptions of information and communications technologies (ICT) resources
  • Their use of ICT for instruction prior to the pandemic

In general, the results suggest that U.S. teachers have more ICT resources than their international peers and use ICT at school when teaching at a similar frequency.

 

Teachers’ perceptions of ICT resources at their school

The quantity and quality of ICT resources available in school systems prior to the coronavirus pandemic may impact teachers’ access to such resources for instructional purposes while classrooms are functioning in a virtual format. The United States participated in the 2018 International Computer and Information Literacy Study (ICILS), which asked questions about ICT resources to a nationally representative sample of eighth-grade teachers from 14 education systems.

The results from this study show that 86 percent of eighth-grade teachers both in the United States and across ICILS 2018 education systems “strongly agreed” or “agreed” that ICT is considered a priority for use in teaching (figure 1). Compared with the ICILS 2018 averages,[1] higher percentages of U.S. eighth-grade teachers “strongly agreed” or “agreed” with various statements about the use of ICT.

While 86 percent of U.S. eighth-grade teachers “strongly agreed” or “agreed” that “ICT is considered a priority for use in teaching,” only 61 percent “strongly agreed” or “agreed” that “there is sufficient opportunity for me to develop expertise in ICT” (figure 1). Additionally, 62 percent of U.S. eighth-grade teachers “strongly agreed” or “agreed” that “there is enough time to prepare lessons that incorporate ICT.” These disparities may have had an impact on teacher capacity during the sudden shift to 100 percent online learning as a result of the coronavirus pandemic, which would be a good topic for future research and analyses.  


Figure 1. Percentage of eighth-grade teachers who reported that they “strongly agree” or “agree” with statements about using ICT in teaching at school, by statement: 2018

*p < .05. Significantly different from the U.S. estimate at the .05 level of statistical significance.

¹ Did not meet the guidelines for a sample participation rate of 85 percent and not included in the international average.
² National Defined Population covers 90 to 95 percent of National Target Population.
NOTE: ICT = information and communications technologies. The ICILS 2018 average is the average of all participating education systems meeting international technical standards, with each education system weighted equally. Statements are ordered by the percentages of U.S. teachers reporting “strongly agree” or “agree” from largest to smallest.
SOURCE: International Association for the Evaluation of Educational Achievement (IEA), The International Computer and Information Literacy Study (ICILS), 2018. Modified reproduction of figure 17 from U.S. Results from the 2018 ICILS Web Report.


Teachers’ perceptions of the use of ICT for instruction

Teachers’ views on the role of ICT in virtual instruction during the coronavirus pandemic are not yet clear. However, in 2018, when instruction was conducted in physical classrooms, most U.S. eighth-grade teachers participating in ICILS expressed positive perceptions about “using ICT in teaching and learning at school,” as did many teachers internationally.

Among eighth-grade teachers in the United States, 95 percent agreed that ICT “enables students to access better sources of information,” 92 percent agreed that ICT “helps students develop greater interest in learning,” and 92 percent agreed that ICT “helps students work at a level appropriate to their learning needs.” On average across other education systems participating in ICILS, at least 85 percent of teachers agreed with each of these statements (Fraillon et al. 2019).

Seventy-five percent of U.S. eighth-grade teachers in 2018 agreed that ICT “improves academic performance of students,” which was higher than the ICILS international average of 71 percent. The percentages of teachers who agreed with this statement varied across education systems, from three-quarters or more of teachers in Chile, Denmark, Kazakhstan, and Portugal to less than half of teachers in Finland and North Rhine-Westphalia (Germany).

 

Frequency of teachers’ use of ICT

Teachers’ reported use of ICT for instruction in physical classroom settings may provide insight into their level of experience as they transition to virtual settings during the coronavirus pandemic.

In 2018, half of U.S. eighth-grade teachers reported “using ICT at school when teaching” every day, which was not significantly different from the ICILS average of 48 percent. However, the U.S. percentage was lower than the percentages of teachers in Moscow (76 percent), Denmark (72 percent), and Finland (57 percent) (figure 2).


Figure 2. Percentage of eighth-grade teachers who reported using ICT at school every day when teaching, by education system: 2018

*p < .05. Significantly different from the U.S. estimate at the .05 level of statistical significance.
¹ Met guidelines for sample participation rates only after replacement schools were included.
² National Defined Population covers 90 to 95 percent of National Target Population.
³ Did not meet the guidelines for a sample participation rate of 85 percent and not included in the international average.
⁴ Data collected at the beginning of the school year.
NOTE: ICT = information and communications technologies. The ICILS 2018 average is the average of all participating education systems meeting international technical standards, with each education system weighted equally. Education systems are ordered by their percentages of teachers reporting using ICT at school when teaching from largest to smallest. Italics indicate the benchmarking participants.
SOURCE: International Association for the Evaluation of Educational Achievement (IEA), The International Computer and Information Literacy Study (ICILS), 2018. Modified reproduction of figure 15 from U.S. Results from the 2018 ICILS Web Report.


For more information on teachers and technology, check out NCES’s ICILS 2018 website, the international ICILS website, and the earlier NCES blog “New Study on U.S. Eighth-Grade Students’ Computer Literacy.”

 

By Amy Rathbun, AIR, and Stephen Provasnik, NCES

 


[1] The ICILS average is the average of all participating education systems meeting international technical standards, with each education system weighted equally. The United States did not meet the guidelines for a sample participation rate of 85 percent, so it is not included in the international average.
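The equal weighting described in the footnote above can be sketched in a few lines of Python. The percentages below are illustrative placeholders, not actual ICILS 2018 estimates; the point is only that each qualifying education system contributes one equally weighted value, and systems that did not meet participation guidelines are excluded.

```python
# Equally weighted international average: each education system contributes
# one value; systems that did not meet participation guidelines (such as
# the United States in ICILS 2018) are excluded from the average.

# Illustrative placeholder percentages, NOT actual ICILS 2018 estimates.
agreement_pct = {
    "System A": 90.0,
    "System B": 82.0,
    "System C": 88.0,
    "United States": 86.0,  # excluded: participation rate below 85 percent
}
excluded = {"United States"}

included = [v for name, v in agreement_pct.items() if name not in excluded]
icils_average = sum(included) / len(included)  # each system weighted equally
print(round(icils_average, 1))
```

Note that because systems are weighted equally, a small system counts exactly as much toward the average as a large one.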

 

Reference

Fraillon, J., Ainley, J., Schulz, W., Friedman, T., and Duckworth, D. (2019). Preparing for Life in a Digital World: IEA International Computer and Information Literacy Study 2018 International Report. Amsterdam: International Association for the Evaluation of Educational Achievement.

New Education Data from the Household Pulse Survey

Recognizing the extraordinary information needs of policymakers during the coronavirus pandemic, NCES partnered with the Census Bureau and four other federal statistical agencies to quickly develop a survey gathering key indicators of the nation’s response to the global pandemic. The resulting experimental 2020 Household Pulse Survey began development on March 23, 2020, and data collection began on April 23, 2020. The new survey provides weekly national and state estimates, which are released to the public in tabular formats one week after the end of data collection.

The Household Pulse Survey gathers information from adults about employment status, spending patterns, food security, housing, physical and mental health, access to health care, and educational disruption. The education component includes questions about the following:

  • The weekly time spent on educational activities by students in public and private elementary and secondary schools
  • The availability of computer equipment and the Internet for instructional purposes
  • The extent to which computer equipment and the Internet for students were provided or subsidized

Because this survey is designed to represent adults 18 years old and over, responses to the education questions describe students living in those adults’ households, not a representative sample of students themselves.

In the Household Pulse Survey during the weeks of April 23 through May 5, adults reported that their average weekly time spent on teaching activities with elementary and secondary students in their household was 13.1 hours. These results differed by educational attainment: adults who had not completed high school reported a weekly average of 9.9 hours in teaching activities with children, whereas adults with a bachelor’s or higher degree reported 13.9 hours (figure 1). In terms of the average weekly time spent on live virtual contact between students in their household and their teachers, adults reported a lower average of 4.1 hours.



Adults’ reports about the school instruction model need to be interpreted carefully because respondents could choose multiple types of approaches. A higher percentage of adults with a bachelor’s or higher degree (84 percent) reported that classes for elementary and secondary students in their household had moved to a format using online resources than did adults who had completed some college or an associate’s degree (74 percent), adults who had completed only high school (64 percent), or adults who had not completed high school (57 percent).

Adults with higher levels of education were more likely than adults with lower levels of education to report that computers and the Internet were always available for educational purposes for elementary and secondary students in their households (figure 2).



The percentage of adults who reported that the school district provided a computer or digital device for children in their households to use at home for educational purposes was higher for adults who had not completed high school (44 percent) than for adults with a bachelor’s or higher degree (33 percent). Similarly, a higher percentage of adults who had not completed high school than of adults with higher levels of educational attainment reported receiving financial assistance for student Internet access.

It is important to note that the speed of the survey development and the pace of the data collection efforts have led to policies and procedures for the experimental Household Pulse Survey that are not always consistent with traditional federal survey operations. Data should be interpreted with proper caution.  

More information on the Household Pulse Survey, detailed statistical tables, and microdata sets are available at https://www.census.gov/householdpulsedata. The Household Pulse Survey site includes breakouts of the data by other characteristics, such as race/ethnicity. In addition to participating in the development of this new survey, NCES has also generated new analyses based on existing data that respond to new needs for policy information, such as the availability of the Internet for student learning.

 

By Xiaolei Wang, AIR

An Evidence-Based Response to COVID-19: What We’re Learning

Several weeks ago, I announced the What Works Clearinghouse’s™ first-ever rapid evidence synthesis project: a quick look at “what works” in distance education. I asked families and educators to send us their questions about how to adapt to learning at home, from early childhood to adult basic education. I posed a different challenge to researchers and technologists, asking them to nominate high-quality studies of distance and online learning that could begin to answer those questions.

Between public nominations and our own databases, we’ve now surfaced more than 900 studies. I was happy to see that the full text of about 300 studies was already available in ERIC, our own bibliographic database—and that many submitters whose work isn’t yet found there pledged to submit it to ERIC, making sure it will be freely available to the public in the future. I was a little less happy to learn that only a few dozen of those 900 had already been reviewed by the WWC. This could mean either that (1) there is not a lot of rigorous research on distance learning, or (2) rigorous research exists, but we are systematically missing it. The truth is probably “both-and,” not “either-or.” Rigorous research exists, but more is needed … and the WWC needs to be more planful in capturing it.

The next step for the WWC team is to screen nominated studies to see which are likely to meet our evidence standards. As I’ve said elsewhere, we’ll be lucky if a small fraction—maybe 50—do. Full WWC reviews of the most actionable studies among them will be posted to the WWC website by June 1st, and at that time it is my hope that meta-analysts and technical assistance providers from across the country pitch in to create the products teachers and families desperately need. (Are you a researcher or content producer who wants to join that effort? If so, email me at matthew.soldner@ed.gov.)

Whether this approach actually works is an open question. Will it reduce the time it takes to create products that are both useful and used? All told, our time on the effort will amount to about two months. I had begun this process hoping for something even quicker. My early thinking was that IES would only put out a call for studies, leaving study reviews and product development to individual research teams. My team was convinced, however, that the value of a full WWC review for studies outweighed the potential benefit of quicker products. They were, of course, correct: IES’ comparative advantage stems from our commitment to quality and rigor.

I am willing to stipulate that these are unusual times: the WWC’s evidence synthesis infrastructure hasn’t typically needed to turn on a dime, and I hope that continues to be the case. That said, there may be lessons to be learned from this moment, about both how the WWC does its own work and how it supports the work of the field. To that end, I’d offer a few thoughts.

The WWC could support partners in research and content creation who can act nimbly, maintaining pressure for rigorous work.

Educators have questions that span every facet of their work, every subject, and every age band. And there’s a lot of education research out there, from complex, multi-site RCTs to small, qualitative case studies. The WWC doesn’t have the capacity to either answer every question that deserves answering or synthesize every study we’re interested in synthesizing. (Not to mention the many types of studies we don’t have good methods for synthesizing today.)

This suggests to me there is a potential market for researchers and technical assistance providers who can quickly identify high-quality evidence, accurately synthesize it, and create educator-facing materials that can make a difference in classroom practice. Some folks have begun to fill the gap, including both familiar faces and not-so-familiar ones. Opportunities for collaboration abound, and partners like these can be sources of inspiration and innovation for one another and for the WWC. Where there are gaps in our understanding of how to do this work well that can be filled through systematic inquiry, IES can offer financial support via our Statistical and Research Methodology in Education grant program.   

The WWC could consider adding new products to its mix, including rigorous rapid evidence syntheses.

Anyone who has visited us at whatworks.ed.gov recently knows the WWC offers two types of syntheses: Intervention Reports and Practice Guides. Neither is meant to be a quick-turnaround product.

As their name implies, Intervention Reports are systematic reviews of a single, typically brand-name, intervention. They are fairly short, no longer than 15 pages. And they don’t take too long to produce, since they’re focused on a single product. Despite our having produced nearly 600 of them, we often hear that we haven’t reviewed the specific product a stakeholder needs information about. Similarly, we often hear from stakeholders that they aren’t in a position to buy a product. Instead, they’re looking for the “secret sauce” they could use in their state, district, building, or classroom.

Practice Guides are our effort to identify generalizable practices across programs and products that can make a difference in student outcomes. Educators download our most popular Guides tens of thousands of times a year, and they are easily the best thing we create. But it is fair to say they are labors of love. Each Guide is the product of the hard work of researchers, practitioners, and other subject matter experts over about 18 months.  

Something seems to be missing from our product mix. What could the WWC produce that is as useful as a Practice Guide but as lean as an Intervention Report? 

Our very wise colleagues at the UK’s Education Endowment Foundation have a potentially promising model: Rapid Evidence Assessments based on pre-existing meta-analyses. I am particularly excited about their work because—despite not coordinating our efforts—they are also focusing on distance learning and released a rapid assessment on the topic on April 22nd. There are pluses and minuses to their approach, and they do not share our requirement for rigorous peer review. But there is certainly something to be learned from how they do their work.

The WWC could expand its “what works” remit to include “what’s innovative,” adding forward-looking horizon scanning to here-and-now (and sometimes yesterday) meta-analysis.

Meta-analyses play a critical role in efforts to bring evidence to persistent problems of practice, helping to sort through multiple, sometimes conflicting studies to yield a robust estimate of whether an intervention works. The inputs to any meta-analysis are what is already known—or at least what has already been published—about programs, practices, and policies. They are therefore backward-looking by design. Given how slowly most things change in education, that is typically fine.

But what help is meta-analysis when a problem is novel, or when the best solution isn’t a well-studied intervention but instead a new innovation? In these cases, practitioners are craving evidence before it has been synthesized and, sometimes, before it has even been generated. Present experience demonstrates that any of us can be made to grasp for anything that even smacks of evidence, if the circumstances are precarious enough. The challenge to an organization like the WWC, which relies on traditional conceptions of rigorous evidence of efficacy and effectiveness, is a serious one.

How might the WWC become aware of potentially promising solutions to today’s problems before much if anything is known about their efficacy, and how might we surface those problems that are nascent today but could explode across the landscape tomorrow? 

One model I’m intensely interested in is the Health Care Horizon Scanning System at PCORI. In their words, it “provides a systematic process to identify healthcare interventions that have a high potential to alter the standard of care.” Adapted to the WWC use case, this sort of system would alert us to novel solutions: practices that merited monitoring and might cause us to build and/or share early evidence broadly to relevant stakeholders. This same approach could surface innovations designed to solve novel problems that weren’t already the subject of multiple research efforts and well-represented in the literature. We’d be ahead of—or at least tracking alongside—the curve, not behind.  

Wrapping Up

The WWC’s current Rapid Evidence Synthesis focused on distance learning is an experiment of sorts. It represents a new way of interacting with our key stakeholders, a new way to gather evidence, and a new way to see our reviews synthesized into products that can improve practice. To the extent that it has pushed us to try new models and has identified hundreds of “new” (or “new to us”) studies, it is already a success. Of course, we still hope for more.

As I hope you can see from this blog, it has also spurred us to consider other ways we can further strengthen an already strong program. I welcome your thoughts and feedback – just email me at matthew.soldner@ed.gov.

Seeking Your Help in Learning More About What Works in Distance Education: A Rapid Evidence Synthesis

Note: NCEE will continue to accept study nominations after the April 3rd deadline, adding them on a regular basis to our growing bibliography found here. Studies received before the deadline will be considered for the June 1 data release. NCEE will use studies received after the deadline to inform our prioritization of studies for review. Awareness of these studies will also allow NCEE to consider them for future activities related to distance and/or online education and remote learning.

In the midst of the coronavirus crisis, we know that families and educators are scrambling for high-quality information about what works in distance education—a term we use here to include both online learning and opportunities for students to use technology or other resources to learn while not physically at school.

Leaders in the education technology ecosystem have already begun to respond to the COVID-19 outbreak by creating websites like techforlearners.org, which as of today lists more than 400 online learning products, resources, and services. But too little information is widely available about what works in distance education to improve student outcomes.

If ever there is a time for citizen science, it is now. Starting today, the What Works Clearinghouse™ (WWC) at the U.S. Department of Education’s Institute of Education Sciences is announcing its first-ever cooperative rapid evidence synthesis.

Here is what we have in mind:

  • Between now and April 3rd, we are asking families and educators to share with us questions they have about effective distance education practices and products. We are particularly interested in questions about practices that seem especially relevant today, in which educators are called to adapt their instruction to online formats or send learning materials home to students, and families, not all of whom have internet access, seek to combine available technology with other resources to create a coherent learning experience for their students. Early education, elementary, postsecondary, and adult basic education practices and products are welcome. Submit all nominations to NCEE.Feedback@ed.gov.
  • During that same time, we are asking that members of the public, including researchers and technologists, nominate any rigorous research they are aware of or have conducted that evaluates the effectiveness of specific distance education practices or products on student outcomes. As above, early education, elementary, postsecondary, and adult basic education practices and products are welcome.
    • Submit all nominations to NCEE.Feedback@ed.gov. Nominations should include links to publicly available versions of studies wherever possible.
    • Study authors are strongly encouraged to nominate studies as described above and simultaneously submit them to ED’s online repository of education research, ERIC. Learn more about the ERIC submission process here.
    • We will post a link to a list of studies on this page and update it on a regular basis.
       
  • By June 1, certified WWC reviewers will have prioritized and screened as many nominated studies as resources allow. Based on the responses received from families, educators, researchers, and technologists, we may narrow the focus of our review; however, nominations will be posted to our website, even those we do not review. Reviews will be entered in the WWC’s Review of Individual Studies Database, which can be downloaded as a flat file.
     
  • After June 1, individual meta-analysts, research teams, or others can download screened studies from the WWC and begin their meta-analytic work. As researchers complete their syntheses, they should submit them through the ERIC online submission system and alert IES. Although we cannot review each analysis or endorse their findings, we will do our best to announce each new review via social media—amplifying your work to educators, families, and other interested stakeholders. Let me know at NCEE.Feedback@ed.gov if this part of the work is of interest to you or your colleagues.
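For readers planning meta-analytic work from the downloadable flat file mentioned above, a first screening pass might look like the sketch below. The file name and column names here are hypothetical, chosen only for illustration; check the actual WWC export for its real schema before relying on any field.

```python
import csv

def studies_meeting_standards(path):
    """Return rows whose rating indicates the study met WWC standards,
    with or without reservations.

    "Rating" is a HYPOTHETICAL column name used for illustration; the
    real WWC flat file may use a different schema.
    """
    keep = []
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            if row.get("Rating", "").startswith("Meets WWC standards"):
                keep.append(row)
    return keep
```

A team could then group the kept rows by intervention or outcome domain as the starting point for a synthesis.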

Will you help, joining the WWC’s effort to generate high-quality information about what works in distance education? If so, submit your study today, let me know you or your team are interested in lending your meta-analytic skills to the effort, or just provide feedback on how to make this work more effectively. You can reach me directly at matthew.soldner@ed.gov.

Matthew Soldner

Commissioner, National Center for Education Evaluation and Regional Assistance, and
Agency Evaluation Officer, U.S. Department of Education

IES at the Conference on Computing and Sustainable Societies

Over the summer, researchers, technologists, and policymakers gathered in Accra, Ghana for the Association for Computing Machinery’s Conference on Computing and Sustainable Societies (ACM COMPASS) to discuss the role of information technologies in international development.

Two IES-funded researchers from Carnegie Mellon University’s Program in Interdisciplinary Education Research, Michael Madaio and Dr. Amy Ogan, shared their research on developing voice-based early literacy technologies and evaluating their efficacy with low-literate, bilingual families in the Ivory Coast. 

Their research draws on methods from human-computer interaction, the learning sciences, and information and communication technologies for development to design educational technologies that are culturally and contextually appropriate.

Although the COMPASS conference focused on cross-cultural applications and technology for development, the research presented has implications for U.S. based education researchers, practitioners, and policymakers.

For instance, while research provides evidence for the importance of parental involvement in early literacy, parents with low literacy in the target language – as in many bilingual immigrant communities in the U.S. – may not be able to support their children with the explicit, instrumental help suggested by prior research (for example, letter naming or book reading). This suggests that there may be opportunities for technology to scaffold parental support among low-literate or English Learner (EL) families in other ways.

At the conference, researchers described interactive voice-based systems (known as “IVR”) that help low-literate users find out about crop yields, understand local government policies, and engage on social media.  

This body of work has implications for designers of learning technologies in the U.S. Many families may not have a smartphone, but basic feature phones are ubiquitous worldwide, including in low-income, immigrant communities in the U.S. Thus, designers of learning technologies may consider designing SMS- or voice-based (such as IVR) systems, while schools or school districts may consider how to use voice-based systems to engage low-literate or EL families who may not have a smartphone or who may not be able to read SMS information messages.
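To illustrate the design idea (not any specific product’s API), a voice menu of the kind an IVR system presents can be modeled as a small state machine keyed by keypad digits. Everything below – the menu structure, prompts, and function name – is a hypothetical sketch.

```python
# A minimal keypad-driven menu of the kind an IVR system presents to
# callers. All states, prompts, and menu options are hypothetical
# illustrations, not any real product's content.
MENU = {
    "root": {
        "prompt": "Press 1 for this week's reading activity, "
                  "2 for school announcements.",
        "1": "reading",
        "2": "announcements",
    },
    "reading": {"prompt": "This week: practice naming letters with your child."},
    "announcements": {"prompt": "School resumes Monday."},
}

def respond(state, digit):
    """Return the next menu state and its spoken prompt for a keypad digit,
    staying in place if the digit is not a valid option."""
    next_state = MENU[state].get(digit, state)  # ignore unrecognized digits
    return next_state, MENU[next_state]["prompt"]
```

In a deployed system, the prompts would be recorded or synthesized audio in the family’s home language, which is what makes the approach accessible to callers who cannot read SMS text.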

In a rapidly changing, increasingly globalized world, research at IES may benefit from increased engagement with international research, both work focused specifically on education and information technology research with implications for educational research, practice, and policy.

This guest blog was written by Michael Madaio. He is an IES Predoctoral Fellow in the Program in Interdisciplinary Education Research at Carnegie Mellon University. He is placed in the Human-Computer Interaction Institute.