IES Blog

Institute of Education Sciences

An Evidence-Based Response to COVID-19: What We’re Learning

Several weeks ago, I announced the What Works Clearinghouse’s™ first-ever rapid evidence synthesis project: a quick look at “what works” in distance education. I asked families and educators to send us their questions about how to adapt to learning at home, from early childhood to adult basic education. I posed a different challenge to researchers and technologists, asking them to nominate high-quality studies of distance and online learning that could begin to answer those questions.

Between public nominations and our own databases, we’ve now surfaced more than 900 studies. I was happy to see that the full text of about 300 studies was already available in ERIC, our own bibliographic database—and that many submitters whose work isn’t yet found there pledged to submit to ERIC, making sure it will be freely available to the public in the future. I was a little less happy to learn that only a few dozen of those 900 had already been reviewed by the WWC. This could mean either that (1) there is not a lot of rigorous research on distance learning, or (2) rigorous research exists, but we are systematically missing it. The truth is probably “both-and,” not “either-or.” Rigorous research exists, but more is needed … and the WWC needs to be more planful in capturing it.

The next step for the WWC team is to screen nominated studies to see which are likely to meet our evidence standards. As I’ve said elsewhere, we’ll be lucky if a small fraction—maybe 50—do. Full WWC reviews of the most actionable studies among them will be posted to the WWC website by June 1st, and at that time it is my hope that meta-analysts and technical assistance providers from across the country pitch in to create the products teachers and families desperately need. (Are you a researcher or content producer who wants to join that effort? If so, email me at matthew.soldner@ed.gov.)

Whether this approach actually works is an open question. Will it reduce the time it takes to create products that are both useful and used? All told, our time on the effort will amount to about two months. I had begun this process hoping for something even quicker. My early thinking was that IES would only put out a call for studies, leaving study reviews and product development to individual research teams. My team was convinced, however, that the value of a full WWC review for studies outweighed the potential benefit of quicker products. They were, of course, correct: IES’ comparative advantage stems from our commitment to quality and rigor.

I am willing to stipulate that these are unusual times: the WWC’s evidence synthesis infrastructure hasn’t typically needed to turn on a dime, and I hope that continues to be the case. That said, there may be lessons to be learned from this moment, about both how the WWC does its own work and how it supports the work of the field. To that end, I’d offer a few thoughts.

The WWC could support partners in research and content creation who can act nimbly, maintaining pressure for rigorous work.

Educators have questions that span every facet of their work, every subject, and every age band. And there’s a lot of education research out there, from complex, multi-site RCTs to small, qualitative case studies. The WWC doesn’t have the capacity to either answer every question that deserves answering or synthesize every study we’re interested in synthesizing. (Not to mention the many types of studies we don’t have good methods for synthesizing today.)

This suggests to me there is a potential market for researchers and technical assistance providers who can quickly identify high-quality evidence, accurately synthesize it, and create educator-facing materials that can make a difference in classroom practice. Some folks have begun to fill the gap, including both familiar faces and not-so-familiar ones. Opportunities for collaboration abound, and partners like these can be sources of inspiration and innovation for one another and for the WWC. Where there are gaps in our understanding of how to do this work well that can be filled through systematic inquiry, IES can offer financial support via our Statistical and Research Methodology in Education grant program.   

The WWC could consider adding new products to its mix, including rigorous rapid evidence syntheses.

Anyone who has visited us at whatworks.ed.gov recently knows the WWC offers two types of syntheses: Intervention Reports and Practice Guides. Neither is meant to be a quick-turnaround product.

As their name implies, Intervention Reports are systematic reviews of a single, typically brand-name, intervention. They are fairly short, no longer than 15 pages. And they don’t take too long to produce, since they’re focused on a single product. Despite our having produced nearly 600 of them, we often hear that we haven’t reviewed the specific product a stakeholder needs information about. Similarly, we often hear from stakeholders that they aren’t in a position to buy a product. Instead, they’re looking for the “secret sauce” they could use in their state, district, building, or classroom.

Practice Guides are our effort to identify generalizable practices across programs and products that can make a difference in student outcomes. Educators download our most popular Guides tens of thousands of times a year, and they are easily the best thing we create. But it is fair to say they are labors of love. Each Guide is the product of the hard work of researchers, practitioners, and other subject matter experts over about 18 months.  

Something seems to be missing from our product mix. What could the WWC produce that is as useful as a Practice Guide but as lean as an Intervention Report? 

Our very wise colleagues at the UK’s Education Endowment Foundation have a model that is potentially promising: Rapid Evidence Assessments based on pre-existing meta-analyses. I am particularly excited about their work because—despite not coordinating our efforts—they are also focusing on Distance Learning and released a rapid assessment on the topic on April 22nd. There are pluses and minuses to their approach, and they do not share our requirement for rigorous peer review. But there is certainly something to be learned from how they do their work.

The WWC could expand its “what works” remit to include “what’s innovative,” adding forward-looking horizon scanning to here-and-now (and sometimes yesterday) meta-analysis.

Meta-analyses play a critical role in efforts to bring evidence to persistent problems of practice, helping to sort through multiple, sometimes conflicting studies to yield a robust estimate of whether an intervention works. The inputs to any meta-analysis are what is already known—or at least what has already been published—about programs, practices, and policies. They are therefore backward-looking by design. Given how slowly most things change in education, that is typically fine.

But what help is meta-analysis when a problem is novel, or when the best solution isn’t a well-studied intervention but instead a new innovation? In these cases, practitioners are craving evidence before it has been synthesized and, sometimes, before it has even been generated. Present experience demonstrates that any of us can be made to grasp for anything that even smacks of evidence, if the circumstances are precarious enough. The challenge to an organization like the WWC, which relies on traditional conceptions of rigorous evidence of efficacy and effectiveness, is a serious one.

How might the WWC become aware of potentially promising solutions to today’s problems before much if anything is known about their efficacy, and how might we surface those problems that are nascent today but could explode across the landscape tomorrow? 

One model I’m intensely interested in is the Health Care Horizon Scanning System at PCORI. In their words, it “provides a systematic process to identify healthcare interventions that have a high potential to alter the standard of care.” Adapted to the WWC use case, this sort of system would alert us to novel solutions: practices that merited monitoring and might prompt us to build early evidence and/or share it broadly with relevant stakeholders. This same approach could surface innovations designed to solve novel problems that weren’t already the subject of multiple research efforts and well represented in the literature. We’d be ahead of—or at least tracking alongside—the curve, not behind.

Wrapping Up

The WWC’s current Rapid Evidence Synthesis focused on distance learning is an experiment of sorts. It represents a new way of interacting with our key stakeholders, a new way to gather evidence, and a new way to see our reviews synthesized into products that can improve practice. To the extent that it has pushed us to try new models and has identified hundreds of “new” (or “new to us”) studies, it is already a success. Of course, we still hope for more.

As I hope you can see from this blog, it has also spurred us to consider other ways we can further strengthen an already strong program. I welcome your thoughts and feedback – just email me at matthew.soldner@ed.gov.

New Remote Learning Resources from the REL Program

In response to COVID-19, the 10 Regional Educational Laboratories (RELs) have collaborated to produce a series of evidence-based resources and guidance about teaching and learning in a remote environment, as well as other considerations related to the pandemic. See below for a roundup of upcoming REL events and recently published resources on this topic.

 

Upcoming Webinars

Friday, April 24: Refining Your Distance Learning Strategies Using a Data-Driven Approach: The Evidence to Insights Coach: REL Mid-Atlantic will discuss a free tool that districts and schools can use to test and identify—in real time—which online learning approaches work best for their own students. The webinar will discuss what you’ll need to make the tool work for you and how you can be strategic about using existing data.
Audience: State leaders, district leaders, school boards, school leaders

Wednesday, April 29: Strategies for Districts to Support Self-Care for Educators During the COVID-19 Pandemic: REL West, the Region 15 Comprehensive Center, and the National Center to Improve Social & Emotional Learning and School Safety, will offer practical information and guidance backed by research to help school staff cope with the stresses of school closures, service provision, and quarantine due to the COVID-19 pandemic.
Audience: District leaders

New Resources

How can Districts Promote a Safe and Secure Digital Learning Environment?
FAQ | REL West
Audience: District leaders, school leaders, teachers

Research-Based Resources, Considerations, and Strategies for Remote Learning
Webinar recording | REL Midwest
Audience: School leaders, instructional coaches, teachers

Resources for Schools and Districts Responding to the COVID-19 Crisis
Topics covered on this page include providing high-quality instruction to English learners in an online environment, engaging families to support student learning, alternative approaches to graduation ceremonies, and more.
FAQs | REL Northeast & Islands
Audience: District leaders, school leaders, teachers

Strategies to Support Learning Along a Continuum of Internet Access [PDF]
Fact sheet | REL Central
Audience: District leaders, school leaders

Supporting Your Child’s Reading at Home: Kindergarten and First Grade
Videos and activities | REL Southeast
Audience: Families, caregivers

Using Transparency to Create Accountability When School Buildings Are Closed and Tests Are Cancelled
Blog | REL Mid-Atlantic
Audience: District leaders, school leaders

Responding to COVID-19 in Education: ED/IES and Government-Supported Developers Offer Virtual Resources and Activities for Distance Learning

We recently posted this blog listing more than 80 learning games and technologies that are available at no cost until the end of the school year in response to the closure of schools due to the COVID-19 crisis. The resources were created by education technology developers with support from the Small Business Innovation Research (SBIR) Programs at ED/IES and other agencies, as well as through programs at IES and across government. In recent weeks, more than 100,000 teachers and students around the country have accessed these learning technologies at a distance.

Today, we are sharing more resources and activities that this group of developers is making available to the education community in response to COVID-19.

A Series of Day-Long Virtual Unconferences

Over the coming weeks, developers are hosting a series of free virtual “Unconferences” on different topics for educators, parents, and students. The events will feature innovative models and approaches to teaching and learning during this time of distance learning and in-depth looks at the learning games and technologies created by the presenters, available at no cost until the end of the school year. While presenters will describe the delivery of online interventions via computers and devices, sessions will also focus on innovative approaches to implementing the interventions in low-resource settings.

The events are called “Unconferences” because the sessions are informal in nature and attendees can select sessions to join across the day. Attendees can participate by asking the presenters questions through the chat box and by responding to polls that capture reactions and views on topics.

Schedule and Information about the Virtual Unconferences in Education:

National K12 Student Challenge

ED/IES SBIR awardee Future Engineers (@K12FutureE) launched a nationwide challenge for K-12 students to submit entries to “invent a way to make someone smile or feel appreciated during COVID-19.” Teachers can sign up a class to participate, or students can participate on their own. See this page for more information and to submit an entry.

Stay tuned to the Inside IES Blog for more information and resources about the response to COVID-19 in education.

 


Edward Metz is a research scientist and the program manager for the Small Business Innovation Research Program at the US Department of Education’s Institute of Education Sciences. Please contact Edward.Metz@ed.gov with questions or for more information.

 

Bar Chart Race: States With the Highest Public School Enrollment

Recently, bar chart races have become a useful tool to visualize long-term trend changes. The visual below, which uses data from an array of sources, depicts the changes in U.S. public school enrollment from 1870 to 2020 by region: Northeast (green), South (orange), Midwest (light blue), and West (dark blue). Since 1870, states’ populations and public school enrollment have increased, with differential growth across the country.


Source: Report of the Commissioner of Education (1870–71, 1879–80, 1889–90, 1899–1900, and 1909–10); the Biennial Survey of Education in the United States (1919–20, 1929–30, 1939–40, and 1949–50); and the Statistics of State School Systems (1959–60). The intervening earlier years for these decades are estimated by NCES for the purposes of this visual, as are data from 1960 to 1964. Data for 1965 to 1984 are from the Statistics of Public Elementary and Secondary Day Schools. Data for 1985 and later years are from the Common Core of Data and Projections.
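Under the hood, each frame of a bar chart race is simply a re-ranking of the categories at one point in time. A minimal sketch of that per-frame ranking in Python, using the fall 2020 enrollment projections cited in this post (figures in millions; the `top_n` helper is illustrative, not part of any NCES tool):

```python
def top_n(enrollment_by_year, year, n=10):
    """Return the top-n (state, enrollment) pairs for one frame of the race."""
    ranked = sorted(enrollment_by_year[year].items(),
                    key=lambda kv: kv[1], reverse=True)
    return ranked[:n]

# Fall 2020 projections, in millions (from the NCES figures in this post).
projections = {
    2020: {
        "California": 6.3, "Texas": 5.5, "Florida": 2.9, "New York": 2.7,
        "Illinois": 2.0, "Georgia": 1.8, "Pennsylvania": 1.7, "Ohio": 1.7,
        "North Carolina": 1.6, "Michigan": 1.5,
    }
}

for state, value in top_n(projections, 2020, n=3):
    print(f"{state}: {value} million")  # California, Texas, Florida
```

Animating the full race is just a matter of computing one such ranking per year and redrawing the bars.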


Here are some highlights from the data:

  • 1870: All of the top 10 states for public school enrollment—including the top 3 states of New York, Pennsylvania, and Ohio—were in the Northeast and Midwest. No states from the South or West were in the top 10 at this time.
  • 1879: A state in the South—Tennessee—entered the top 10 for the first time.
  • 1884: Texas first entered the top 10 and, as of 2020, has never left the top 10.
  • 1891: Illinois displaced Ohio as the state with the third-highest public school enrollment.
  • 1916: A state in the West—California—entered the top 10 for the first time and, as of 2020, California has never left the top 10.
  • 1935: Texas displaced Illinois as the state with the third-highest public school enrollment. New York and Pennsylvania still remained the two states with the highest public school enrollments.
  • 1942: California displaced Texas as the state with the third-highest public school enrollment. In 1947, California displaced Pennsylvania as the state with the second-highest public school enrollment. In 1953, California overtook New York and became the state with the highest public school enrollment.
  • 1959: Florida entered the top 10 for the first time and was in the top 4 by 1990.
  • 1980: Texas displaced New York as the state with the second-highest public school enrollment.
  • 2014: Florida displaced New York as the state with the third-highest public school enrollment.

Projections indicate that the 10 states with the highest public school enrollment in fall 2020 will be California (6.3 million), Texas (5.5 million), Florida (2.9 million), New York (2.7 million), Illinois (2.0 million), Georgia (1.8 million), Pennsylvania (1.7 million), Ohio (1.7 million), North Carolina (1.6 million), and Michigan (1.5 million).

 

By Rachel Dinkes, AIR

ASSISTments: From Research to Practice at Scale in Education

ASSISTments is a free, web-based formative assessment platform for teachers and students in Grades 3 through 12. The tool is designed for teachers to easily assign students math problems from OER textbooks such as Illustrative Math and EngageNY, existing item banks, or items they have developed on their own. ASSISTments will continually assess students as they solve math problems and provide immediate feedback and the chance to try again. The computer-generated reports provide teachers with information to make real-time adjustments to their instruction. Teachers can use it with their school’s existing learning management systems, such as Google Classroom and Canvas. Watch a video here.
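The interaction pattern described above—check each attempt, give immediate feedback, allow a retry, and log wrong answers for the teacher’s report—can be sketched in a few lines. This is an illustrative model of that pattern, not ASSISTments’ actual code; the `Problem` type and hint text are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class Problem:
    prompt: str
    answer: str
    hint: str

def grade(problem, attempts):
    """Check attempts in order, giving immediate feedback and a retry.

    Returns (correct, feedback_log); the log of wrong answers is what
    lets a report surface the most common mistakes to the teacher.
    """
    log = []
    for attempt in attempts:
        if attempt.strip() == problem.answer:
            log.append("correct")
            return True, log
        log.append(f"incorrect: {attempt} (hint: {problem.hint})")
    return False, log

p = Problem(prompt="3/4 + 1/8 = ?", answer="7/8",
            hint="Rewrite both fractions with a common denominator.")
correct, log = grade(p, ["4/12", "7/8"])
print(correct)  # True: the second attempt, after the hint, was right
```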

 

 

Over the past 13 years, ASSISTments was developed and evaluated with the support of a series of IES and National Science Foundation awards. With a 2003 IES award to Carnegie Mellon University and Worcester Polytechnic Institute (WPI), researchers created the first version of ASSISTments. The system was populated with Massachusetts high-stakes mathematics test questions and the tutoring for the questions was authored by WPI staff with assistance from local teachers. After students completed problems assigned by the teacher, reports provided teachers with information about question difficulty and the most commonly submitted wrong answers, initiating class discussions around the completed assignments. In 2007, researchers at WPI received an award to build additional functionalities in the ASSISTments program so that teachers could assign supports (called “skill builders”) to students to help them master content.  An additional eight grants allowed the researchers to create other features. 

With a 2012 IES research grant award, SRI evaluated the efficacy of the ASSISTments program as a homework tool for academic learning. In the study, the researchers took all 7th grade textbooks in the State of Maine and added the answers to homework problems to ASSISTments. The results of the efficacy trial demonstrated that teachers changed their homework-reviewing behavior, that students gained the equivalent of an extra three quarters of a year of schooling in mathematics, and that using ASSISTments reliably closed achievement gaps between students at different achievement levels. ASSISTments is currently being evaluated again through two IES studies, with over 120 schools, to attempt to replicate this result. To view all publications related to ASSISTments, see here.

As of 2020, ASSISTments has been used by approximately 60,000 students with over 12 million problems solved.

 

Interview with Neil Heffernan and Cristina Heffernan

From the start of the project, was it always a goal that ASSISTments would one day be used on a wide scale?

We created ASSISTments to help as many teachers and students as possible. After we learned that the ASSISTments intervention was effective, we set the goal of having every middle school student in the country get immediate feedback on their homework. We created ASSISTments to be used by real teachers and have been improving it with each grant. Because of the effectiveness of ASSISTments, we kept getting funded to make improvements, allowing our user base to grow.

At what point was ASSISTments ready to be used at a large scale in schools?

We were ready in year one because of the simplicity of our software. Now that we have integrated seamlessly with Google Classroom, most teachers can use the system without training!

ASSISTments is backed by a lot of research, which would make some think that it would be easy for many schools to adopt. What were (or are) the biggest obstacles to ASSISTments being used in more schools?

A big obstacle has been access to technology for all students. The current environment in schools is making that less and less of a barrier. Now, teachers are looking for effective ways to use the computers they have.      

What options did you consider to begin distributing ASSISTments?

We had major companies try to buy us out, but we turned them all down. We knew the value was in staying in control so that we could run research studies, let others run research studies, and A/B test new ideas. It was important to us to keep ASSISTments free to teachers. It is also a necessity, since we crowdsource from teachers.
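The A/B testing mentioned here is possible because the platform, not a vendor, decides which version of a feature each student sees. A minimal sketch of deterministic experiment assignment (the experiment name and arm labels are hypothetical, not ASSISTments internals):

```python
import hashlib

def assign_arm(experiment, student_id, arms=("control", "treatment")):
    """Deterministically assign a student to an experiment arm.

    Hashing (experiment, student) keeps each student's assignment stable
    across sessions while remaining effectively random across students,
    and lets different experiments split the population independently.
    """
    digest = hashlib.sha256(f"{experiment}:{student_id}".encode()).hexdigest()
    return arms[int(digest, 16) % len(arms)]

# The same student always lands in the same arm of a given experiment.
arm = assign_arm("new-hint-wording", "student-42")
print(arm)
```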

How do you do marketing?

Our biggest obstacle is marketing. But we are lucky to have just received $1 million in funding from a philanthropy to create a nonprofit to support the work of making our product accessible. Foundation funding has allowed us to hire staff members to write marketing materials including a new website, op-eds, blog posts and press releases. In addition to our internal marketing staff member, we work extensively with The Learning Agency to get press and foundation support for ASSISTments.

What costs are associated with the launch and distribution of ASSISTments, including marketing? Will a revenue model be needed to sustain ASSISTments over time?

When creating ASSISTments, we didn’t want a traditional business model based on schools paying. Our vision for future growth, instead, focused on crowdsourcing ideas from teachers and testing them. We are trying to replicate the Wikimedia platform idea created by Jimmy Wales. He crowdsources the content that makes up the encyclopedia, so it must be free. We envision using ASSISTments to help us crowdsource hints and explanations for all the commonly used questions in middle school mathematics.  

Do you have any agreement about the IP with the universities where ASSISTments was developed?

The ASSISTments Foundation was founded in 2019 and supports our project work in tandem with Worcester Polytechnic Institute because of our close integration with research. That close relationship takes care of any issues that might arise with intellectual property. Additionally, the fact that we are a nonprofit helps address these issues.

How do you describe the experience of commercializing ASSISTments? What would you say is most needed for gaining traction in the marketplace?

Even though we are free, we do have several competitors. To gain traction, we have found that word of mouth and our positive efficacy trial result are effective disseminators. Currently, there are many teachers on Facebook sharing how much they like ASSISTments. We also attend conferences and are working on an email campaign to get new users on board.

Do you have advice for university researchers seeking to move their laboratory research into widespread practice?

Make sure your work is accessible and meaningful! We are solving a super-pervasive problem of homework in schools. Everyone finds meaning in making homework better.


Neil Heffernan (@NeilTHeffernan) is a professor of computer science and Director of Learning Sciences and Technologies at Worcester Polytechnic Institute.  He developed ASSISTments not only to help teachers be more effective in the classroom but also so that he could use the platform to conduct studies to improve the quality of education.  

Cristina Heffernan (@CristinaHeff) is the Lead Strategist for the ASSISTments Project at WPI. She began her career in education as a math teacher in the Peace Corps and afterward went on to teach middle school math in the US. She began working with teachers while a graduate student at the University of Pittsburgh. As one of the co-founders of ASSISTments, Cristina has nurtured the system into a tool that helps teachers improve what they already do well in the classroom.


This interview was produced by Edward Metz of the Institute of Education Sciences. This is the fourth in an ongoing series of blog posts examining moving from university research to practice at scale in education.​