Inside IES Research

Notes from NCER & NCSER

Innovating Math Education: Highlights from IES Learning Acceleration Challenges


The Institute of Education Sciences (IES) held two Learning Acceleration Challenges during the 2022–23 school year, designed to incentivize innovation in math and science. According to the most recent data from the National Assessment of Educational Progress, an unprecedented number of students are performing below grade level in core academic subjects. In response to this problem, the Math Prize sought school-based, digital interventions to significantly improve math outcomes, specifically in fractions, for upper elementary school students with or at risk for a disability that affects math performance. To win the grand prize, an intervention had to achieve an effect size of at least 0.77 on a broad measure of math achievement, the NWEA® MAP™ Growth math assessment. The challenge included two phases: In Phase 1, intervention providers submitted information on their interventions and research plans for implementing and testing them under routine conditions. In Phase 2, selected research teams (finalists) received $25,000 to implement and test their interventions, with the chance to win the grand prize.
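
For context, effect sizes like this are conventionally expressed as a standardized mean difference between intervention and comparison groups; a common formulation (shown here for illustration only; the challenge's exact scoring procedure is not detailed in this post) is:

```latex
d = \frac{\bar{X}_{\text{intervention}} - \bar{X}_{\text{comparison}}}{s_{\text{pooled}}},
\qquad
s_{\text{pooled}} = \sqrt{\frac{(n_1 - 1)s_1^2 + (n_2 - 1)s_2^2}{n_1 + n_2 - 2}}
```

On this scale, the 0.77 criterion asks the intervention group to outperform the comparison group by roughly three-quarters of a standard deviation on MAP Growth math scores, a very large effect for a school-based intervention.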

During Phase 1, a panel of judges scored four submissions. Two teams were selected to proceed to Phase 2 of the challenge and implement their interventions in schools: the DRUM (Digital Rational Number) intervention and ExploreLearning’s Reflex + Frax intervention. These two interventions were implemented in schools between November 2022 and April 2023, and participating students completed the NWEA MAP Growth math assessment before and after implementation. At the completion of Phase 2, the judging panel scored the Phase 2 submissions against a rigorous set of criteria that included impact (as evaluated by a randomized controlled trial), cost effectiveness, scalability, and sustainability. Based on the finalists’ scores, the panel did not recommend awarding any Phase 2 prizes.

We recognize that this challenge was an ambitious and rapid effort to improve math achievement. While disappointing, the results shed light on some of the challenges of targeting ambitious improvements in student math achievement, and with the knowledge gained we hope to continue designing opportunities that encourage transformative, innovative change in education:

  • The implementation hurdles experienced by both teams reinforce the difficulties of conducting research in schools, especially in the current post-pandemic climate. Many schools face extra strains that can make it challenging to implement new interventions, as an RCT requires.
  • It has historically been, and continues to be, difficult to produce accelerated growth in math achievement for students with or at risk for disabilities that affect math performance. An improvement meeting the challenge’s 0.77 effect size criterion for the grand prize would substantially narrow the average achievement gap between students with disabilities and their nondisabled peers, which would be no small feat!
  • Barriers to implementing technology-based interventions still exist. For intervention developers, the cost and time required to create a digital intervention can be substantial. For schools, the necessary infrastructure and acceptance of digital interventions are not always present.
  • Researching interventions within schools takes substantial time and resources. Despite the best efforts of those involved to accelerate the process, getting answers to our most pressing educational problems takes time. The results of this competition underscore the continued need for research that addresses the significant difficulties this population of learners faces.

Thank you to all who participated. We would also like to thank Luminary Labs, the contractor supporting the IES Learning Acceleration Challenges, and the two strong partners they brought into the work: NWEA and Abt Associates. We appreciate NWEA’s support in evaluating the interventions’ effects on the MAP Growth assessment and Abt Associates’ technical assistance during Phase 2 implementation. We also appreciate their work collecting and summarizing data on what we can learn from the challenges, along with recommendations from other open innovation initiatives, to inform similar future work at IES.

If you have an intervention, or an idea for one, that could accelerate math achievement for students with or at risk for disabilities, we encourage you to learn more about additional funding opportunities at IES and to contact Sarah Brasiel, program officer for NCSER’s STEM topic area.

This blog was written by Britta Bresina, NCSER program officer.

The 2023 IES PI Meeting: Building on 20 Years of IES Research to Accelerate the Education Sciences

On May 16-18, 2023, NCER and NCSER hosted our second virtual Principal Investigators (PI) Meeting. This year’s theme was Building on 20 Years of IES Research to Accelerate the Education Sciences. Because IES celebrated its 20th anniversary this past year, we used the meeting as an opportunity to reflect on and celebrate the successes of IES and the education research community. Another goal was to explore how IES can further advance the education sciences and improve education outcomes for all learners.

Roddy Theobald (American Institutes for Research) and Eunsoo Cho (Michigan State University) graciously agreed to be our co-chairs this year. They provided guidance on the meeting theme and session strands and facilitated our two plenary sessions: Improving Data on Teachers and Staffing Challenges to Inform the Next 20 Years of Teacher Workforce Policy and Research and The Disproportionate Impact of COVID-19 on Student Learning and Contributions of Education Sciences to Pandemic Recovery Efforts. We thank them for their incredible efforts in making this year’s meeting a big success!

Here are a few highlights:

The meeting kicked off with opening remarks from IES Director Mark Schneider and a welcome from Secretary of Education Miguel Cardona. Director Schneider spoke about the importance of timely research and of translating evidence into practice. IES is considering how best to support innovative approaches to education research that are transformative, embrace failure, turn around quickly, and have an applied focus. He also discussed the need for data to move the field forward, specifically big data that researchers can use to address important policy questions and improve interventions and education outcomes. Secretary Cardona acknowledged the robust and useful evidence base that IES-funded researchers have generated over the last 20 years and emphasized the need for continued research to address historic inequities and accelerate pandemic recovery for students.

This year’s meeting fostered connections and facilitated deep conversations around meaningful and relevant topics. Across the three-day PI Meeting, more than 1,000 attendees engaged in virtual room discussions around four main topic areas (see the agenda for a complete list of this year’s sessions):

  • Diversity, Equity, Inclusion, and Accessibility (DEIA)—Sessions addressed DEIA in education research
  • Recovering and Learning from the COVID-19 Pandemic—Sessions discussed accelerating pandemic recovery for students and educators, lessons learned from the pandemic, and opportunities to implement overdue changes to improve education
  • Innovative Approaches to Education Research—Sessions focused on innovative, forward-looking research ideas, approaches, and methods to improve education research in both the short- and long-term
  • Making Connections Across Disciplines and Communities—Sessions highlighted connections between research and practice communities and between researchers and projects across different disciplines and methodologies

We also had several sessions focused on providing information and opportunities to engage with IES leadership, including the NCER Commissioner’s Welcome; the NCSER Acting Commissioner’s Welcome; Open Science and IES; NCEE at 20: Past Successes and Future Directions; and The IES Scientific Review Process: Overview, Common Myths, and Feedback.

Many sessions also had a strong focus on increasing the practical impact of education research by getting research into the hands of practitioners and policymakers. For example, the session Beyond Academia: Navigating the Broader Research-Practice Pipeline highlighted the unique challenges of navigating the pipeline of information that flows between researchers and practitioners and identified strategies researchers can use when designing, producing, and publishing research-based products that are relevant to a broad audience. The LEARNing to Scale: A Networked Initiative to Prepare Evidence-Based Practices & Products for Scaling and The Road to Scale Up: From Idea to Intervention sessions centered on challenges and strategies for scaling education innovations from basic research ideas to applied, effective interventions. Finally, Transforming Knowledge into Action: An Interactive Discussion focused on identifying ways to strengthen dissemination plans and increase the uptake of evidence-based resources and practices.

We ended the three-day meeting with trivia and a celebration. Who was the first Commissioner of NCSER? Which program officer started the same day the office closed because of the pandemic? Which program officer has dreams of opening a bakery? If you want to know the answers to these questions and more, we encourage you to look at the Concluding Remarks.  

Finally, although we weren’t in person this year, we learned from last year’s meeting that a real benefit of a virtual PI meeting is the ability to record all the sessions and share them with the public. Part of IES’s mission is to widely disseminate IES-supported research. We encourage you to watch the recorded sessions and would be grateful if you shared them with your networks.

We want to thank the attendees who made this meeting so meaningful and engaging. This meeting would not have been a success without your contributions. We hope to see our grantees at the next PI Meeting, this time in-person!

If you have any comments, questions, or suggestions for how we can further advance the education sciences and improve education outcomes for all learners, please do not hesitate to contact NCER Commissioner Liz Albro (Elizabeth.Albro@ed.gov) or NCSER Acting Commissioner Jackie Buckley (Jacquelyn.Buckley@ed.gov). We look forward to hearing from you.


Bilingüe, Educación y Éxito: Learning from Dual Language Education Programs

April is National Bilingual/Multilingual Learner Advocacy Month! As part of the IES 20th Anniversary celebration, we are highlighting NCER’s investments in field-initiated research. In this guest blog, Drs. Doré LaForett and Ximena Franco-Jenkins (University of North Carolina at Chapel Hill) and Adam Winsler (George Mason University) discuss their IES-funded exploration study, some challenges they encountered due to the COVID-19 pandemic, and how their study contributes to supporting multilingual students.

The BEE Project

Our IES-funded study, the Bilingualism, Education, and Excellence (BEE) project, was born out of a research partnership initiated by the principal of a Spanish-English dual language education (DLE) elementary school. She noticed that student engagement in DLE classrooms seemed to differ depending on the student’s home language and the language of instruction. This got us thinking about how little we as a field know about what goes on in two-way immersion (TWI) classrooms in terms of teacher language use, student-teacher relationships, student engagement, and learning outcomes for students who speak Spanish or English at home. We were therefore excited for the opportunity to dig deeper into links between language of instruction and academic outcomes for students in a relatively new immigrant community like North Carolina. Specifically, we were interested in whether and how the amount of instruction in English and Spanish relates to improvements in student academic outcomes in English and Spanish.

We conducted extensive individual direct student assessments at the beginning and end of the school year, along with intensive classroom observations to assess language of instruction and student on-task engagement during both English and Spanish instruction. Although we are still analyzing the data, preliminary findings suggest that the language model (90% Spanish/10% English vs. 50% Spanish/50% English), the type of 50/50 model used (switching the language of instruction mid-day vs. alternating days), and initial student language proficiency all matter for student engagement and academic outcomes assessed in English and Spanish. For some outcomes, students with low language proficiency had lower average spring scores in the 50/50 model than students in the 90/10 model, whereas students with high language proficiency had higher average spring scores in the 50/50 model than in the 90/10 model. In addition, students who speak mostly English at home had a hard time staying engaged on Spanish days in 50/50 alternating-day programs.

Impact of COVID-19 on Our Research and Pivots Made

Although we are excited about these findings, like many other studies, we encountered challenges in conducting our study when the pandemic hit. While some studies may have been able to pivot and resume data collection on a remote platform, our study design and research context forced us to pause data collection during spring 2020 and the 2020-21 school year. For instance, we used gold-standard, English/Spanish parallel direct assessments of children, which had to be administered in person because online versions were not available. Classroom- and student-level observations were also not possible when instruction was remote: cameras were often turned off, access to remote or hybrid learning platforms was limited, and contactless video recording technologies prioritize the talk of a single individual rather than the entire class and do not allow focused observation of individual student behavior.

Therefore, our top priority was maintaining our partnerships with the school districts during the ‘sleeper year.’ We kept in touch and followed our partners’ lead as to when and how we could resume. Meanwhile, we tried to understand what school districts were doing for DLE instruction (in-person, hybrid, or remote) during the pandemic. The research team shifted tasks during the pandemic, centering our efforts on data management and dissemination activities. Once schools started to reopen in 2021-22, our team remained patient and flexible to address the health and visitor regulations of the various school districts. In the end, we had one year of data pre-pandemic, one pandemic year without spring data, and one year of data post-pandemic.

Despite these challenges, we used this opportunity to gather information about the learning experiences of students enrolled in the final year of our study, who had been exposed to remote or hybrid learning during the 2020-21 school year. When schools reopened in fall 2021, we asked our schools what instruction was like during the pandemic, and we asked teachers and parents what they thought about students’ dual language progress during the 2020-21 school year. Teachers were more likely than parents to report that students made good gains in their language skills over that year. Further, parents who reported greater English-speaking learning opportunities during remote instruction tended to speak primarily English at home and to have more education, while parents who reported that their child had difficulty participating in remote instruction due to technology tended to speak more Spanish at home and to have less education.

These findings show how inequities in the home environment, such as those experienced during the pandemic, may have reduced learning opportunities for some students in DLE programs. This is particularly noteworthy because the social experience of language learning is critical in DLE programs, so reduced opportunities to speak English and Spanish, particularly for students who are not yet fully bilingual or do not live in bilingual homes, can undermine the goals of DLE programs. These reduced learning opportunities also give us pause as we consider how best to test for cohort effects, choose appropriate procedures for handling missing data, and proceed cautiously in generalizing our findings.

A Focus on Diversity, Equity, and Inclusion

Our research is grounded in cultural mismatch theory, which hypothesizes that DLE programs produce greater alignment with English learners’ (ELs’) home environments than non-DLE programs. By design, DLE programs that support heritage languages seek to promote bilingualism, biliteracy, and biculturalism, which bolster ELs’ social capital, increase academic performance, and reduce the achievement gap for ELs. Effective DLE programs are thus examples of anti-racist policies and practices. However, some have suggested that DLE programs may confer more benefits on White, native English speakers (that is, the Matthew effect, where the rich get richer) than on the students whose heritage language and culture DLE programs are meant to elevate. This is especially concerning given our data showing a potential exacerbation of the Matthew effect during the pandemic due to a variety of factors (lack of access to technology, less-educated families struggling to support their children during remote instruction), suggesting not only learning loss but also language loss. Our research attempts to open the black box of DLE programs and examine whether experiences, engagement, and outcomes are similar across language backgrounds. We hope that information from our study about the intersection of language proficiency and language of instruction will inform decisions about how students are assigned to different language models and ultimately support equitable learning opportunities for students attending DLE programs.


Ximena Franco-Jenkins is an Advanced Research Scientist at the Frank Porter Graham Child Development Institute at the University of North Carolina at Chapel Hill.

Adam Winsler is a Professor and Associate Chair at George Mason University.

Doré R. LaForett is an Advanced Research Scientist at the Frank Porter Graham Child Development Institute at the University of North Carolina at Chapel Hill.

This blog was produced by Helyn Kim (Helyn.Kim@ed.gov), Program Officer for the English Learners Portfolio, NCER.


Pennsylvania Student Proficiency Rates Rebound Partially from COVID-19-Related Declines

Given the magnitude of the disruption the COVID-19 pandemic caused to education practices, there has been considerable interest in understanding how the pandemic may have affected student proficiency. In this guest blog, Stephen Lipscomb, Duncan Chaplin, Alma Vigil, and Hena Matthias of Mathematica discuss their IES-funded grant project, in partnership with the Pennsylvania Department of Education (PDE), that is looking at the pandemic’s impacts in Pennsylvania.  

The onset of the COVID-19 pandemic in spring 2020 brought a host of changes to K–12 education and instruction in Pennsylvania. Many local education agencies (LEAs) instituted remote learning and hybrid schedules as their primary mode of educating students, while others maintained in-person learning. Statewide assessments, which were suspended in spring 2020, resumed in 2021 with low participation rates, particularly among students with lower performance before the pandemic. Furthermore, test administration dates varied from spring 2021 to fall 2021. Pennsylvania statewide assessment data reveal that student proficiency rates may have rebounded partially in 2022, though they remained below pre-pandemic levels. In grades 5–8, there was a marked increase in English language arts (ELA) proficiency and a slightly smaller increase in math proficiency compared with the 2021 proficiency rates predicted in recent research. Despite these gains, returning student proficiency rates to pre-pandemic levels will require additional effort.

The Pennsylvania Department of Education (PDE) has been committed to providing LEAs with the resources and support necessary to help students return to pre-pandemic academic proficiency rates. To learn more about how changes in those rates may have been associated with the pandemic, PDE and Mathematica partnered to explore trends in proficiency data for students in grades 5–8. Given the lower, nonrepresentative participation in the 2021 statewide assessments and the differences in when LEAs administered them, we developed a predictive model of statewide proficiency rates for spring 2021 to produce estimates more comparable to previous and future years. The results revealed that steep declines in proficiency likely occurred between 2019 and 2021 (see Figure 1 below). By spring 2022, proficiency rates in grades 5–8 had regained 6 percentage points of their 10 percentage point drop in ELA and nearly 5 percentage points of their 13 percentage point drop in math. Taken together, these results suggest that although the pandemic may have originally been associated with declines in students’ academic proficiency, over time student proficiency might move back toward pre-pandemic levels.
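
As a rough sketch of the recovery arithmetic described above (the percentage points are the rounded values from this post, not the exact published estimates):

```python
# Rounded values from the text: 2019->2021 declines and 2021->2022 recoveries,
# in percentage points, for statewide proficiency rates in grades 5-8.
drop = {"ELA": 10.0, "math": 13.0}
regained = {"ELA": 6.0, "math": 5.0}  # math value is approximate ("nearly 5")

for subject in drop:
    share_recovered = regained[subject] / drop[subject]
    still_below = drop[subject] - regained[subject]
    print(f"{subject}: recovered {share_recovered:.0%} of the decline; "
          f"still about {still_below:.0f} points below pre-pandemic levels")
```

By this reckoning, ELA recovered roughly 60% of its decline and math a bit under 40%, which is why the post describes the rebound as partial.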


Figure 1. Actual and predicted proficiency rates in grades 5–8 in Pennsylvania, 2015–2022

Notes: The figure shows actual proficiency rates from the Pennsylvania System of School Assessment, averaged across grades 5–8, unless marked by an open or closed circle. An open circle indicates that the statewide assessment was cancelled; a closed circle indicates a predicted proficiency rate.

Source: Data from 2015–2019 and 2022 are from the Pennsylvania Department of Education. The 2021 data are predicted proficiency rates from Lipscomb et al. (2022a). The figure originally appeared in Lipscomb et al. (2022b).  


The next steps for this project include a strong focus on disseminating our findings. For example, we will develop a research brief describing the role of remote learning in shaping academic outcomes beyond proficiency rates, as well as community health outcomes, during the pandemic. The findings will help PDE and LEAs refine strategies for supporting vulnerable students and help state policymakers and educators learn from the COVID-19 pandemic, specifically how it might have affected student outcomes and educational inequities.


This blog was produced by Allen Ruby (Allen.Ruby@ed.gov), Associate Commissioner for Policy and Systems Division, NCER.

Measuring In-Person Learning During the Pandemic

Some of the most consequential COVID-19-related decisions for public education were those that modified how much in-person learning students received during the 2020-2021 school year. As part of an IES-funded research project in collaboration with the Virginia Department of Education (VDOE) on COVID’s impact on public education in Virginia, researchers at the University of Virginia (UVA) collected data to determine how much in-person learning students in each grade in each division (what Virginia calls its school districts) were offered over the year. In this guest blog, Erica Sachs, an IES predoctoral fellow at UVA, shares brief insights into this work.

Our Process

COVID-19 has caused uncertainty and disruption in public education for nearly three years. The purpose of the IES-funded study is to describe how Virginia’s response to COVID-19 may have influenced access to instructional opportunities and equity in student outcomes over multiple time periods. This project is a key source of information for the VDOE and Virginia schools’ recovery efforts. An important first step of this work was to uncover how divisions’ decisions affected student experiences during the 2020-21 school year. This blog focuses on the process we undertook to identify how much in-person learning students could access.

During 2020-21, students were offered school in three learning modalities: fully remote (no in-person learning), fully in-person (only in-person learning), and hybrid (all students could access some in-person learning). Hybrid learning often occurred when schools split a grade into groups and assigned attendance days to each group. For the purposes of the project, we used the term “attendance rotations” to identify whether and which student group(s) could access in-person school on each day of the week. Each attendance rotation is associated with a learning modality.

Most divisions posted information about learning modality and attendance rotations on their official websites, social media, or board meeting documents. In June and July of 2021, our team painstakingly scoured these sites and collected detailed data on the learning modality and attendance rotations of every grade in every division on every day of the school year. We used these data to create a division-by-grade-by-day dataset.
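
As a simplified illustration of the resulting structure, the sketch below shows one way a division-by-grade-by-day dataset could be represented; the field names, rotation labels, and classification rule are hypothetical, not the project’s actual schema:

```python
from dataclasses import dataclass

@dataclass
class DayRecord:
    """One row of an illustrative division-by-grade-by-day dataset."""
    division: str   # school division (Virginia's term for a district)
    grade: int
    date: str       # ISO date, e.g., "2020-11-09"
    rotation: str   # attendance rotation, e.g., "group A only"

def modality(rotation: str) -> str:
    """Map an attendance rotation to a learning modality (illustrative rule)."""
    if rotation == "no in-person":
        return "fully remote"
    if rotation == "all students":
        return "fully in-person"
    return "hybrid"

rows = [
    DayRecord("Division X", 5, "2020-11-09", "group A only"),
    DayRecord("Division X", 5, "2020-11-10", "group B only"),
    DayRecord("Division X", 5, "2020-11-11", "no in-person"),
]
for row in rows:
    print(row.date, row.rotation, "->", modality(row.rotation))
```

Recording the rotation itself, rather than only the modality, is what later made the more precise in-person measure possible.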

A More Precise Measure of In-Person Learning

An initial examination of the dataset revealed that the commonly used approach of characterizing student experiences by time in each modality masked potentially important variations in the amount of in-person learning accessible in the hybrid modality. For instance, a division could offer one or four days of in-person learning per week, and both would be considered hybrid. To supplement the modality approach, we created a more precise measure of in-person learning using the existing data on attendance rotations. The new variable counts all in-person learning opportunities across the hybrid and fully in-person modalities, and, therefore, captures the variation obscured in the modality-only approach. To illustrate, when looking only at the time in each modality, just 6.7% of the average student’s school year was in the fully in-person modality. However, using the attendance rotations data revealed that the average student had access to in-person learning for one-third of their school year.
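
A minimal sketch of why the rotation-based measure is more precise, using a made-up week for one grade (the group labels and numbers are hypothetical):

```python
# Hypothetical week for one grade in one division. Each entry pairs the
# day's learning modality with the student group(s) offered in-person access.
week = [
    ("hybrid",          {"A"}),       # Monday: group A attends in person
    ("hybrid",          {"B"}),       # Tuesday: group B attends in person
    ("fully remote",    set()),       # Wednesday: no in-person learning
    ("hybrid",          {"A"}),       # Thursday: group A attends in person
    ("fully in-person", {"A", "B"}),  # Friday: all students attend
]

# Modality-only measure: share of days spent in the fully in-person modality.
modality_share = sum(m == "fully in-person" for m, _ in week) / len(week)

# Rotation-based measure: share of days a group-A student could attend in person.
access_share = sum("A" in groups for _, groups in week) / len(week)

print(f"fully in-person modality: {modality_share:.0%} of days")    # 20% of days
print(f"in-person access for group A: {access_share:.0%} of days")  # 60% of days
```

The same masking shows up in the real data: only 6.7% of the average student’s year was in the fully in-person modality, yet the rotation data show in-person access for about a third of the year.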

Lessons Learned

One of the biggest lessons I learned working on this project was that we drastically underestimated the scope of the data collection and data management undertaking. I hope that sharing some of the lessons I learned will help others doing similar work.

  • Clearly define terminology and keep records of all decisions, with examples, in a shared file. Doing so helps prevent confusion and resolve disagreements within the team or with partners. Research on COVID-19 in education was relatively new when we started this work, and we encountered two terminology-related issues: sources used the same term for different concepts, and sources used different terms for the same concept. For instance, the VDOE’s definition of the “in-person modality” required four or more days of access to in-person learning weekly, but our team classified four days of access as hybrid because we defined the “fully in-person modality” as five days of access to in-person learning weekly. Without agreed-upon definitions, people could categorize the same school week under different modalities. Repeated confusion in discussions necessitated a long meeting to hash out definitions, examples, and non-examples of each term and compile them in an organized file. (A toy comparison of the two modality definitions appears in the sketch after this list.)
  • Retroactively collecting data from documents can be difficult if divisions have removed information from their web pages. Several sources were especially helpful in our data collection: the Wayback Machine, a digital archive of the internet, for accessing archived division web pages; school board records, including agendas, meeting minutes, and presentation materials; and announcements or letters to families posted on divisions’ Facebook or Twitter accounts.
  • To precisely estimate in-person learning across the year, collect data at the division-by-grade-by-day level. Divisions sometimes changed attendance rotations midweek, and the timing of these changes often differed across grades. Consequently, we found that collecting data at the day level was critical to capture all rotation changes and accurately estimate the amount of in-person learning divisions offered students.
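
To make the definitional point in the first bullet concrete, here is a toy comparison of the two classification rules described there (the function names and the exact tier labels are illustrative, not project code):

```python
def vdoe_modality(days_per_week: int) -> str:
    """VDOE rule (as described above): 4+ days of weekly in-person access
    counts as the in-person modality."""
    if days_per_week >= 4:
        return "in-person"
    return "hybrid" if days_per_week > 0 else "remote"

def team_modality(days_per_week: int) -> str:
    """Study team rule: only a full 5-day week is fully in-person."""
    if days_per_week == 5:
        return "fully in-person"
    return "hybrid" if days_per_week > 0 else "fully remote"

for days in range(6):
    print(days, vdoe_modality(days), team_modality(days))
# A 4-day week is "in-person" under the VDOE rule but "hybrid" under the
# team's rule, so the same school week can land in different modalities.
```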

What’s Next?

The research brief summarizing our findings can be downloaded from the EdPolicyWorks website. Our team is currently using the in-person learning data as a key measure of division operations during the reopening year to explore how division operations may have varied depending on division characteristics, such as access to high-speed broadband. Additionally, we will leverage the in-person learning metric to examine COVID’s impact on student and teacher outcomes and assess whether trends differed by the amount of in-person learning divisions offered students.


Erica N. Sachs is an MPP/PhD Student, IES Pre-doctoral Fellow, and Graduate Research Assistant at UVA’s EdPolicyWorks.

This blog was produced by Helyn Kim (Helyn.Kim@ed.gov), Program Officer, NCER.