IES Blog

Institute of Education Sciences

Celebrating the ECLS-K:2024: Providing Key National Data on Our Country’s Youngest Learners

It’s time to celebrate!

This spring, the Early Childhood Longitudinal Study, Kindergarten Class of 2023–24 (ECLS-K:2024) is wrapping up its first school year of data collection with tens of thousands of children in hundreds of schools across the nation. You may not know this, but NCES is congressionally mandated to collect data on early childhood. We meet that charge by conducting ECLS program studies like the ECLS-K:2024 that follow children through the early elementary grades. Earlier studies looked at children in the kindergarten classes of 1998–99 and 2010–11. We also conducted a study, the Early Childhood Longitudinal Study, Birth Cohort (ECLS-B), that followed children from birth through kindergarten entry.

As the newest ECLS program study, the ECLS-K:2024 will collect data from both students and adults in these students’ lives (e.g., parents, teachers, school administrators) to help us better understand how different factors at home and at school relate to children’s development and learning. In fact, the ECLS-K:2024 allows us to provide data not only on the children in the cohort but also on kindergarten teachers and the schools that educate kindergartners.

What we at NCES think is worthy of celebrating is that the ECLS-K:2024—like other ECLS program studies—

  • provides the statistics policymakers need to make data-driven decisions to improve education for all;
  • contributes data that researchers need to answer today’s most pressing questions related to early childhood and early childhood education; and
  • allows us to produce resources for parents, families, teachers, and schools to better inform the public at large about children’s education and development.

Although smaller-scale studies can answer numerous questions about education and development, the ECLS-K:2024 allows us to provide answers at a national level. For example, you may know that children arrive at kindergarten with different skills and abilities, but have you ever wondered how those skills and abilities vary for children who come from different areas of the country? How they vary for children who attended prekindergarten programs versus those who did not? How they vary for children who come from families of different income levels? The national data from the ECLS-K:2024 allow us to dive into these—and other—issues.

The ECLS-K:2024 is unique in that it’s the first of our early childhood studies to provide data on a cohort of students who experienced the coronavirus pandemic. How did the pandemic affect these children’s early development, and how did it change the schooling they received? By comparing the experiences of the ECLS-K:2024 cohort with those of children who were in kindergarten nearly 15 and 25 years ago, we’ll be able to answer these questions.

What’s more, the ECLS-K:2024 will provide information on a variety of topics not fully examined in previous national early childhood studies. The study includes new items on families’ kindergarten selection and choice; availability and use of home computers and other digital devices; parent-teacher association/organization contributions to classrooms; equitable school practices; and a myriad of other constructs.

Earlier ECLS program studies have had a huge impact on our understanding of child development and early education, with hundreds of research publications produced using their data (on topics such as academic skills and school performance; family activities that promote learning; and children’s socioemotional development, physical health, and well-being). ECLS data have also been referenced in media outlets and in federal and state congressional reports. With the launch of the ECLS-K:2024, we cannot wait to see the impact of research using the new data.

Want to learn more? 

Plus, be on the lookout late this spring for the next ECLS blog post celebrating the ECLS-K:2024, which will highlight children in the study. Future blog posts will focus on parents and families and on teachers and schools. Stay tuned!

 

By Jill McCarroll and Korrie Johnson, NCES

Innovating Math Education: Highlights from IES Learning Acceleration Challenges

A teacher and students work on math problems on a white board

The Institute of Education Sciences (IES) held two Learning Acceleration Challenges during the 2022–23 school year, designed to incentivize innovation in math and science. According to the most recent data from the National Assessment of Educational Progress, an unprecedented number of students are performing below grade level in core academic subjects. In response, the Math Prize sought school-based, digital interventions to significantly improve math outcomes, specifically in fractions, for upper elementary school students with or at risk for a disability that affects math performance. To win the grand prize, an intervention had to reach an effect size equal to or exceeding 0.77 on a broad measure of math achievement, the NWEA® MAP™ Growth math assessment. The challenge included two phases: in Phase 1, intervention providers submitted information on their interventions and research plans for implementing and testing them under routine conditions; in Phase 2, selected research teams (finalists) were given $25,000 to implement and test their interventions, with a shot at receiving the grand prize.
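For context on the 0.77 criterion: the post does not specify exactly how the effect size was calculated, but effect sizes of this kind are conventionally standardized mean differences such as Cohen’s d with a pooled standard deviation. The sketch below uses that generic convention with made-up numbers purely for illustration:

```python
import math

def cohens_d(mean_t, mean_c, sd_t, sd_c, n_t, n_c):
    """Standardized mean difference using a pooled standard deviation."""
    pooled_var = ((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2) / (n_t + n_c - 2)
    return (mean_t - mean_c) / math.sqrt(pooled_var)

# Hypothetical treatment/control scale scores (illustrative values only):
# a 7.7-point gain against a common SD of 10 corresponds to d = 0.77.
d = cohens_d(mean_t=210.0, mean_c=202.3, sd_t=10.0, sd_c=10.0, n_t=50, n_c=50)
print(round(d, 2))
```

On this convention, meeting the grand-prize bar would require the intervention group to outscore the comparison group by a bit more than three-quarters of a standard deviation on the broad achievement measure.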

A panel of judges scored four submissions during Phase 1. Two teams were selected to proceed to Phase 2 of the challenge and implement their interventions in schools: the DRUM (Digital Rational Number) Intervention and ExploreLearning’s Reflex + Frax intervention. These two interventions were implemented in schools between November 2022 and April 2023, and participating students completed the NWEA MAP Growth math assessment before and after implementation. At the completion of Phase 2, the judging panel scored the Phase 2 submissions according to a rigorous set of criteria that included impact (as evaluated by a randomized controlled trial), cost effectiveness, scalability, and sustainability. Based on the scores received by the finalists, the panel did not recommend awarding any Phase 2 prizes.

We recognize this challenge was an ambitious and rapid effort to improve math achievement, and with the knowledge gained from it we hope to continue to design opportunities that encourage transformative, innovative change within education. Although disappointing, these results shed light on some of the challenges of targeting ambitious improvements in student math achievement:

  • The implementation hurdles experienced by both teams reinforce the difficulties of conducting research in schools, especially in the current post-pandemic climate. Many schools face extra strains that may make it challenging to implement new interventions, as an RCT requires.
  • It has historically been, and continues to be, difficult to create accelerated growth in math achievement for students with or at risk for disabilities that affect math performance. An improvement in line with the challenge’s 0.77 effect size criterion for the grand prize would substantially lessen the average achievement gap between students with disabilities and their nondisabled peers—no small feat!
  • Barriers still exist to implementing a technology-based intervention. For intervention developers, the cost and time required to create a digital intervention can be substantial. For schools, the necessary infrastructure and acceptance of digital interventions are not always present.
  • Researching interventions within schools takes a lot of time and resources. Sometimes getting answers to our most pressing educational problems takes time, despite the best efforts of those involved to accelerate the process. The results of this competition underscore the continued need for research to support this population of learners and the significant difficulties they face.

Thank you to all who participated. We would also like to thank Luminary Labs, the contractor supporting the IES Learning Acceleration Challenges, and the two strong partners they brought into the work: NWEA and Abt Associates. We appreciate NWEA’s support in evaluating the effects of the interventions on the MAP Growth assessment and Abt Associates’ technical assistance during the Phase 2 implementation. We also appreciate their work collecting and summarizing data on what we can learn from the challenges, along with recommendations from other open innovation initiatives, to inform similar future work at IES.

If you have an intervention or an idea for an intervention that could accelerate math achievement for students with or at risk for disabilities, you are encouraged to learn more about additional funding opportunities at IES, and contact Sarah Brasiel, program officer for NCSER’s STEM topic area.

This blog was written by Britta Bresina, NCSER program officer.

The 2023 IES PI Meeting: Building on 20 Years of IES Research to Accelerate the Education Sciences

On May 16–18, 2023, NCER and NCSER hosted our second virtual Principal Investigators (PI) Meeting. Our theme this year was Building on 20 Years of IES Research to Accelerate the Education Sciences. Because IES celebrated its 20th anniversary this past year, we used this meeting as an opportunity to reflect on and celebrate the success of IES and the education research community. Another goal was to explore how IES can further advance the education sciences and improve education outcomes for all learners.

Roddy Theobald (American Institutes for Research) and Eunsoo Cho (Michigan State University) graciously agreed to be our co-chairs this year. They provided guidance on the meeting theme and session strands and also facilitated our plenary sessions on Improving Data on Teachers and Staffing Challenges to Inform the Next 20 Years of Teacher Workforce Policy and Research and the Disproportionate Impact of COVID-19 on Student Learning and Contributions of Education Sciences to Pandemic Recovery Efforts. We want to thank them for their incredible efforts in making this year’s meeting a big success!

Here are a few highlights:

The meeting kicked off with opening remarks from IES Director Mark Schneider and a welcome from the Secretary of Education, Miguel Cardona. Director Schneider spoke about the importance of timely research and of translating evidence to practice. IES is thinking about how best to support innovative approaches to education research that are transformative, embrace failure, turn around quickly, and have an applied focus. He also discussed the need for data to move the field forward, specifically big data that researchers can use to address important policy questions and improve interventions and education outcomes. Secretary Cardona acknowledged the robust and useful evidence base that IES-funded researchers have generated over the last 20 years and emphasized the need for continued research to address historic inequities and accelerate pandemic recovery for students.

This year’s meeting fostered connections and facilitated deep conversations around meaningful and relevant topic areas. Across the three-day PI Meeting, we had over 1,000 attendees engaged in virtual room discussions around four main topic areas (see the agenda for a complete list of this year’s sessions):

  • Diversity, Equity, Inclusion, and Accessibility (DEIA)—Sessions addressed DEIA in education research
  • Recovering and Learning from the COVID-19 Pandemic—Sessions discussed accelerating pandemic recovery for students and educators, lessons learned from the pandemic, and opportunities to implement overdue changes to improve education
  • Innovative Approaches to Education Research—Sessions focused on innovative, forward-looking research ideas, approaches, and methods to improve education research in both the short- and long-term
  • Making Connections Across Disciplines and Communities—Sessions highlighted connections between research and practice communities and between researchers and projects across different disciplines and methodologies

We also had several sessions focused on providing information and opportunities to engage with IES leadership, including the NCER Commissioner’s Welcome; NCSER Acting Commissioner’s Welcome; Open Science and IES; NCEE at 20: Past Successes and Future Directions; and The IES Scientific Review Process: Overview, Common Myths, and Feedback.

Many sessions also had a strong focus on increasing the practical impacts of education research by getting research into the hands of practitioners and policymakers. For example, the session on Beyond Academia: Navigating the Broader Research-Practice Pipeline highlighted the unique challenges of navigating the pipeline of information that flows between researchers and practitioners and identified strategies that researchers could implement in designing, producing, and publishing research-based products that are relevant to a broad audience. The LEARNing to Scale: A Networked Initiative to Prepare Evidence-Based Practices & Products for Scaling and The Road to Scale Up: From Idea to Intervention sessions centered on challenges and strategies for scaling education innovations from basic research ideas to applied and effective interventions. Finally, the Transforming Knowledge into Action: An Interactive Discussion focused on identifying and capturing ways to strengthen dissemination plans and increase the uptake of evidence-based resources and practices.

We ended the three-day meeting with trivia and a celebration. Who was the first Commissioner of NCSER? Which program officer started the same day the office closed because of the pandemic? Which program officer has dreams of opening a bakery? If you want to know the answers to these questions and more, we encourage you to look at the Concluding Remarks.  

Finally, although we weren’t in person this year, we learned from last year’s meeting that a real benefit of having a virtual PI meeting is our ability to record all the sessions and share them with the public. A part of IES’s mission is to widely disseminate IES-supported research. We encourage you to watch the recorded sessions and would be grateful if you shared them with your networks.

We want to thank the attendees who made this meeting so meaningful and engaging. This meeting would not have been a success without your contributions. We hope to see our grantees at the next PI Meeting, this time in-person!

If you have any comments, questions, or suggestions for how we can further advance the education sciences and improve education outcomes for all learners, please do not hesitate to contact NCER Commissioner Liz Albro (Elizabeth.Albro@ed.gov) or NCSER Acting Commissioner Jackie Buckley (Jacquelyn.Buckley@ed.gov). We look forward to hearing from you.

 

Public State and Local Education Job Openings, Hires, and Separations for January 2023

As the primary statistical agency of the U.S. Department of Education, the National Center for Education Statistics (NCES) is mandated to report complete statistics on the condition of American education. While the condition of an education system is often assessed through indicators of achievement and attainment, NCES is also mandated to report on the conditions of the education workplace.

As such, NCES has reported timely information from schools. For example, this past December, NCES released data indicating that public schools have experienced difficulty filling positions throughout the COVID-19 pandemic.1 To understand the broader labor situation, NCES is utilizing the Job Openings and Labor Turnover Survey to describe the tightness of the job market.

JOLTS Design

The Job Openings and Labor Turnover Survey (JOLTS), conducted by the U.S. Bureau of Labor Statistics (BLS), provides monthly estimates of job openings, hires, and total separations. The purpose of JOLTS data is to serve as demand-side indicators of labor shortages at the national level.2

The JOLTS program reports labor demand and turnover estimates by industry, including education.3 As such, this analysis focuses on the public state and local education industry (“state and local government education” as referred to by JOLTS),4 which includes all persons employed by public elementary and secondary school systems and postsecondary institutions.

The JOLTS program does not produce estimates by Standard Occupational Classification.5 When reviewing these findings, please note occupations6 within the public state and local education industry vary7 (e.g., teachers and instructional aides, administrators, cafeteria workers, transportation workers). Furthermore, as the JOLTS data are tabulated at the industry level, the estimates are inclusive of the elementary, secondary, and postsecondary education levels.

Analysis

In this blog post, we present selected estimates on the number and rate of job openings, hires, and total separations (quits, layoffs and discharges, and other separations). The job openings rate is computed by dividing the number of job openings by the sum of employment and job openings. All other metric rates (hires, total separations, quits, layoffs and discharges, and other separations) are defined by taking the number of each metric and dividing it by employment. Fill rate is defined as the ratio of the number of hires to the number of job openings, and the churn rate is defined as the sum of the rate of hires and the rate of total separations.8
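As a quick illustration of these definitions, the snippet below applies them to the January 2023 figures reported later in this post (303,000 job openings; 218,000 hires; 127,000 total separations; 76,000 quits; 36,000 layoffs and discharges). Note that the rates requiring an employment count use a placeholder value, since employment is not reported in this post:

```python
# January 2023 public state and local education figures (from tables 1 and 2)
openings, hires, separations = 303_000, 218_000, 127_000
quits, layoffs = 76_000, 36_000

# Fill rate: hires per job opening; a value below 1 means openings outpace hires.
fill_rate = hires / openings
print(f"fill rate: {fill_rate:.2f}")                      # about 0.72

print(f"openings per hire: {openings / hires:.1f}")       # nearly 1.4

# Shares of total separations
print(f"quits share: {quits / separations:.0%}")          # about 60%
print(f"layoffs share: {layoffs / separations:.0%}")      # about 28%

# Rates that divide by employment need an employment count; the figure below
# is a hypothetical placeholder, not a value reported in this post.
employment = 10_500_000
openings_rate = openings / (employment + openings)  # openings / (employment + openings)
hires_rate = hires / employment
separations_rate = separations / employment
churn_rate = hires_rate + separations_rate          # sum of hires and separations rates
```

The fill rate and the separation shares match the figures discussed in the overview below the tables; the employment-based rates are shown only to make the formulas concrete.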


Table 1. Number of job openings, hires, and separations and net change in employment in public state and local education, in thousands: January 2020 through January 2023

*Significantly different from January 2023 (p < .05).
1 Net employment changes are calculated by taking the difference between the number of hires and the number of separations. When the number of hires exceeds the number of separations, employment rises—even if the number of hires is steady or declining. Conversely, when the number of hires is less than the number of separations, employment declines—even if the number of hires is steady or rising.
NOTE: Data are not seasonally adjusted. Detail may not sum to totals because of rounding.
SOURCE: U.S. Department of Labor, Bureau of Labor Statistics, Job Openings and Labor Turnover Survey (JOLTS), 2020–2023, based on data downloaded April 5, 2023, from https://data.bls.gov/cgi-bin/dsrv?jt.


Table 2. Rate of job openings, hires, and separations in public state and local education and fill and churn rates: January 2020 through January 2023

*Significantly different from January 2023 (p < .05).
NOTE: Data are not seasonally adjusted. Detail may not sum to totals because of rounding.
SOURCE: U.S. Department of Labor, Bureau of Labor Statistics, Job Openings and Labor Turnover Survey (JOLTS), 2020–2023, based on data downloaded April 5, 2023, from https://data.bls.gov/cgi-bin/dsrv?jt.


Overview of January 2023 Estimates

The number of job openings in public state and local education was 303,000 on the last business day of January 2023, which was higher than in January 2020 (239,000) (table 1). In percentage terms, 2.8 percent of jobs had openings in January 2023, which was higher than in January 2020 (2.2 percent) (table 2). The number of hires in public state and local education was 218,000 in January 2023, which was higher than in January 2020 (177,000) (table 1). This suggests that demand for public state and local education employees was greater in January 2023 than before the pandemic, and that more people were hired in January 2023 than in January 2020. The number of job openings at the end of January 2023 (303,000) was nearly 1.4 times the number of staff hired that month (218,000). In addition, the fill rate for that month was less than 1, which suggests a need for public state and local government education employees that was not completely filled as of January 2023.

The number of total separations in the state and local government education industry in January 2023 was not measurably different from the number of separations observed in January 2020 or January 2022. However, there was a higher number of total separations in January 2023 (127,000) than in January 2021 (57,000), which was nearly a year into the pandemic. In January 2023, the number of quits (76,000) was higher than the number of layoffs and discharges (36,000). Layoffs and discharges accounted for 28 percent of total separations in January 2023 (which was not measurably different from the percentage of layoffs and discharges out of total separations in January 2021), while quits accounted for 60 percent of total separations (which was not measurably different from the percentage of quits out of total separations in January 2021). These data suggest that the distribution of reasons for separations within the state and local government education industry was similar in January 2021 and January 2023.

 

By Josue DeLaRosa, NCES

 


[1] U.S. Department of Education, National Center for Education Statistics. Forty-Five Percent of Public Schools Operating Without a Full Teaching Staff in October, New NCES Data Show. Retrieved March 28, 2023, from https://nces.ed.gov/whatsnew/press_releases/12_6_2022.asp.
 

[2] U.S. Bureau of Labor Statistics. Job Openings and Labor Turnover Survey. Retrieved March 28, 2023, from https://www.bls.gov/jlt/jltover.htm.

[3] For more information about these estimates, see https://www.bls.gov/news.release/jolts.tn.htm.

[4] JOLTS refers to this industry as state and local government education, which is designated as ID 92.

[5] For more information on the reliability of JOLTS estimates, see https://www.bls.gov/jlt/jltreliability.htm.

[6] North American Industry Classification System (NAICS) is a system for classifying establishments (individual business locations) by type of economic activity. The Standard Occupational Classification (SOC) classifies all occupations for which work is performed for pay or profit. To learn more on the differences between NAICS and SOC, see https://www.census.gov/topics/employment/industry-occupation/about/faq.html.

[7] JOLTS data are establishment based, and there is no distinction between occupations within an industry. If a teacher and a school nurse were hired by an establishment coded as state and local government education, both would fall under that industry. (From email communication with JOLTS staff, April 7, 2023.)

[8] Skopovi, S., Calhoun, P., and Akinyooye, L. Job Openings and Labor Turnover Trends for States in 2020. Beyond the Numbers: Employment & Unemployment, 10(14). Retrieved March 28, 2023, from https://www.bls.gov/opub/btn/volume-10/jolts-2020-state-estimates.htm.

Bilingüe, Educación y Éxito: Learning from Dual Language Education Programs

April is National Bilingual/Multilingual Learner Advocacy Month! As part of the IES 20th Anniversary celebration, we are highlighting NCER’s investments in field-initiated research. In this guest blog, Drs. Doré LaForett and Ximena Franco-Jenkins (University of North Carolina Chapel Hill) and Adam Winsler (George Mason University) discuss their IES-funded exploration study, some challenges they encountered due to the COVID-19 pandemic, and how their study contributes to supporting multilingual students.

The BEE Project

Our IES-funded study, called the Bilingualism, Education, and Excellence (BEE) project, was born out of a research partnership initiated by the principal of a Spanish-English dual language education (DLE) elementary school. She noticed that student engagement in DLE classrooms seemed to differ depending on the student’s home language and the language of instruction. This got us thinking about how little we as a field know about what goes on in two-way immersion (TWI) classrooms in terms of teacher language use, student-teacher relationships, student engagement, and learning outcomes for students who speak Spanish or English at home. Therefore, we were excited for the opportunity to dig deeper into links between language of instruction and academic outcomes for students in a relatively new immigrant community like North Carolina. Specifically, we were interested in whether and how the amount of instruction in English and Spanish is related to improvements in student academic outcomes in English and Spanish.

We conducted extensive individual direct student assessments at the beginning and end of the school year, as well as intensive classroom observations to assess both language of instruction and student on-task engagement during both English and Spanish instruction. Although we are still analyzing the data, preliminary findings suggest that language model (90% Spanish/10% English vs. 50% Spanish/50% English), type of 50/50 model used (switching language of instruction midday vs. alternating days), and initial student language proficiency all matter for student engagement and academic outcomes assessed in English and Spanish. For some outcomes, students with low language proficiency had lower average spring scores in the 50/50 model than students in the 90/10 model. In contrast, students with high language proficiency had higher average spring scores in the 50/50 model than in the 90/10 model. In addition, students who speak mostly English at home had a hard time staying engaged on the Spanish day in 50/50 alternating programs.

Impact of COVID-19 on Our Research and Pivots Made

Although we are excited about these findings, like many other studies, we encountered challenges when the pandemic hit. While some studies may have been able to pivot and resume data collection on a remote platform, we had to pause data collection during spring 2020 and the 2020-21 school year given our study design and the context in which our research was being conducted. For instance, we used gold-standard, English/Spanish, parallel direct assessments of children, which had to be administered in person because online versions were not available. Classroom- and student-level observations were also not possible when instruction was remote: cameras were often turned off, some students lacked access to remote or hybrid learning platforms, and contactless video recording technologies prioritize the talk of a single individual rather than the entire class and do not allow for focused observations of individual student behavior.

Therefore, our top priority was maintaining our partnerships with the school districts during the ‘sleeper year.’ We kept in touch and followed our partners’ lead as to when and how we could resume. Meanwhile, we tried to understand what school districts were doing for DLE instruction (in-person, hybrid, remote) during the pandemic. The research team found it necessary to shift tasks during the pandemic, and our efforts were centered on data management and dissemination activities. Once schools started to reopen in 2021-22, our team continued to be patient and flexible to address the health and visitor regulations of the various school districts. In the end, we had one year of data pre-pandemic, one pandemic year without spring data, and one year of data post-pandemic.

Despite these challenges, we used this opportunity to gather information about the learning experiences of students enrolled in the final year of our study, who had been exposed to remote or hybrid learning during the 2020-21 school year. So, when schools reopened in fall 2021, we asked our schools what instruction was like during the pandemic, and we asked teachers and parents what they thought about dual language progress during the 2020-21 school year. Teachers were more likely than parents to report that students made good gains in their language skills over that year. Further, parents who reported greater English-speaking learning opportunities during remote instruction tended to speak primarily English at home and have more education. Parents who reported that their child had difficulties participating in remote instruction due to technology tended to speak more Spanish at home and have less education.

These findings show how inequities in the home environment, such as those experienced during the pandemic, may have reduced learning opportunities for some students in DLE programs. This is particularly noteworthy because the social experience of language learning is critical in DLE programs, so reduced opportunities to speak in English and Spanish—particularly for students who are not yet fully bilingual or do not live in bilingual homes—can really undermine the goals of DLE programs. These reduced learning opportunities also give us pause as we consider how best to test for cohort effects, choose appropriate procedures for dealing with the missing data, and proceed cautiously in generalizing findings.

A Focus on Diversity, Equity, and Inclusion

Our research is grounded in cultural mismatch theory, which hypothesizes that DLE programs produce greater alignment, or match, with English learners’ (ELs’) home environments compared to non-DLE programs. By design, DLE programs that support heritage languages seek to promote bilingualism, bi-literacy, and biculturalism, which bolster ELs’ social capital, increase academic performance, and reduce the achievement gap for ELs. Thus, effective DLE programs are examples of anti-racist policies and practices. However, some have suggested that DLE programs may confer more benefits for White, native English speakers (that is, the Matthew effect, where the rich get richer) than for the students whose heritage language and culture are being elevated in DLE programs. This is especially concerning given our data showing a potential exacerbation of the Matthew effect during the pandemic due to a variety of factors (lack of access to technology, less-educated families struggling to support their children during remote instruction), suggesting not only learning loss but also language loss. Our research attempts to open the black box of DLE programs and examine whether experiences, engagement, and outcomes are similar across language backgrounds. We hope that information from our study about the intersection of language proficiency and language of instruction will facilitate decisions about how students are assigned to different language models and ultimately support equitable learning opportunities for students attending DLE programs.


Ximena Franco-Jenkins is an Advanced Research Scientist at the Frank Porter Graham Child Development Institute at the University of North Carolina at Chapel Hill.

Adam Winsler is a Professor and Associate Chair at George Mason University.

Doré R. LaForett is an Advanced Research Scientist at the Frank Porter Graham Child Development Institute at the University of North Carolina at Chapel Hill.

This blog was produced by Helyn Kim (Helyn.Kim@ed.gov), Program Officer for the English Learners Portfolio, NCER.