IES Blog

Institute of Education Sciences

Celebrating the ECLS-K:2024: Providing Key National Data on Our Country’s Youngest Learners

It’s time to celebrate!

This spring, the Early Childhood Longitudinal Study, Kindergarten Class of 2023–24 (ECLS-K:2024) is wrapping up its first school year of data collection with tens of thousands of children in hundreds of schools across the nation. You may not know this, but NCES is congressionally mandated to collect data on early childhood. We meet that charge by conducting ECLS program studies like the ECLS-K:2024 that follow children through the early elementary grades. Earlier studies looked at children in the kindergarten classes of 1998–99 and 2010–11. We also conducted a study, the Early Childhood Longitudinal Study, Birth Cohort (ECLS-B), that followed children from birth through kindergarten entry.

As the newest ECLS program study, the ECLS-K:2024 will collect data from both students and adults in these students’ lives (e.g., parents, teachers, school administrators) to help us better understand how different factors at home and at school relate to children’s development and learning. In fact, the ECLS-K:2024 allows us to provide data not only on the children in the cohort but also on kindergarten teachers and the schools that educate kindergartners.

What we at NCES think is worthy of celebrating is that the ECLS-K:2024, like other ECLS program studies,

  • provides the statistics policymakers need to make data-driven decisions to improve education for all;
  • contributes data that researchers need to answer today’s most pressing questions related to early childhood and early childhood education; and
  • allows us to produce resources for parents, families, teachers, and schools to better inform the public at large about children’s education and development.

Although smaller-scale studies can answer numerous questions about education and development, the ECLS-K:2024 allows us to provide answers at a national level. For example, you may know that children arrive at kindergarten with different skills and abilities, but have you ever wondered how those skills and abilities vary for children from different areas of the country? How they vary for children who attended prekindergarten programs versus those who did not? How they vary for children from families of different income levels? The national data from the ECLS-K:2024 allow us to dive into these—and other—issues.

The ECLS-K:2024 is unique in that it’s the first of our early childhood studies to provide data on a cohort of students who experienced the coronavirus pandemic. How did the pandemic affect these children’s early development, and how did it change the schooling they received? By comparing the experiences of the ECLS-K:2024 cohort with those of children who were in kindergarten nearly 15 and 25 years ago, we’ll be able to answer these questions.

What’s more, the ECLS-K:2024 will provide information on a variety of topics not fully examined in previous national early childhood studies. The study includes new items on families’ kindergarten selection and choice; availability and use of home computers and other digital devices; parent-teacher association/organization contributions to classrooms; equitable school practices; and a host of other constructs.

Earlier ECLS program studies have had a huge impact on our understanding of child development and early education, with hundreds of research publications produced using their data (on topics such as academic skills and school performance; family activities that promote learning; and children’s socioemotional development, physical health, and well-being). ECLS data have also been referenced in media outlets and in congressional and state legislative reports. With the launch of the ECLS-K:2024, we cannot wait to see the impact of research using the new data.

Want to learn more? Be on the lookout late this spring for the next ECLS blog post celebrating the ECLS-K:2024, which will highlight children in the study. Future blog posts will focus on parents and families and on teachers and schools. Stay tuned!

 

By Jill McCarroll and Korrie Johnson, NCES

Public State and Local Education Job Openings, Hires, and Separations for January 2023

As the primary statistical agency of the U.S. Department of Education, the National Center for Education Statistics (NCES) is mandated to report complete statistics on the condition of American education. While the condition of an education system is often assessed through indicators of achievement and attainment, NCES is also mandated to report on the conditions of the education workplace.

As such, NCES has reported timely information from schools. For example, this past December, NCES released data indicating that public schools have experienced difficulty filling positions throughout the COVID-19 pandemic.[1] To understand the broader labor situation, NCES is using the Job Openings and Labor Turnover Survey to describe the tightness of the job market.

JOLTS Design

The Job Openings and Labor Turnover Survey (JOLTS), conducted by the U.S. Bureau of Labor Statistics (BLS), provides monthly estimates of job openings, hires, and total separations. The purpose of JOLTS data is to serve as demand-side indicators of labor shortages at the national level.[2]

The JOLTS program reports labor demand and turnover estimates by industry, including education.[3] This analysis therefore focuses on the public state and local education industry (“state and local government education,” as it is referred to by JOLTS),[4] which includes all persons employed by public elementary and secondary school systems and postsecondary institutions.

The JOLTS program does not produce estimates by Standard Occupational Classification.[5] When reviewing these findings, please note that occupations[6] within the public state and local education industry vary[7] (e.g., teachers and instructional aides, administrators, cafeteria workers, transportation workers). Furthermore, because the JOLTS data are tabulated at the industry level, the estimates are inclusive of the elementary, secondary, and postsecondary education levels.

Analysis

In this blog post, we present selected estimates of the number and rate of job openings, hires, and total separations (quits, layoffs and discharges, and other separations). The job openings rate is computed by dividing the number of job openings by the sum of employment and job openings. All other rates (hires, total separations, quits, layoffs and discharges, and other separations) are computed by dividing the number for each metric by employment. The fill rate is the ratio of the number of hires to the number of job openings, and the churn rate is the sum of the hires rate and the total separations rate.[8]
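To make these definitions concrete, the minimal sketch below applies them to the rounded January 2023 figures quoted later in this post (303,000 job openings, 218,000 hires, 127,000 total separations, 76,000 quits, and 36,000 layoffs and discharges). The employment level in the sketch is a hypothetical placeholder for illustration only; an actual analysis would use the published BLS employment series.

```python
# Illustrative JOLTS-style calculations using the rounded January 2023 figures
# quoted in this post (all counts in thousands). The employment level below is
# a HYPOTHETICAL placeholder; real analyses use the published BLS series.
openings = 303       # job openings at the end of January 2023
hires = 218          # hires during January 2023
separations = 127    # total separations during January 2023
quits = 76
layoffs = 36
employment = 10_500  # hypothetical employment level, in thousands

# Job openings rate: openings divided by the sum of employment and openings.
openings_rate = openings / (employment + openings)

# All other rates: the metric divided by employment.
hires_rate = hires / employment
separations_rate = separations / employment

# Fill rate: hires per job opening. Churn rate: hires rate plus separations rate.
fill_rate = hires / openings          # about 0.72, i.e., less than 1
churn_rate = hires_rate + separations_rate

# Net employment change (table 1, note 1): hires minus separations.
net_change = hires - separations      # 91 thousand

# Shares of total separations, discussed later in this post.
quit_share = quits / separations      # about 60 percent
layoff_share = layoffs / separations  # about 28 percent

print(f"fill rate = {fill_rate:.2f}, churn rate = {churn_rate:.3f}, "
      f"net change = {net_change} thousand")
```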


Table 1. Number of job openings, hires, and separations and net change in employment in public state and local education, in thousands: January 2020 through January 2023

*Significantly different from January 2023 (p < .05).
1 Net employment changes are calculated by taking the difference between the number of hires and the number of separations. When the number of hires exceeds the number of separations, employment rises—even if the number of hires is steady or declining. Conversely, when the number of hires is less than the number of separations, employment declines—even if the number of hires is steady or rising.
NOTE: Data are not seasonally adjusted. Detail may not sum to totals because of rounding.
SOURCE: U.S. Department of Labor, Bureau of Labor Statistics, Job Openings and Labor Turnover Survey (JOLTS), 2020–2023, based on data downloaded April 5, 2023, from https://data.bls.gov/cgi-bin/dsrv?jt.


Table 2. Rate of job openings, hires, and separations in public state and local education and fill and churn rates: January 2020 through January 2023

*Significantly different from January 2023 (p < .05).
NOTE: Data are not seasonally adjusted. Detail may not sum to totals because of rounding.
SOURCE: U.S. Department of Labor, Bureau of Labor Statistics, Job Openings and Labor Turnover Survey (JOLTS), 2020–2023, based on data downloaded April 5, 2023, from https://data.bls.gov/cgi-bin/dsrv?jt.


Overview of January 2023 Estimates

The number of job openings in public state and local education was 303,000 on the last business day of January 2023, which was higher than in January 2020 (239,000) (table 1). In percentage terms, 2.8 percent of jobs had openings in January 2023, which was higher than in January 2020 (2.2 percent) (table 2). The number of hires in public state and local education was 218,000 in January 2023, which was also higher than in January 2020 (177,000) (table 1). These figures suggest that demand for public state and local education employees was greater in January 2023 than before the pandemic (January 2020) and that more people were hired in January 2023 than before the pandemic. The number of job openings at the end of January 2023 (303,000) was nearly 1.4 times the number of staff hired that month (218,000). In addition, the fill rate for that month was less than 1, which suggests a need for public state and local government education employees that was not being completely filled as of January 2023.

The number of total separations in the state and local government education industry in January 2023 was not measurably different from the number of separations observed in January 2020 or January 2022. However, the number of total separations was higher in January 2023 (127,000) than in January 2021 (57,000), which was nearly a year into the pandemic. In January 2023, the number of quits (76,000) was higher than the number of layoffs and discharges (36,000). Layoffs and discharges accounted for 28 percent of total separations in January 2023 (not measurably different from their share of total separations in January 2021), while quits accounted for 60 percent of total separations (not measurably different from their share in January 2021). These data suggest that the distribution of reasons for separations within the state and local government education industry was similar in January 2021 and January 2023.

 

By Josue DeLaRosa, NCES

 


[1] U.S. Department of Education, National Center for Education Statistics. Forty-Five Percent of Public Schools Operating Without a Full Teaching Staff in October, New NCES Data Show. Retrieved March 28, 2023, from https://nces.ed.gov/whatsnew/press_releases/12_6_2022.asp.
 

[2] U.S. Bureau of Labor Statistics. Job Openings and Labor Turnover Survey. Retrieved March 28, 2023, from https://www.bls.gov/jlt/jltover.htm.

[3] For more information about these estimates, see https://www.bls.gov/news.release/jolts.tn.htm.

[4] JOLTS refers to this industry as state and local government education, which is designated as ID 92.

[5] For more information on the reliability of JOLTS estimates, see https://www.bls.gov/jlt/jltreliability.htm.

[6] The North American Industry Classification System (NAICS) is a system for classifying establishments (individual business locations) by type of economic activity. The Standard Occupational Classification (SOC) classifies all occupations for which work is performed for pay or profit. To learn more about the differences between NAICS and SOC, see https://www.census.gov/topics/employment/industry-occupation/about/faq.html.

[7] JOLTS data are establishment based, and there is no distinction between occupations within an industry. If a teacher and a school nurse were hired by an establishment coded as state and local government education, both would fall under that industry. (From email communication with JOLTS staff, April 7, 2023.)

[8] Skopovi, S., Calhoun, P., and Akinyooye, L. Job Openings and Labor Turnover Trends for States in 2020. Beyond the Numbers: Employment & Unemployment, 10(14). Retrieved March 28, 2023, from https://www.bls.gov/opub/btn/volume-10/jolts-2020-state-estimates.htm.

Money Matters: Exploring Young Adults’ Financial Literacy and Financial Discussions With Their Parents

Financial literacy is a critical skill that young adults often need for partial or full financial independence and increased financial decision making, especially as they begin to enter college or the workforce.

The Program for International Student Assessment (PISA)—which is coordinated by the Organization for Economic Cooperation and Development (OECD)—gives us a unique opportunity to analyze and understand the financial literacy of 15-year-olds in the United States and other education systems around the world. PISA is the only large-scale nationally representative assessment that measures the financial literacy skills of 15-year-olds. The financial literacy domain was administered first in 2012 and then in 2015 and 2018. The 2018 financial literacy cycle assessed approximately 117,000 students, representing about 13.5 million 15-year-olds from 20 education systems. The fourth cycle began in fall 2022 in the United States and is currently being conducted.


How Frequently Do Students Discuss Financial Topics With Their Parents?

In 2018, all education systems that administered the PISA financial literacy assessment also asked students to complete a questionnaire about their experiences with money matters in school and outside of school. In the United States, about 3,500 students out of the total U.S. PISA sample of 3,740 completed the questionnaire.

This blog post explores how frequently students reported talking about the following five topics with their parents (or guardians or relatives):

  1. their spending decisions
  2. their savings decisions
  3. the family budget
  4. money for things they want to buy
  5. news related to economics or finance

Students’ answers were grouped into two categories: frequent (“a few times a month” or “once a week or more”) and infrequent (“never or almost never” or “a few times a year”).
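As a concrete illustration of this grouping, here is a minimal sketch of how the four response options listed above could be collapsed into the two categories used in this post; the helper function is ours and is not part of the PISA data products.

```python
# Collapse the four questionnaire response options into the two categories
# used in this analysis. The response labels follow the wording quoted above;
# the helper function is purely illustrative.
FREQUENT = {"a few times a month", "once a week or more"}
INFREQUENT = {"never or almost never", "a few times a year"}

def discussion_category(response: str) -> str:
    """Map a raw questionnaire response to 'frequent' or 'infrequent'."""
    normalized = response.strip().lower()
    if normalized in FREQUENT:
        return "frequent"
    if normalized in INFREQUENT:
        return "infrequent"
    raise ValueError(f"Unrecognized response: {response!r}")

print(discussion_category("A few times a month"))  # -> frequent
```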

We first looked at the degree to which students frequently discussed various financial topics with their parents. In 2018, the frequency of student-parent financial discussions varied by financial topic (figure 1):

  • About one-quarter (24 percent) of U.S. 15-year-old students reported frequently discussing with their parents news related to economics or finance.
  • More than half (53 percent) of U.S. 15-year-old students reported frequently discussing with their parents money for things they wanted to buy.

Figure 1. Percentage of 15-year-old students who frequently discussed financial topics with their parents, by topic (spending decisions, savings decisions, family budget, money for things they want to buy, and news related to economics or finance): 2018


Do male and female students differ in how frequently they discuss financial topics with their parents?

In 2018, higher percentages of female students than of male students frequently discussed with their parents the family budget (35 vs. 32 percent) and money for things they wanted to buy (56 vs. 50 percent). Meanwhile, a lower percentage of female students than of male students frequently discussed with their parents news related to economics or finance (21 vs. 26 percent) (figure 2).


Figure 2. Percentage of 15-year-old students who frequently discussed financial topics with their parents, by topic (spending decisions, savings decisions, family budget, money for things they want to buy, and news related to economics or finance) and gender: 2018


Are Students’ Financial Literacy Scores Related to How Frequently They Discuss Financial Matters With Their Parents?

Using a scale that ranges from 0 to 1,000, the PISA financial literacy assessment measures students’ financial knowledge in four content areas:

  1. money and transactions
  2. planning and managing finances
  3. risk and reward
  4. the financial landscape

In 2018, the average score of 15-year-old students ranged from 388 points in Indonesia to 547 points in Estonia. The U.S. average (506 points) was higher than the average in 11 education systems, lower than the average in 4 education systems, and not measurably different from the average in 4 education systems. The U.S. average was also not measurably different from the OECD average.

We also examined the relationship between frequent parent–student financial discussions and students’ financial literacy achievement (figure 3). After taking into account students’ gender, race/ethnicity, immigration status, and socioeconomic status—as well as their school’s poverty and location—the results show that students who reported frequently discussing spending decisions with their parents scored 16 points higher on average than did students who reported infrequently discussing this topic. On the other hand, students who reported frequently discussing news related to economics or finance with their parents scored 18 points lower on average than did students who reported infrequently discussing this topic.  
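For readers curious about the mechanics, the sketch below shows the general shape of an adjusted comparison like the one described above. The variables and synthetic data are hypothetical, and the sketch omits PISA-specific machinery (plausible values, student sampling weights, and replicate-weight variance estimation) used in the official analysis, so it illustrates the idea rather than reproducing the published estimates.

```python
# A minimal sketch of an adjusted (regression-based) comparison between students
# who frequently vs. infrequently discussed spending decisions with their parents.
# All variable names and data are HYPOTHETICAL; a real PISA analysis would combine
# plausible values and apply survey and replicate weights, not a single OLS fit.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(seed=0)
n = 2_000
df = pd.DataFrame({
    "frequent_spending_talk": rng.integers(0, 2, n),       # 1 = frequent
    "gender": rng.choice(["female", "male"], n),
    "race_ethnicity": rng.choice(["group_a", "group_b", "group_c"], n),
    "immigrant": rng.integers(0, 2, n),
    "ses": rng.normal(0, 1, n),
    "school_poverty": rng.choice(["high", "low"], n),
    "school_location": rng.choice(["city", "suburb", "rural"], n),
})
# Synthetic outcome with a built-in 16-point association, for illustration only.
df["finlit_score"] = (
    500 + 16 * df["frequent_spending_talk"] + 30 * df["ses"] + rng.normal(0, 80, n)
)

model = smf.ols(
    "finlit_score ~ frequent_spending_talk + C(gender) + C(race_ethnicity)"
    " + immigrant + ses + C(school_poverty) + C(school_location)",
    data=df,
).fit()

# The coefficient on the discussion indicator is the adjusted score-point
# difference between frequent and infrequent discussers.
print(model.params["frequent_spending_talk"])
```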


Figure 3. Financial literacy score-point differences between students who frequently and infrequently discussed financial topics with their parents, after accounting for student and school characteristics: 2018


Do Students Think That Young Adults Should Make Their Own Spending Decisions?

We also explored whether students agreed that young people should make their own spending decisions. In 2018, some 63 percent of U.S. 15-year-old students reported they agreed or strongly agreed, while 37 percent reported that they disagreed.

Do male and female students differ in their agreement that young adults should make their own spending decisions?

We found that a lower percentage of female students than of male students agreed or strongly agreed that young people should make their own spending decisions (59 vs. 66 percent). This gender difference held even after taking into account students’ race/ethnicity, immigration status, and socioeconomic status, as well as school poverty and location.


Upcoming PISA Data Collections

A deeper understanding of the frequency of parent–student financial conversations, the topics discussed, and the relationship between those conversations and financial literacy could help parents and educators foster financial literacy across different student groups in the United States.

The PISA 2022 data collection began after a 1-year postponement due to the COVID-19 pandemic; 83 education systems are expected to participate. The PISA 2022 Financial Literacy Assessment will include items from earlier years as well as new interactive items. The main PISA results will be released in December 2023, and the PISA financial literacy results will be released in spring/summer 2024.

Be sure to follow NCES on Twitter, Facebook, LinkedIn, and YouTube and subscribe to the NCES News Flash to receive notifications when these new PISA data are released.

 

By Saki Ikoma, Marissa Hall, and Frank Fonseca, AIR

International Computer and Information Literacy Study: 2023 Data Collection

In April, the National Center for Education Statistics (NCES) will kick off the 2023 International Computer and Information Literacy Study (ICILS) of eighth-grade students in the United States. This will be the second time the United States has participated in ICILS.

What is ICILS?

ICILS is a computer-based international assessment of eighth-grade students’ capacity to use information and communications technologies (ICT)[1] productively for a range of different purposes. It is sponsored by the International Association for the Evaluation of Educational Achievement (IEA) and conducted in the United States by NCES.

In addition to assessing students on two components—computer and information literacy (CIL) and computational thinking (CT)—ICILS also collects information from students, teachers, school principals, and ICT coordinators on contextual factors that may be related to students’ development in CIL.

Why is ICILS important?

ICILS measures students’ skills with ICT and provides data on CIL. In the United States, the development of these skills is called for in the Federal STEM Education Strategic Plan. Outside the United States, ICILS data are used by the European Council and EU member states to measure an official EU-level target in support of strategic priorities toward the European Education Area and beyond (2021–2030). From a global perspective, ICILS provides information for monitoring progress toward the UNESCO Sustainable Development Goals (SDGs).

The measurement of students’ CIL is highly relevant today—digital tools and online learning became the primary means of delivering and receiving education during the onset of the coronavirus pandemic, and technology continually shapes the way students learn both inside and outside of school.

ICILS provides valuable comparative data on students’ skills and experience across all participating education systems. In 2018, ICILS results showed that U.S. eighth-grade students’ average CIL score (519) was higher than the ICILS 2018 average score (496) (figure 1).


Figure 1. Average CIL scores of eighth-grade students, by education system: 2018

* p < .05. Significantly different from the U.S. estimate at the .05 level of statistical significance.
NOTE: CIL = computer and information literacy. The ICILS CIL scale ranges from 100 to 700. The ICILS 2018 average is the average of all participating education systems meeting international technical standards, with each education system weighted equally. Education systems are ordered by their average CIL scores, from largest to smallest. Italics indicate the benchmarking participants.
SOURCE: International Association for the Evaluation of Educational Achievement (IEA), International Computer and Information Literacy Study (ICILS), 2018.


ICILS data can also be used to examine various topics within one education system and shed light on the variations in the use of digital resources in teaching and learning among student and teacher subgroups. For example, in 2018, lower percentages of mathematics teachers than of English language arts (ELA) and science teachers often or always used ICT to support student-led discussions, inquiry learning, and collaboration among students (figure 2).


Figure 2. Percentage of U.S. eighth-grade teachers who often or always used ICT, by selected teaching practice and subject (English language arts, mathematics, and science): 2018

NOTE: ICT = information and communications technologies. Teaching practices are ordered by the percentage of English language arts teachers using ICT, from largest to smallest. Science includes general science and/or physics, chemistry, biology, geology, earth sciences, and technical science.
SOURCE: International Association for the Evaluation of Educational Achievement (IEA), International Computer and Information Literacy Study (ICILS), 2018.


What does the ICILS 2023 data collection include?

In November 2022, NCES started the preparation work for the ICILS 2023 main study data collection, which is scheduled for administration from April to June 2023. Eighth-grade students and staff from a nationally representative sample of about 150 schools will participate in the study.

Students will be assessed on CIL (which focuses on understanding computer use, gathering information, producing information, and communicating digitally) and CT (which focuses on conceptualizing problems and operationalizing solutions). In addition to taking the assessment, students will complete a questionnaire about their access to and use of ICT.

Teachers will be surveyed about their use of ICT in teaching practices, ICT skills they emphasize in their teaching, their attitudes toward using ICT, and their ICT-related professional development. In addition, principals and ICT coordinators will be surveyed about ICT resources and support at school, priorities in using ICT, and management of ICT resources.

In 2023, more than 30 education systems will participate in the study and join the international comparisons. When ICILS 2023 results are released in the international and U.S. reports in November 2024, we will be able to learn more about the changes in students’ and teachers’ technology use over the past 5 years by comparing the 2023 and 2018 ICILS results. Such trend comparisons will be meaningful given the increased availability of the Internet and digital tools during the pandemic.

 

Explore the ICILS website to learn more about the study, and be sure to follow NCES on Twitter, Facebook, LinkedIn, and YouTube and subscribe to the NCES News Flash to stay up-to-date on future ICILS reports and resources.

 

By Yan Wang and Yuqi Liao, AIR

 


[1] Refers to technological tools and resources used to store, create, share, or exchange information, including computers, software applications, and the Internet.

U.S. Is Unique in Score Gap Widening in Mathematics and Science at Both Grades 4 and 8: Prepandemic Evidence from TIMSS

Tracking differences between the performance of high- and low-performing students is one way of monitoring equity in education. These differences are referred to as achievement gaps or “score gaps,” and they may widen or narrow over time.

To provide the most up-to-date international data on this topic, NCES recently released Changes Between 2011 and 2019 in Achievement Gaps Between High- and Low-Performing Students in Mathematics and Science: International Results From TIMSS. This interactive web-based Stats in Brief uses data from the Trends in International Mathematics and Science Study (TIMSS) to explore changes between 2011 and 2019 in the score gaps between students at the 90th percentile (high performing) and the 10th percentile (low performing). The study—which examines data from 47 countries at grade 4, 36 countries at grade 8, and 29 countries at both grades—provides an important picture of prepandemic trends.
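For readers who want to see the calculation behind these gaps, the minimal sketch below computes a 90th-to-10th-percentile score gap and its change over time for hypothetical score distributions. The published TIMSS estimates additionally account for plausible values and sampling weights, so this illustrates the concept rather than reproducing the report's methodology.

```python
import numpy as np

def score_gap(scores: np.ndarray) -> float:
    """Gap between high performers (90th percentile) and low performers (10th)."""
    return np.percentile(scores, 90) - np.percentile(scores, 10)

# HYPOTHETICAL score distributions for one country, grade, and subject.
rng = np.random.default_rng(seed=0)
scores_2011 = rng.normal(loc=540, scale=75, size=5_000)
scores_2019 = rng.normal(loc=538, scale=85, size=5_000)

# A positive change means the gap widened between 2011 and 2019.
gap_change = score_gap(scores_2019) - score_gap(scores_2011)
print(f"Change in score gap, 2011 to 2019: {gap_change:.0f} points")
```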

This Stats in Brief also provides new analyses of the patterns in score gap changes over the last decade. The focus on patterns sheds light on which part of the achievement distribution may be driving change, which is important for developing appropriate policy responses. 


Did score gaps change in the United States and other countries between 2011 and 2019?

In the United States, score gaps consistently widened between 2011 and 2019 (figure 1). In fact, the United States was the only country (of the 29 with data at both grades) where the score gap between high- and low-performing students widened in both mathematics and science at both grade 4 and grade 8.


Figure 1. Changes in score gaps between high- and low-performing U.S. students between 2011 and 2019


* p < .05. Change in score gap is significant at the .05 level of statistical significance.

SOURCE: Stephens, M., Erberber, E., Tsokodayi, Y., and Fonseca, F. (2022). Changes Between 2011 and 2019 in Achievement Gaps Between High- and Low-Performing Students in Mathematics and Science: International Results From TIMSS (NCES 2022-041). U.S. Department of Education. Washington, DC: National Center for Education Statistics, Institute of Education Sciences. Available at https://nces.ed.gov/pubsearch/pubsinfo.asp?pubid=2022041.


For any given grade and subject combination, no more than a quarter of participating countries had a score gap that widened, and no more than a third had a score gap that narrowed—further highlighting the uniqueness of the U.S. results.


Did score gaps change because of high-performing students, low-performing students, or both?

At grade 4, score gaps widened in the United States between 2011 and 2019 due to decreases in low-performing students’ scores, while high-performing students’ scores did not measurably change (figure 2). This was true for both mathematics and science and for most of the countries where score gaps also widened.


Figure 2. Changes in scores of high- and low-performing U.S. students between 2011 and 2019 and in the corresponding score gaps


* p < .05. The 2019 score gap is significantly different from the 2011 score gap.

SOURCE: Stephens, M., Erberber, E., Tsokodayi, Y., and Fonseca, F. (2022). Changes Between 2011 and 2019 in Achievement Gaps Between High- and Low-Performing Students in Mathematics and Science: International Results From TIMSS (NCES 2022-041). U.S. Department of Education. Washington, DC: National Center for Education Statistics, Institute of Education Sciences. Available at https://nces.ed.gov/pubsearch/pubsinfo.asp?pubid=2022041.


Low-performing U.S. students’ scores also dropped in both subjects at grade 8, but at this grade the drops were accompanied by increases in high-performing students’ scores. This pattern—where the two ends of the distribution move in opposite directions—led to the United States’ relatively large changes in score gaps. Among the other countries with widening score gaps at grade 8, this pattern of divergence was not common in mathematics but was more common in science.

In contrast, in countries where the score gaps narrowed, low-performing students’ scores generally increased. In some cases, the scores of both low- and high-performing students increased, but the scores of low-performing students increased more.

Countries with narrowing score gaps typically also saw their average scores rise between 2011 and 2019, demonstrating improvements in both equity and achievement. This was almost never the case in countries where the scores of low-performing students dropped, highlighting the global importance of not letting this group of students fall behind.  


What else can we learn from this TIMSS Stats in Brief?

In addition to providing summary results (described above), this interactive Stats in Brief allows users to select a subject and grade to explore each of the study questions further (exhibit 1). Within each selection, users can choose either a more streamlined or a more expanded view of the cross-country figures and walk through the findings step-by-step while key parts of the figures are highlighted.


Exhibit 1. Preview of the Stats in Brief’s Features



Explore NCES’ new interactive TIMSS Stats in Brief to learn more about how score gaps between high- and low-performing students have changed over time across countries.

Be sure to follow NCES on Twitter, Facebook, LinkedIn, and YouTube and subscribe to the NCES News Flash to stay up-to-date on TIMSS data releases and resources.

 

By Maria Stephens and Ebru Erberber, AIR; and Lydia Malley, NCES