NCES Blog

National Center for Education Statistics

What Do NCES Data Tell Us About America’s Kindergartners?

Happy Get Ready for Kindergarten Month! 

For more than 20 years, the National Center for Education Statistics (NCES) has been collecting information about kindergartners’ knowledge and skills as part of the Early Childhood Longitudinal Studies (ECLS) program.

The first ECLS, the Early Childhood Longitudinal Study, Kindergarten Class of 1998–99 (ECLS-K), focused on children who were in kindergarten in the 1998–99 school year. When the ECLS-K began, no large national education study had followed a cohort of children from kindergarten entry through the elementary school years. Some of today’s commonly known information about young children, such as the information about kindergartners’ early social and academic skills shown in the infographics below, comes from the ECLS-K. For example, we all know that children arrive at kindergarten with varied knowledge and skills; the ECLS-K was the first study to show at a national level that this was the case and to provide statistics highlighting the differences in children’s knowledge and skills by various background factors.



The second ECLS kindergarten cohort study, the Early Childhood Longitudinal Study, Kindergarten Class of 2010–11 (ECLS-K:2011), is the ECLS-K’s sister study. It followed students who were in kindergarten during the 2010–11 school year. The ECLS-K:2011, which began more than a decade after the inception of the ECLS-K, allows for comparisons of children in two nationally representative kindergarten classes experiencing different policy, educational, and demographic environments. For example, significant changes that occurred between the start of the ECLS-K and the start of the ECLS-K:2011 include the passage of the No Child Left Behind Act, a rise in school choice, and an increase in the number of English language learners.

From the parents of children in the ECLS-K:2011, we learned how much U.S. kindergartners like school, as shown in the following infographic.



The ECLS program studies also provide information on children’s home learning environments and experiences outside of school that may contribute to learning. For example, we learned from the ECLS-K:2011 what types of activities kindergartners were doing with their parents at least once a month (see the infographic below).


Infographic titled How do kindergartners like school?


What’s next for ECLS data collections on kindergartners? NCES is excited to be getting ready to launch its next ECLS kindergarten cohort study, the Early Childhood Longitudinal Study, Kindergarten Class of 2023–24 (ECLS-K:2024).

Before the ECLS-K:2024 national data collections can occur, the ECLS will conduct a field test—a trial run of the study to test the study instruments and procedures—in the fall of 2022.

If you, your child, or your school are selected for the ECLS-K:2024 field test or national study, please participate! While participation is voluntary, it is important so that the study can provide information that can be used at the local, state, and national levels to guide practice and policies that increase every child’s chance of doing well in school. The ECLS-K:2024 will be particularly meaningful, as it will provide important information about the experiences of children whose early lives were shaped by the COVID-19 pandemic.

Watch this video to learn more about participation in the ECLS-K:2024. For more information on the ECLS studies and the data available on our nation’s kindergartners, see the ECLS homepage, review our online training modules, or email the ECLS study team.

 

By Jill Carlivati McCarroll, NCES

Timing is Everything: Understanding the IPEDS Data Collection and Release Cycle

For more than three decades, the Integrated Postsecondary Education Data System (IPEDS) has collected data from all postsecondary institutions participating in Title IV federal student aid programs, including universities, community colleges, and vocational and technical schools.

Since 2000, the 12 IPEDS survey components occurring in a given collection year have been organized into three seasonal collection periods: Fall, Winter, and Spring.

The timing of when data are collected (the “collection year”) is most important for the professionals who report their data to the National Center for Education Statistics (NCES). However, IPEDS data users are generally more interested in the year that is actually reflected in the data (the “data year”). As an example, a data user may ask, “What was happening with students, staff, and institutions in 2018–19?”


Text box that says: The collection year refers to the time period the IPEDS survey data are collected. The data year refers to the time period reflected in the IPEDS survey data.


For data users, knowing the difference between the collection year and the data year is important for working with and understanding IPEDS data. Often, the collection year comes after the data year, as institutions need time to collect the required data and check to make sure they are reporting the data accurately. This lag between the time period reflected by the data and when the data are reported is typically one academic term or year, depending on the survey component. For example, fall 2021 enrollment data are not reported to NCES until spring 2022, and the data would not be publicly released until fall 2022.
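To make the lag concrete, the sketch below (in Python) encodes the fall enrollment timeline described above. The function name and the one-year offsets are illustrative assumptions only, since the lag differs across survey components.

```python
# A minimal sketch of the reporting lag described above, using the fall
# enrollment example from the text. Offsets for other survey components
# differ, so treat these values as illustrative assumptions.

def fall_enrollment_timeline(data_year: int) -> dict:
    """For fall enrollment in a given calendar year, return when the data
    are reported to NCES and when they are first publicly released."""
    return {
        "data_year": f"fall {data_year}",           # term reflected in the data
        "collection": f"spring {data_year + 1}",    # reported to NCES
        "public_release": f"fall {data_year + 1}",  # provisional release
    }

print(fall_enrollment_timeline(2021))
# {'data_year': 'fall 2021', 'collection': 'spring 2022',
#  'public_release': 'fall 2022'}
```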

After the data are collected by NCES, there is an additional period before they are released publicly, during which the data undergo various quality and validity checks. About 9 months after each seasonal collection period (Fall, Winter, Spring) ends, there is a Provisional Data Release, and IPEDS data products (e.g., web tools, data files) are updated with the newly released seasonal data. During this provisional release, institutions may revise their data if they believe the data were reported inaccurately. A Revised/Final Data Release then happens the following year and includes any revisions that were made to the provisional data.

Sound confusing? The data collection and release cycle can be a technical and complex process, and it varies slightly for each of the 12 IPEDS survey components. Luckily, NCES has created a comprehensive resource page that provides information about the IPEDS data collection and release cycles for each survey component as well as key details for data users and data reporters, such as how to account for summer enrollment in the different IPEDS survey components.

Table 1 summarizes the IPEDS 2021–22 data collection and release schedule from the resource page, which also provides the data year and other details for each survey component.


Table 1. IPEDS 2021–22 Data Collection and Release Schedule

Table showing the IPEDS 2021–22 data collection and release schedule


Here are a few examples of how to distinguish the data year from the collection year in different IPEDS data products.

Example 1: IPEDS Trend Generator

Suppose that a data user is interested in how national graduation rates have changed over time. One tool they might use is the IPEDS Trend Generator, a ready-made web tool that allows users to view trends over time in the most frequently asked-about subject areas in postsecondary education. The Graduation Rate chart below displays the data year (shown in green) in the headline and on the x-axis. The “Modify Years” option also allows users to filter by data year. Information about the collection year (shown in gold) can be found in the source notes below the chart.


Image of IPEDS Trend Generator webpage


Example 2: IPEDS Complete Data Files

Imagine that a data user was interested enough in 6-year Graduation Rates that they wanted to run more complex analyses in a statistical program. IPEDS Complete Data Files include all variables for all reporting institutions by survey component and can be downloaded by these users to create their own analytic datasets.

Data users should keep in mind that IPEDS Complete Data Files are organized and released by collection year (shown in gold) rather than data year. Because of this, even though files might share the same collection year, the data years reflected within the files will vary across survey components.
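For users working programmatically, here is a minimal sketch of loading a Complete Data File into pandas. The URL pattern and file name follow the Data Center’s typical naming (survey file code plus collection year) but are assumptions here; confirm the exact file name on the Complete Data Files page.

```python
# A hedged sketch of loading an IPEDS Complete Data File with pandas.
# The URL pattern and file name are assumptions; verify them on the
# IPEDS Complete Data Files page before relying on this.
import pandas as pd

collection_year = 2021  # the year in the file name is the COLLECTION year;
                        # the data year inside may differ by survey component
url = f"https://nces.ed.gov/ipeds/datacenter/data/HD{collection_year}.zip"

# pandas can read a single zipped CSV directly from a URL
df = pd.read_csv(url, compression="zip", encoding="latin-1")
print(df.shape)
```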


Image of IPEDS Complete Data Files webpage


The examples listed above are just a few of many scenarios in which this distinction between collection year and data year is important for analysis and understanding. Knowing about the IPEDS reporting cycle can be extremely useful when it comes to figuring out how to work with IPEDS data. For more examples and additional details on the IPEDS data collection and release cycles for each survey component, please visit the Timing of IPEDS Data Collection, Coverage, and Release Cycle resource page.

Be sure to follow NCES on Twitter, Facebook, LinkedIn, and YouTube, follow IPEDS on Twitter, and subscribe to the NCES News Flash to stay up-to-date on all IPEDS data releases.

 

By Katie Hyland and Roman Ruiz, American Institutes for Research

Measuring Student Safety: New Data on Bullying Rates at School

Bullying remains a serious issue for students and their families, as well as for policymakers, administrators, and educators. NCES is committed to providing reliable and timely data on bullying to measure the extent of the problem and track progress toward reducing its prevalence. To that end, a new set of web tables focusing on bullying rates at school was just released. These tables use data from the School Crime Supplement to the National Crime Victimization Survey, which collects data on bullying by asking a nationally representative sample of students ages 12–18 whether they had been bullied at school. This blog post highlights data from these newly released web tables.

In 2019, about 22 percent of students reported being bullied at school during the school year (figure 1). This percentage was lower than a decade earlier (2009), when 28 percent of students reported being bullied at school.

Students’ reports of being bullied varied based on student and school characteristics in 2019. For instance, a higher percentage of female students than of male students reported being bullied at school during the school year (25 vs. 19 percent). The percentage of students who reported being bullied at school was higher for students of Two or more races (37 percent) than for White students (25 percent) and Black students (22 percent), which were in turn higher than the percentage of Asian students (13 percent). Higher percentages of 6th-, 7th-, and 8th-graders reported being bullied at school (ranging from 27 to 28 percent), compared with 9th-, 10th-, and 12th-graders (ranging from 16 to 19 percent). A higher percentage of students enrolled in schools in rural areas (28 percent) than in schools in other locales (ranging from 21 to 22 percent) reported being bullied at school.
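As a technical aside, percentages like these come from weighted survey microdata rather than raw counts. The sketch below, using entirely fabricated records and weights, shows the basic weighted-percentage calculation; actual SCS estimates use the survey’s person weights and its variance estimation procedures.

```python
# Illustrative only: computing a weighted percentage from survey-style
# microdata. The records and weights below are fabricated; real SCS/NCVS
# estimates use the survey's person weights and variance procedures.
import pandas as pd

df = pd.DataFrame({
    "bullied": [1, 0, 0, 1, 0, 1],                 # 1 = reported being bullied
    "sex":     ["F", "F", "M", "M", "F", "M"],
    "weight":  [1200, 950, 1100, 800, 1050, 990],  # hypothetical person weights
})

df["w_bullied"] = df["bullied"] * df["weight"]
pct = 100 * df.groupby("sex")["w_bullied"].sum() / df.groupby("sex")["weight"].sum()
print(pct.round(1))  # weighted percent bullied, by sex
```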


Figure 1. Percentage of students ages 12–18 who reported being bullied at school during the school year, by selected student and school characteristics: 2019

Horizontal bar chart showing the percentage of students ages 12–18 who reported being bullied at school during the school year in 2019, by selected student characteristics (sex, race/ethnicity, and grade) and school characteristics (locale and control of school)

1 Total includes race categories not separately shown.
2 Race categories exclude persons of Hispanic ethnicity. Data for Pacific Islander and American Indian/Alaska Native students did not meet reporting standards in 2019; therefore, data for these two groups are not shown.
3 Excludes students with missing information about the school characteristic.
NOTE: “At school” includes in the school building, on school property, on a school bus, and going to and from school. Although rounded numbers are displayed, the figures are based on unrounded data.
SOURCE: U.S. Department of Justice, Bureau of Justice Statistics, School Crime Supplement (SCS) to the National Crime Victimization Survey, 2019. See Digest of Education Statistics 2020, table 230.40.


Not all students chose to report the bullying to adults at school. Among students ages 12–18 who reported being bullied at school during the school year in 2019, about 46 percent reported notifying an adult at school about the incident. This percentage was higher for Black students than for White students (61 vs. 47 percent), and both percentages were higher than the percentage for Hispanic students (35 percent).

For more details on these data, see the web tables from “Student Reports of Bullying: Results from the 2019 School Crime Supplement to the National Crime Victimization Survey.” For additional information on this topic, see the Condition of Education indicator Bullying at School and Electronic Bullying. For indicators on other topics related to school crime and safety, select “School Crime and Safety” on the Explore by Indicator Topics page.

 

By Ke Wang, AIR

Rescaled Data Files for Analyses of Trends in Adult Skills

In January 2022, NCES released rescaled data files for three adult literacy assessments conducted in earlier decades: the 1992 National Adult Literacy Survey (NALS), the 1994 International Adult Literacy Survey (IALS), and the 2003 Adult Literacy and Lifeskills Survey (ALL). By connecting the rescaled data from these assessments with data from the current adult literacy assessment, the Program for the International Assessment of Adult Competencies (PIAAC), researchers can examine trends in adult skills in the United States going back to 1992. This blog post traces the history of each of these adult literacy assessments, describes the files and explains what “rescaling” means, and discusses how these files can be used in analyses in conjunction with the PIAAC files. The last section of the post offers several example analyses of the data.

A Brief History of International and National Adult Literacy Assessments Conducted in the United States

The rescaled data files highlighted in this blog post update and combine historical data from national and international adult literacy studies that have been conducted in the United States.

NALS was conducted in 1992 by NCES and assessed U.S. adults in households, as well as adults in prisons. IALS—developed by Statistics Canada and ETS in collaboration with 22 participating countries, including the United States—assessed adults in households and was administered in three waves between 1994 and 1998. ALL was administered in 11 countries, including the United States, and assessed adults in two waves between 2003 and 2008.

PIAAC seeks to ensure continuity with these previous surveys, but it also expands on their quality assurance standards, extends the definitions of literacy and numeracy, and provides more information about adults with low levels of literacy by assessing reading component skills. It also, for the first time, includes a problem-solving domain to emphasize the skills used in digital (originally called “technology-rich”) environments.

How Do the Released Data Files From the Earlier Studies of Adult Skills Relate to PIAAC?

All three of the released restricted-use data files (for NALS, IALS, and ALL) relate to PIAAC, the latest adult skills assessment, in different ways.

The NALS data file contains literacy estimates and background characteristics of U.S. adults in households and in prisons in 1992. It is comparable to the PIAAC data files for 2012/14 and 2017 through rescaling of the assessment scores and matching of the background variables to those of PIAAC.

The IALS and ALL data files contain literacy (IALS and ALL) and numeracy (ALL) estimates and background characteristics of U.S. adults in 1994 (IALS) and 2003 (ALL). Similar to NALS, they are comparable to the PIAAC restricted-use data (2012/14) through rescaling of the literacy and numeracy assessment scores and matching of the background variables to those of PIAAC. These estimates are also comparable to the international estimates of skills of adults in several other countries, including Canada, Hungary, Italy, Norway, the Netherlands, and New Zealand (see the recently released Data Point International Comparisons of Adult Literacy and Numeracy Skills Over Time). While the NCES datasets contain only the U.S. respondents, IALS and ALL are international studies, and the data from other participating countries can be requested from Statistics Canada (see the IALS Data Files/Publications and ALL Data pages for more detail). See the History of International and National Adult Literacy Assessments page for additional background on these studies.

Table 1 provides an overview of the rescaled NALS, IALS, and ALL data files.


Table 1. Overview of the rescaled data files for the National Adult Literacy Survey (NALS), International Adult Literacy Survey (IALS), and Adult Literacy and Lifeskills Survey (ALL) 

Table showing overview of the rescaled data files for the National Adult Literacy Survey (NALS), International Adult Literacy Survey (IALS), and Adult Literacy and Lifeskills Survey


What Does “Rescaled” Mean?

“Rescaling” the literacy (NALS, IALS, ALL) and numeracy (ALL) domains from these three previous studies means that those domains were put on the same scale as the PIAAC domains: updated proficiency estimates were derived using the same statistical models used to create the PIAAC skill proficiencies. Rescaling was possible because PIAAC administered a sufficient number of the same test questions used in NALS, IALS, and ALL.1 These rescaled proficiency estimates allow for trend analysis of adult skills across the time points provided by each study.
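To give a flavor of what putting two assessments on a common scale involves, here is a deliberately simplified mean/sigma linking in Python. This is a schematic stand-in, not the NCES procedure: the actual rescaling relied on IRT models estimated from the common test questions, which is considerably more involved.

```python
# Schematic illustration only: mean/sigma linking of one score scale onto
# another. The actual rescaling used IRT models with items common to the
# assessments; this linear transformation is a teaching simplification.
import numpy as np

rng = np.random.default_rng(0)
old_scores = rng.normal(290, 60, size=5_000)  # hypothetical old-scale scores
target_mean, target_sd = 270, 50              # hypothetical target-scale moments

a = target_sd / old_scores.std()              # slope of the linking line
b = target_mean - a * old_scores.mean()       # intercept
rescaled = a * old_scores + b                 # scores on the target scale

print(round(rescaled.mean()), round(rescaled.std()))  # ~270, ~50
```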

What Can These Different Files Be Used For?

While mixing the national and international trend lines isn’t recommended, both sets of files have their own distinct advantages and purposes for analysis.

National files

The rescaled NALS 1992 files can be used for national trend analyses with the PIAAC national trend points in 2012/2014 and 2017. Some potential analytic uses of the NALS trend files are to

  • Provide a picture of the skills of adults only in the United States;
  • Examine the skills of adults in prison and compare their skills with those of adults in households over time, given that NALS and PIAAC include prison studies conducted in 1992 and 2014, respectively;
  • Conduct analyses on subgroups of the population (such as those ages 16–24 or those with less than a high school education), because the larger NALS sample size allows for more detailed breakdowns when analyzed alongside the U.S. PIAAC sample;
  • Focus on the subgroup of older adults (ages 66–74), given that NALS sampled adults over the age of 65, similar to PIAAC, which sampled adults ages 16–74; and
  • Analyze U.S.-specific background questions (such as those on race/ethnicity or health-related practices).

International files

The rescaled IALS 1994 and ALL 2003 files can be used for international trend analyses among six countries with the U.S. PIAAC international trend point in 2012/2014: Canada, Hungary, Italy, Norway, the Netherlands, and New Zealand. Some potential analytic uses of the IALS and ALL trend files are to

  • Compare literacy proficiency results internationally and over time using the results from IALS, ALL, and PIAAC; and
  • Compare numeracy proficiency results internationally and over time using the results from ALL and PIAAC.

Example Analyses Using the U.S. Trend Data on Adult Literacy

Below are examples of a national trend analysis and an international trend analysis conducted using the rescaled NALS, IALS, and ALL data in conjunction with the PIAAC data.

National trend estimates

The literacy scores of U.S. adults increased from 269 in NALS 1992 to 272 in PIAAC 2012/2014. However, the PIAAC 2017 score of 270 was not significantly different from the 1992 or 2012/2014 scores.
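For readers curious how such trend statements are evaluated, the sketch below runs the usual two-sided z-test for a difference between two independent estimates. The standard errors are invented for illustration; published NCES comparisons account for plausible values and replicate weights.

```python
# A minimal sketch of the significance test behind trend statements like
# the one above. The standard errors are invented; published comparisons
# use plausible values and replicate weights, so this is illustrative only.
from math import sqrt
from scipy.stats import norm

def differs(est1: float, se1: float, est2: float, se2: float,
            alpha: float = 0.05) -> bool:
    """Two-sided z-test for the difference of two independent estimates."""
    z = (est1 - est2) / sqrt(se1**2 + se2**2)
    p = 2 * (1 - norm.cdf(abs(z)))
    return p < alpha

# NALS 1992 vs. PIAAC 2012/2014 literacy means; the SEs here are made up
print(differs(269, 1.0, 272, 1.0))  # True -> significant at p < .05
```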


Figure 1. Literacy scores of U.S. adults (ages 16–65) along national trend line: Selected years, 1992–2017

Line graph showing literacy scores of U.S. adults (ages 16–65) along national trend line for NALS 1992, PIAAC 2012/2014, and PIAAC 2017

* Significantly different (p < .05) from NALS 1992 estimate.
SOURCE: U.S. Department of Education, National Center for Education Statistics, National Adult Literacy Survey (NALS), NALS 1992; and Program for the International Assessment of Adult Competencies (PIAAC), PIAAC 2012–17.


International trend estimates

The literacy scores of U.S. adults decreased from 273 in IALS 1994 to 268 in ALL 2003 before increasing to 272 in PIAAC 2012/2014. However, the PIAAC 2012/2014 score was not significantly different from the IALS 1994 score.


Figure 2. Literacy scores of U.S. adults (ages 16–65) along international trend line: Selected years, 1994–2012/14

Line graph showing literacy scores of U.S. adults (ages 16–65) along international trend line for IALS 1994, ALL 2003, and PIAAC 2012/2014

* Significantly different (p < .05) from IALS 1994 estimate.
SOURCE: U.S. Department of Education, National Center for Education Statistics, Statistics Canada and Organization for Economic Cooperation and Development (OECD), International Adult Literacy Survey (IALS), 1994–98; Adult Literacy and Lifeskills Survey (ALL), 2003–08; and Program for the International Assessment of Adult Competencies (PIAAC), PIAAC 2012/14. See figure 1 in the International Comparisons of Adult Literacy and Numeracy Skills Over Time Data Point.


How to Access the Rescaled Data Files

More complex analyses can be conducted with the NALS, IALS, and ALL rescaled data files. These are restricted-use files, and researchers must obtain a restricted-use license to access them. Further information about these files is available on the PIAAC Data Files page (see the “International Trend Data Files and Data Resources” and “National Trend Data Files and Data Resources” sections at the bottom of the page).

Additional resources:

By Emily Pawlowski, AIR, and Holly Xie, NCES


[1] In contrast, the 2003 National Assessment of Adult Literacy (NAAL), another assessment of adult literacy conducted in the United States, was not rescaled for trend analyses with PIAAC. For various reasons, including the lack of overlap between the NAAL and PIAAC literacy items, NAAL and PIAAC are thought to be the least comparable of the adult literacy assessments.

Public State and Local Education Job Openings, Hires, and Separations for December 2021

The Job Openings and Labor Turnover Survey (JOLTS),1 conducted by the U.S. Bureau of Labor Statistics (BLS), provides monthly estimates of job openings, hires, and total separations (quits, layoffs and discharges, and other separations) for major industry sectors, including education. BLS JOLTS data and other survey data can be used to track the recovery of the labor market since the spring of 2020, when the coronavirus pandemic resulted in job losses on a scale not seen since the Great Depression.2

This analysis is the first in a series of analyses of the public state and local education industry3 during the 2021–22 school year. This industry includes all persons employed by public elementary and secondary school systems and postsecondary institutions, spanning a variety of occupations, such as teachers and instructional aides, administrators and other professional staff, support staff, maintenance personnel, cafeteria workers, and transportation workers.4 The JOLTS data are tabulated at this sector level and do not permit separate detailed analyses at the elementary and secondary level or at the postsecondary level. To put the scope of this group in context, 48 percent of the staff employed by public elementary and secondary school systems were teachers, and 37 percent of full-time-equivalent (FTE) postsecondary staff within public degree-granting institutions were instructional faculty in 2019.5

This snapshot is focused on the December 2021 reporting period. To provide context for this period, estimates will be compared with the previous month’s estimates, as well as with December 2019 (before the pandemic) and December 2020. Subsequent analysis will review the cumulative change from July 2021 through June 2022.
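Readers who want to pull the underlying series themselves can use the BLS public timeseries API. The sketch below shows the request shape only; the series ID is a placeholder, and the actual JOLTS series ID for state and local government education should be looked up on the BLS website.

```python
# A minimal sketch of querying the BLS public API (v2) for JOLTS data.
# The series ID below is a PLACEHOLDER, not a real ID; find the actual
# JOLTS series ID for state and local government education on bls.gov.
import requests

payload = {
    "seriesid": ["JTS_PLACEHOLDER_SERIES_ID"],  # hypothetical; replace before use
    "startyear": "2019",
    "endyear": "2021",
}
resp = requests.post(
    "https://api.bls.gov/publicAPI/v2/timeseries/data/",
    json=payload,
    timeout=30,
)
for series in resp.json()["Results"]["series"]:
    for obs in series["data"]:
        print(obs["year"], obs["periodName"], obs["value"])
```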

Overview of December 2021 Estimates

The number of job openings in public state and local education was 320,000 on the last business day of December 2021, which was higher than in December 2019 or December 2020 (table 1). In percentage terms, 2.9 percent of jobs had openings in December 2021, which was higher than 2.0 percent in December 2019 and 1.9 percent in December 2020. This suggests a greater need for public state and local education employees in December 2021 than in December 2019 or December 2020. Additionally, the number of separations6 (126,000) in December 2021 exceeded the total number of hires (91,000), indicating a net decrease in the number of public state and local education employees from the number in the month before. The number of job openings at the end of December 2021 (320,000) was 3.5 times larger than the number of staff actually hired that month (91,000). This December 2021 ratio of openings to hires was higher than the ratio in December 2020 (2.9) and the ratio in December 2019 (2.6).

Hiring in the education sector is cyclical, following the academic calendar, so patterns differ from month to month.7 November 2021 data are also provided in table 1 to give a sense of the month-to-month change in employment. In November 2021, the number of job openings outpaced the number of hires by 167,000 positions, a ratio of job openings to hires of 2.3.


Table 1. Public state and local education job openings, hires, and separations: 2019, 2020, and 2021

Table showing public state and local education job openings, hires, and separations (layoffs and discharges, other separations, and quits) in 2019, 2020, and 2021

—Not available.
NOTE: Data are not seasonally adjusted. Detail may not sum to totals because of rounding.
SOURCE: U.S. Department of Labor, Bureau of Labor Statistics, Job Openings and Labor Turnover Survey (JOLTS), 2019, 2020, and 2021, based on data downloaded March 15, 2022.


Net Change in Employment

JOLTS data show the relationship between hires and separations throughout business cycles, and net employment changes result from that relationship. When the number of hires exceeds the number of separations, employment rises—even if the number of hires is steady or declining. Conversely, when the number of hires is less than the number of separations, employment declines—even if the number of hires is steady or rising. During the 2021 calendar year, hires for state and local education totaled 2,075,000, and separations were estimated at 1,622,000 (including 1,009,000 quits). Taken together, the public state and local education sector experienced a net employment gain of 453,000 in 2021. In contrast, there was a net employment loss of 787,000 in 2020, resulting from 1,647,000 hires and 2,434,000 separations. These totals include workers who may have been hired and separated more than once during the year. Annual net gains and losses underscore the importance of considering multiple years of data when studying the overall staffing situation in our education system. The net employment gain in 2021 does not erase the larger net loss experienced in 2020.
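The net-change arithmetic above is simple to verify directly; the sketch below reproduces it from the annual totals quoted in the text (in thousands).

```python
# Reproducing the net employment changes described above. Annual totals
# are in thousands, taken directly from the text.
hires       = {2020: 1_647, 2021: 2_075}
separations = {2020: 2_434, 2021: 1_622}

for year in (2020, 2021):
    net = hires[year] - separations[year]
    print(year, f"net change: {net:+,} thousand")
# 2020 net change: -787 thousand
# 2021 net change: +453 thousand
```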

Figure 1 shows the cyclical nature of state and local government education employee job openings, hires, and separations. The percentages in the figure reflect the number of job openings, hires, and separations during the month relative to the total employment in the state and local government education industry. In general, separations and hiring are higher in the summer and lower in the winter. Both trends reflect the school fiscal year (July through June).


Figure 1. Monthly percentage of job openings, hires, and separations for the state and local government education industry: January 2019 to December 2021

Line graph showing monthly percentage of job openings, hires, and separations for the state and local government education industry in January, June, and December 2019, June and December 2020, and June and December 2021

NOTE: Data are not seasonally adjusted.
SOURCE: U.S. Department of Labor, Bureau of Labor Statistics, Job Openings and Labor Turnover Survey (JOLTS), 2019, 2020, and 2021, based on data downloaded March 15, 2022.


Public State and Local Education Job Openings

Figure 2 shows job openings in December 2021 compared with those in December 2019 and December 2020 across different industries. Overall, the total nonfarm job opening rate was 6.4 percent in December 2021, an increase of 2.3 percentage points over the rate in December 2020. The percentage of public state and local education sector jobs with openings was 2.9 percent (320,000) in December 2021, which was higher than both the 2.0 percent (220,000) in December 2019 and the 1.9 percent (194,000) in December 2020. The percentage of public state and local education sector job openings in December 2021 was not measurably different from the percentage in November 2021.


Figure 2. Rate of job openings, by major industry: December 2019, December 2020, and December 2021

Scatter plot showing rate of job openings, by major industry, in December 2019, December 2020, and December 2021

NOTE: Data are not seasonally adjusted.
SOURCE: U.S. Department of Labor, Bureau of Labor Statistics, Job Openings and Labor Turnover Survey (JOLTS), 2019, 2020, and 2021, based on data downloaded March 17, 2022.


Public State and Local Education Hires

Figure 3 shows hires across major industries as a percentage of total employment. Overall, the total nonfarm hire rate was 3.2 percent in December 2021, which was 0.3 percentage points higher than the rate in December 2020. The percentage of public state and local education sector hires was 0.9 percent (91,000) in December 2021, which was not measurably different from the number or rate in either December 2019 or December 2020. The percentage of public state and local education sector hires in December 2021 was lower than the 1.2 percent in November 2021 (133,000).

The gaps between hires and job openings in the public education sector were larger in December 2021 than in 2019 or 2020, due to a larger number of openings in December 2021. In December 2021, the gap between the rates of job openings and hires in education was 2.0 percentage points, compared with 1.2 percentage points in both December 2019 and December 2020.


Figure 3. Rate of hires, by major industry: December 2019, December 2020, and December 2021

Scatter plot showing rate of hires, by major industry, in December 2019, December 2020, and December 2021

NOTE: Data are not seasonally adjusted.
SOURCE: U.S. Department of Labor, Bureau of Labor Statistics, Job Openings and Labor Turnover Survey (JOLTS), 2019, 2020, and 2021, based on data downloaded March 17, 2022.


Public State and Local Education Total Separations

Total separations include quits, layoffs and discharges, and other separations. Quits are generally voluntary separations initiated by the employee. Therefore, the quit rate can serve as a measure of workers’ willingness or ability to leave jobs. Layoffs and discharges are involuntary separations initiated by the employer. The other separations category includes separations due to retirement, death, disability, and transfers to other locations of the same firm.

Total separations for the public state and local education industry were 126,000, or 1.2 percent, in December 2021 (figure 4). Quits accounted for 59 percent of all separations for state and local education employees in December 2021. The quit rate was 0.7 percent for December 2021, which was about 0.2 percentage points higher than in December 2020, but not measurably different from the rate in December 2019. Quit rates for public state and local education employees were consistently lower than for private sector employees.8 For example, in December 2021 the total private sector quit rate was 2.8 percent.


Figure 4. Rate of total separations, by major industry: December 2019, December 2020, and December 2021

Scatter plot showing rate of total separations, by major industry, in December 2019, December 2020, and December 2021

NOTE: Data are not seasonally adjusted.
SOURCE: U.S. Department of Labor, Bureau of Labor Statistics, Job Openings and Labor Turnover Survey (JOLTS), 2019, 2020, and 2021, based on data downloaded March 17, 2022.


Taken together, the data show that in recent years there generally have been fewer separations in the public education industry compared with other industries. The December 2021 separation rate for state and local education employees of 1.2 percent was higher than the November 2021 separation rate of 0.9 percent. Nevertheless, the separation rate for the state and local education industry was lower than for all other industries in December 2021.

At 2.9 percent, state and local education had the lowest percentage of jobs with openings of any industry in December 2021. However, that does not mean that staffing shortages were not a factor in the state and local education industry (figure 5). The ratio of job openings to hires for state and local education (3.5) in December 2021 was well above the average for all industries (2.1), indicating a high demand for employees in this industry and relative difficulty in filling available slots. The only industries with higher openings-to-hires ratios were the federal government (3.9) and state and local government, excluding education (5.6).


Figure 5. Ratio of job openings to hires, by major industry: December 2021

Horizontal bar chart showing ratio of job openings to hires, by major industry, in December 2021

NOTE: Data are not seasonally adjusted.
SOURCE: U.S. Department of Labor, Bureau of Labor Statistics, Job Openings and Labor Turnover Survey (JOLTS), 2021, based on data downloaded March 17, 2022.


To understand the cumulative employment situation at the end of the school year, we intend to update our analyses as those data become available.

Learn more about JOLTS and access additional data on job openings, hires, and separations. Be sure to follow NCES on Twitter, Facebook, LinkedIn, and YouTube to stay informed.

 

By Josue DeLaRosa, NCES


[1] For a discussion on the reliability of the estimates, please see Job Openings and Labor Turnover Technical Note - 2022 M01 Results (bls.gov).

[2] U.S. Department of Labor, Bureau of Labor Statistics, “How Did Employment Change During the COVID-19 pandemic? Evidence From a New BLS Survey Supplement,” downloaded March 18, 2022, from https://www.bls.gov/opub/btn/volume-11/how-did-employment-change-during-the-covid-19-pandemic.htm; and “As the COVID-19 Pandemic Affects the Nation, Hires and Turnover Reach Record Highs in 2020,” downloaded March 18, 2022, from https://www.bls.gov/opub/mlr/2021/article/as-the-covid-19-pandemic-affects-the-nation-hires-and-turnover-reach-record-highs-in-2020.htm.

[3] JOLTS refers to this industry as state and local government education and uses ID 92.

[4] JOLTS does not collect occupation data.

[5] U.S. Department of Education, National Center for Education Statistics, Digest of Education Statistics, table 213.10, downloaded March 30, 2022, from https://nces.ed.gov/programs/digest/d21/tables/dt21_213.10.asp?current=yes, and table 314.10, downloaded March 30, 2022, from https://nces.ed.gov/programs/digest/d20/tables/dt20_314.10.asp?current=yes.

[6] Separations include all separations from the payroll during the entire reference month and are reported by type of separation: quits, layoffs and discharges, and other separations.

[7] Engel, M. (2012). The Timing of Teacher Hires and Teacher Qualifications: Is There an Association? Teachers College Record, 114(12): 1–29.

[8] The private sector includes all nonfarm employees except federal employment and state and local government employment.