IES Blog

Institute of Education Sciences

Rescaled Data Files for Analyses of Trends in Adult Skills

In January 2022, NCES released the rescaled data files for three adult literacy assessments conducted several decades earlier: the 1992 National Adult Literacy Survey (NALS), the 1994 International Adult Literacy Survey (IALS), and the 2003 Adult Literacy and Lifeskills Survey (ALL). By connecting the rescaled data from these assessments with data from the current adult literacy assessment, the Program for the International Assessment of Adult Competencies (PIAAC), researchers can examine trends in adult skills in the United States going back to 1992. This blog post traces the history of each of these adult literacy assessments, describes the files and explains what “rescaling” means, and discusses how these files can be used in analyses in conjunction with the PIAAC files. The last section of the post offers several example analyses of the data.

A Brief History of International and National Adult Literacy Assessments Conducted in the United States

The rescaled data files highlighted in this blog post update and combine historical data from national and international adult literacy studies that have been conducted in the United States.

NALS was conducted in 1992 by NCES and assessed U.S. adults in households, as well as adults in prisons. IALS—developed by Statistics Canada and ETS in collaboration with 22 participating countries, including the United States—assessed adults in households and was administered in three waves between 1994 and 1998. ALL was administered in 11 countries, including the United States, and assessed adults in two waves between 2003 and 2008.

PIAAC seeks to ensure continuity with these previous surveys, but it also expands on their quality assurance standards, extends the definitions of literacy and numeracy, and provides more information about adults with low levels of literacy by assessing reading component skills. It also, for the first time, includes a problem-solving domain to emphasize the skills used in digital (originally called “technology-rich”) environments.

How Do the Released Data Files From the Earlier Studies of Adult Skills Relate to PIAAC?

All three of the released restricted-use data files (for NALS, IALS, and ALL) relate to PIAAC, the latest adult skills assessment, in different ways.

The NALS data file contains literacy estimates and background characteristics of U.S. adults in households and in prisons in 1992. It is comparable to the PIAAC data files for 2012/14 and 2017 through rescaling of the assessment scores and matching of the background variables to those of PIAAC.

The IALS and ALL data files contain literacy (IALS and ALL) and numeracy (ALL) estimates and background characteristics of U.S. adults in 1994 (IALS) and 2003 (ALL). Similar to NALS, they are comparable to the PIAAC restricted-use data (2012/14) through rescaling of the literacy and numeracy assessment scores and matching of the background variables to those of PIAAC. These estimates are also comparable to the international estimates of the skills of adults in several other countries, including Canada, Hungary, Italy, Norway, the Netherlands, and New Zealand (see the recently released Data Point International Comparisons of Adult Literacy and Numeracy Skills Over Time). While the NCES datasets contain only the U.S. respondents, IALS and ALL are international studies, and the data from other participating countries can be requested from Statistics Canada (see the IALS Data Files/Publications and ALL Data pages for more detail). See the History of International and National Adult Literacy Assessments page for additional background on these studies.

Table 1 provides an overview of the rescaled NALS, IALS, and ALL data files.


Table 1. Overview of the rescaled data files for the National Adult Literacy Survey (NALS), International Adult Literacy Survey (IALS), and Adult Literacy and Lifeskills Survey (ALL) 

Table showing overview of the rescaled data files for the National Adult Literacy Survey (NALS), International Adult Literacy Survey (IALS), and Adult Literacy and Lifeskills Survey


What Does “Rescaled” Mean?

“Rescaling” the literacy (NALS, IALS, ALL) and numeracy (ALL) domains from these three previous studies means that those domains were put on the same scale as the PIAAC domains by deriving updated proficiency estimates with the same statistical models used to create the PIAAC skills proficiencies. Rescaling was possible because PIAAC administered a sufficient number of the same test questions used in NALS, IALS, and ALL.1 These rescaled proficiency estimates allow for trend analysis of adult skills across the time points provided by each study.
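The statistical details of the linking are beyond the scope of this post. As a rough intuition for what putting two assessments on a common scale involves, the sketch below shows a much simpler common-item (mean/sigma) linear linking with made-up numbers; it illustrates the general idea only and is not the IRT-based procedure used to produce the rescaled files.

```python
import statistics

# Hypothetical difficulty estimates for the SAME set of linking items,
# expressed once on the older assessment's scale and once on the PIAAC scale.
old_scale = [245.0, 260.0, 278.0, 301.0, 322.0]
piaac_scale = [250.0, 266.0, 281.0, 305.0, 330.0]

# Mean/sigma linking: find the linear transformation that aligns the two
# sets of common-item estimates, then apply it to any score on the old scale.
slope = statistics.stdev(piaac_scale) / statistics.stdev(old_scale)
intercept = statistics.mean(piaac_scale) - slope * statistics.mean(old_scale)

def to_piaac_scale(old_score: float) -> float:
    """Express a score from the older assessment on the PIAAC scale."""
    return slope * old_score + intercept

print(round(to_piaac_scale(275.0), 1))  # a hypothetical score mapped onto the PIAAC scale
```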

What Can These Different Files Be Used For?

While mixing the national and international trend lines isn’t recommended, both sets of files have their own distinct advantages and purposes for analysis.

National files

The rescaled NALS 1992 files can be used for national trend analyses with the PIAAC national trend points in 2012/2014 and 2017. Some potential analytic uses of the NALS trend files are to

  • Provide a picture of the skills of adults only in the United States;
  • Examine the skills of adults in prison and compare their skills with those of adults in households over time, given that NALS and PIAAC include prison studies conducted in 1992 and 2014, respectively;
  • Conduct analyses on subgroups of the population (such as those ages 16–24 or those with less than a high school education) because the larger NALS sample size allows for more detailed breakdowns when used alongside the U.S. PIAAC sample;
  • Focus on the subgroup of older adults (ages 66–74), given that NALS sampled adults over the age of 65, similar to PIAAC, which sampled adults ages 16–74; and
  • Analyze U.S.-specific background questions (such as those on race/ethnicity or health-related practices).

International files

The rescaled IALS 1994 and ALL 2003 files can be used for international trend analyses among six countries with the U.S. PIAAC international trend point in 2012/2014: Canada, Hungary, Italy, Norway, the Netherlands, and New Zealand. Some potential analytic uses of the IALS and ALL trend files are to

  • Compare literacy proficiency results internationally and over time using the results from IALS, ALL, and PIAAC; and
  • Compare numeracy proficiency results internationally and over time using the results from ALL and PIAAC.

Example Analyses Using the U.S. Trend Data on Adult Literacy

Below are examples of a national trend analysis and an international trend analysis conducted using the rescaled NALS, IALS, and ALL data in conjunction with the PIAAC data.

National trend estimates

The literacy scores of U.S. adults increased from 269 in NALS 1992 to 272 in PIAAC 2012/2014. However, the PIAAC 2017 score of 270 was not significantly different from the 1992 or 2012/2014 scores.


Figure 1. Literacy scores of U.S. adults (ages 16–65) along national trend line: Selected years, 1992–2017

Line graph showing literacy scores of U.S. adults (ages 16–65) along national trend line for NALS 1992, PIAAC 2012/2014, and PIAAC 2017

* Significantly different (p < .05) from NALS 1992 estimate.
SOURCE: U.S. Department of Education, National Center for Education Statistics, National Adult Literacy Survey (NALS), NALS 1992; and Program for the International Assessment of Adult Competencies (PIAAC), PIAAC 2012–17.


International trend estimates

The literacy scores of U.S. adults decreased from 273 in IALS 1994 to 268 in ALL 2003 before increasing to 272 in PIAAC 2012/2014. However, the PIAAC 2012/2014 score was not significantly different from the IALS 1994 score.


Figure 2. Literacy scores of U.S. adults (ages 16–65) along international trend line: Selected years, 1994–2012/14

Line graph showing literacy scores of U.S. adults (ages 16–65) along international trend line for IALS 1994, ALL 2003, and PIAAC 2012/2014

* Significantly different (p < .05) from IALS 1994 estimate.
SOURCE: U.S. Department of Education, National Center for Education Statistics, Statistics Canada and Organization for Economic Cooperation and Development (OECD), International Adult Literacy Survey (IALS), 1994–98; Adult Literacy and Lifeskills Survey (ALL), 2003–08; and Program for the International Assessment of Adult Competencies (PIAAC), PIAAC 2012/14. See figure 1 in the International Comparisons of Adult Literacy and Numeracy Skills Over Time Data Point.


How to Access the Rescaled Data Files

More complex analyses can be conducted with the NALS, IALS, and ALL rescaled data files. These are restricted-use files and researchers must obtain a restricted-use license to access them. Further information about these files is available on the PIAAC Data Files page (see the “International Trend Data Files and Data Resources” and “National Trend Data Files and Data Resources” sections at the bottom of the page).


By Emily Pawlowski, AIR, and Holly Xie, NCES


[1] In contrast, the 2003 National Assessment of Adult Literacy (NAAL), another assessment of adult literacy conducted in the United States, was not rescaled for trend analyses with PIAAC. For various reasons, including the lack of overlap between the NAAL and PIAAC literacy items, NAAL and PIAAC are thought to be the least comparable of the adult literacy assessments.

Public State and Local Education Job Openings, Hires, and Separations for December 2021

The Job Openings and Labor Turnover Survey (JOLTS),1 conducted by the U.S. Bureau of Labor Statistics (BLS), provides monthly estimates of job openings, hires, and total separations (quits, layoffs and discharges, and other separations) for major industry sectors, including education. BLS JOLTS data and other survey data can be used to track the recovery of the labor market since the spring of 2020, when the coronavirus pandemic resulted in job losses on a scale not seen since the Great Depression.2

This analysis is the first in a series of analyses of the public state and local education industry3 during the 2021–22 school year. This industry includes all persons employed by public elementary and secondary school systems and postsecondary institutions, including a variety of occupations, such as teachers and instructional aides, administrators and other professional staff, support staff, maintenance personnel, cafeteria workers, and transportation workers.4 The JOLTS data are tabulated at this sector level and do not permit separate detailed analyses at the elementary and secondary level or at the postsecondary level. To put the scope of this group in context, 48 percent of the staff employed by public elementary and secondary school systems were teachers, and 37 percent of full-time-equivalent (FTE) postsecondary staff within public degree-granting institutions were instructional faculty in 2019.5

This snapshot is focused on the December 2021 reporting period. To provide context for this period, estimates will be compared with the previous month’s estimates, as well as with December 2019 (before the pandemic) and December 2020. A subsequent analysis will review the cumulative change from July 2021 through June 2022.

Overview of December 2021 Estimates

The number of job openings in public state and local education was 320,000 on the last business day of December 2021, which was higher than in December 2019 or December 2020 (table 1). In percentage terms, 2.9 percent of jobs had openings in December 2021, which was higher than the 2.0 percent in December 2019 and the 1.9 percent in December 2020. This suggests a greater need for public state and local education employees in December 2021 than in December 2019 or December 2020. Additionally, the number of separations6 (126,000) in December 2021 exceeded the total number of hires (91,000), indicating a net decrease in the number of public state and local education employees from the month before. The number of job openings at the end of December 2021 (320,000) was 3.5 times the number of staff actually hired that month (91,000). This December 2021 ratio of openings to hires was higher than the ratio in December 2020 (2.9) and the ratio in December 2019 (2.6).
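The openings-to-hires ratio cited above is a direct division of the two monthly counts; a minimal sketch using the December 2021 figures quoted in this paragraph:

```python
# Ratio of job openings to hires, December 2021 figures from the text above.
job_openings = 320_000  # openings on the last business day of the month
hires = 91_000          # hires during the month

print(round(job_openings / hires, 1))  # 3.5, versus 2.9 in Dec 2020 and 2.6 in Dec 2019
```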

Hiring in the education sector happens on a cyclical basis with the academic calendar, meaning that patterns will differ between months.7 November 2021 data are also provided in table 1 to provide a sense of the month-to-month change in employment data. In November 2021, the number of job openings outpaced the number of hires by a margin of 167,000 positions, representing a ratio of job openings to hires of 2.3. 


Table 1. Public state and local education job openings, hires, and separations: 2019, 2020, and 2021

Table showing public state and local education job openings, hires, and separations (layoffs and discharges, other separations, and quits) in 2019, 2020, and 2021

---Not available.
NOTE: Data are not seasonally adjusted. Detail may not sum to totals because of rounding.
SOURCE: U.S. Department of Labor, Bureau of Labor Statistics, Job Openings and Labor Turnover Survey (JOLTS), 2019, 2020, and 2021, based on data downloaded March 15, 2022.


Net Change in Employment

JOLTS data show the relationship between hires and separations throughout business cycles, and net employment change results from that relationship. When the number of hires exceeds the number of separations, employment rises—even if the number of hires is steady or declining. Conversely, when the number of hires is less than the number of separations, employment declines—even if the number of hires is steady or rising. During the 2021 calendar year, hires for state and local education totaled 2,075,000. The number of separations was estimated at 1,622,000 (including 1,009,000 quits). Taken together, the public state and local education sector in 2021 experienced a net employment gain of 453,000. In contrast, there was a net employment loss of 787,000 in 2020, resulting from 1,647,000 hires and 2,434,000 separations. These totals include workers who may have been hired and separated more than once during the year. Annual net gains and losses underscore the importance of considering multiple years of data when studying the overall staffing situation in the education system. The net employment gain in 2021 does not erase the larger net loss experienced in 2020.
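Because net employment change is simply hires minus separations, the annual figures above can be checked directly; a minimal sketch using the totals quoted in this paragraph:

```python
# Net employment change = hires - separations, per calendar year.
annual_totals = {
    2020: {"hires": 1_647_000, "separations": 2_434_000},
    2021: {"hires": 2_075_000, "separations": 1_622_000},
}

for year, totals in annual_totals.items():
    net_change = totals["hires"] - totals["separations"]
    print(f"{year}: {net_change:+,}")  # 2020: -787,000; 2021: +453,000
```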

Figure 1 shows the cyclical nature of state and local government education employee job openings, hires, and separations. The percentages in the figure reflect the number of job openings, hires, and separations during the month relative to the total employment in the state and local government education industry. In general, separations and hiring are higher in the summer and lower in the winter. Both trends reflect the school fiscal year (July through June).


Figure 1. Monthly percentage of job openings, hires, and separations for the state and local government education industry: January 2019 to December 2021

Line graph showing monthly percentage of job openings, hires, and separations for the state and local government education industry in January, June, and December 2019, June and December 2020, and June and December 2021

NOTE: Data are not seasonally adjusted.
SOURCE: U.S. Department of Labor, Bureau of Labor Statistics, Job Openings and Labor Turnover Survey (JOLTS), 2019, 2020, and 2021, based on data downloaded March 15, 2022.


Public State and Local Education Job Openings

Figure 2 shows the job openings in December 2021 compared with those in December 2019 and December 2020 across different industries. Overall, the total nonfarm job openings rate was 6.4 percent in December 2021, an increase of 2.3 percentage points over the rate in December 2020. The percentage of public state and local education sector jobs with openings was 2.9 percent (320,000) in December 2021, which was higher than the 2.0 percent (220,000) in December 2019 and the 1.9 percent (194,000) in December 2020. The percentage of public state and local education sector job openings in December 2021 was not measurably different from the percentage in November 2021.


Figure 2. Rate of job openings, by major industry: December 2019, December 2020, and December 2021

Scatter plot showing rate of job openings, by major industry, in December 2019, December 2020, and December 2021

NOTE: Data are not seasonally adjusted.
SOURCE: U.S. Department of Labor, Bureau of Labor Statistics, Job Openings and Labor Turnover Survey (JOLTS), 2019, 2020, and 2021, based on data downloaded March 17, 2022.


Public State and Local Education Hires

Figure 3 shows hires across major industries as a percentage of total employment. Overall, the total nonfarm hire rate was 3.2 percent in December 2021, which was 0.3 percentage points higher than the rate in December 2020. The percentage of public state and local education sector hires was 0.9 percent (91,000) in December 2021, which was not measurably different from the number or rate in either December 2019 or December 2020. The percentage of public state and local education sector hires in December 2021 was lower than the 1.2 percent in November 2021 (133,000).

The gaps between hires and job openings in the public education sector were larger in December 2021 than in December 2019 or December 2020, due to a larger number of openings in December 2021. In December 2021, the gap between the rates of job openings and hires in education was 2.0 percentage points, compared with 1.2 percentage points in both December 2019 and December 2020.


Figure 3. Rate of hires, by major industry: December 2019, December 2020, and December 2021

Scatter plot showing rate of hires, by major industry, in December 2019, December 2020, and December 2021

NOTE: Data are not seasonally adjusted.
SOURCE: U.S. Department of Labor, Bureau of Labor Statistics, Job Openings and Labor Turnover Survey (JOLTS), 2019, 2020, and 2021, based on data downloaded March 17, 2022.


Public State and Local Education Total Separations

Total separations include quits, layoffs and discharges, and other separations. Quits are generally voluntary separations initiated by the employee. Therefore, the quit rate can serve as a measure of workers’ willingness or ability to leave jobs. Layoffs and discharges are involuntary separations initiated by the employer. The other separations category includes separations due to retirement, death, disability, and transfers to other locations of the same firm.

Total separations for the public state and local education industry were 126,000, or 1.2 percent, in December 2021 (figure 4). Quits accounted for 59 percent of all separations for state and local education employees in December 2021. The quit rate was 0.7 percent for December 2021, which was about 0.2 percentage points higher than in December 2020, but not measurably different from the rate in December 2019. Quit rates for public state and local education employees were consistently lower than for private sector employees.8 For example, in December 2021 the total private sector quit rate was 2.8 percent.


Figure 4. Rate of total separations, by major industry: December 2019, December 2020, and December 2021

Scatter plot showing rate of total separations, by major industry, in December 2019, December 2020, and December 2021

NOTE: Data are not seasonally adjusted.
SOURCE: U.S. Department of Labor, Bureau of Labor Statistics, Job Openings and Labor Turnover Survey (JOLTS), 2019, 2020, and 2021, based on data downloaded March 17, 2022.


Taken together, the data show that in recent years the separation rate in the public education industry has generally been lower than in other industries. The December 2021 separation rate for state and local education employees of 1.2 percent was higher than the November 2021 separation rate of 0.9 percent. Nevertheless, the separation rate for the state and local education industry was lower than that for all other industries in December 2021.

At 2.9 percent, state and local education had the lowest percentage of jobs with openings in December 2021. However, that does not mean that staffing shortages were not a factor in the state and local education industry (figure 5). The ratio of job openings to hires for state and local education (3.5) in December 2021 was well above the average for all industries (2.1), indicating a high demand for employees in this industry and the relative difficulty of filling available positions. The only industries with higher openings-to-hires ratios were the federal government (3.9) and state and local government, excluding education (5.6). Thus, while the openings-to-hires ratio for the state and local education industry was relatively high, it was still lower than the ratios for the federal government and for state and local government, excluding education.


Figure 5. Ratio of job openings to hires, by major industry: December 2021

Horizontal bar chart showing ratio of job openings to hires, by major industry, in December 2021

NOTE: Data are not seasonally adjusted.
SOURCE: U.S. Department of Labor, Bureau of Labor Statistics, Job Openings and Labor Turnover Survey (JOLTS), 2021, based on data downloaded March 17, 2022.


To understand the cumulative status of the employment situation at the end of the school year, we intend to provide an update of our analyses as these data become available.

Learn more about JOLTS and access additional data on job openings, hires, and separations. Be sure to follow NCES on Twitter, Facebook, LinkedIn, and YouTube to stay informed.

 

By Josue DeLaRosa, NCES


[1] For a discussion on the reliability of the estimates, please see Job Openings and Labor Turnover Technical Note - 2022 M01 Results (bls.gov).

[2] U.S. Department of Labor, Bureau of Labor Statistics, “How Did Employment Change During the COVID-19 pandemic? Evidence From a New BLS Survey Supplement,” downloaded March 18, 2022, from https://www.bls.gov/opub/btn/volume-11/how-did-employment-change-during-the-covid-19-pandemic.htm; and “As the COVID-19 Pandemic Affects the Nation, Hires and Turnover Reach Record Highs in 2020,” downloaded March 18, 2022, from https://www.bls.gov/opub/mlr/2021/article/as-the-covid-19-pandemic-affects-the-nation-hires-and-turnover-reach-record-highs-in-2020.htm.

[3] JOLTS refers to this industry as state and local government education and uses ID 92.

[4] JOLTS does not collect occupation data.

[5] U.S. Department of Education, National Center for Education Statistics, Digest of Education Statistics, table 213.10, downloaded March 30, 2022, from https://nces.ed.gov/programs/digest/d21/tables/dt21_213.10.asp?current=yes, and table 314.10, downloaded March 30, 2022, from https://nces.ed.gov/programs/digest/d20/tables/dt20_314.10.asp?current=yes.

[6] Separations include all separations from the payroll during the entire reference month and are reported by type of separation: quits, layoffs and discharges, and other separations.

[7] Engel, M. (2012). The Timing of Teacher Hires and Teacher Qualifications: Is There an Association? Teachers College Record, 114(12): 1–29.

[8] The private sector includes all nonfarm employees except federal employment and state and local government employment.                                                           

Public Charter School Expenditures by School Level

How do we achieve the best education results for the best price? This is a central question among researchers and policymakers alike. In this blog post, we share findings from school year 2017–18 on public charter school spending at the elementary, middle, and high school levels to help inform the discussion of charter school costs and benefits to the broader education system.

The first modern charter school law in the United States was passed in Minnesota in 1991. Since that time, charter schools have grown tremendously as an option in public elementary and secondary education. In 2017–18, the United States had 7,086 public charter schools in 44 states and the District of Columbia. Over the decade from 2007–08 to 2017–18, the number of public charter schools in the United States increased by more than 70 percent, and charter schools represented a little more than 7 percent of all public schools at the end of this period (figure 1).


Figure 1. Number of public charter schools in the United States: School years 2007–08 through 2017–18

Line graph showing the number of public charter schools in the United States for school years 2007–08 through 2017–18

NOTE: These data include counts of operational public elementary/secondary charter schools for the 50 states and the District of Columbia.
SOURCE: U.S. Department of Education, National Center for Education Statistics, Common Core of Data (CCD), 2007–08 through 2017–18.


Nearly half (47 percent) of all public charter schools in the United States are classified as elementary schools, 11 percent are classified as middle schools, and 28 percent are classified as high schools (figure 2). The remainder (14 percent) have other grade-level configurations and do not fall into any of these categories.


Figure 2. Percentage of public charter schools in the United States, by school level: School year 2017–18

Pie chart showing percentage of public charter schools in the United States, by school level (elementary, middle, high, and other) for school year 2017–18

NOTE: These data reflect operational public elementary/secondary charter schools for the 50 states and the District of Columbia from the Common Core of Data (CCD) for 2017. School-level categories are taken from the Documentation to NCES’ Common Core of Data for school year 2017–18, whereby “Elementary” includes schools with students enrolled in grades K–4 that offer more elementary grades than middle grades; “Middle” includes schools with students enrolled in grades 5–8 that offer more middle grades than elementary or secondary grades; “High” includes schools with students enrolled in grade 12 and other secondary grades that offer more high grades than middle grades; and “Other” includes schools with both elementary and high grades or grades at all three levels (elementary, middle, and high). Excludes 2,360 schools categorized in the CCD as adult education, not applicable, not reported, prekindergarten-only, secondary, and ungraded.
SOURCE: U.S. Department of Education, National Center for Education Statistics, Civil Rights Data Collection (CRDC), 2017–18.


According to expenditure data captured in the Civil Rights Data Collection (CRDC), public schools in the United States spent $330.94 billion in 2017–18, or more than $6,600 per pupil. Reports of national school expenditures based on data from the CRDC are significantly lower than those estimated using the National Public Education Financial Survey (NPEFS) from the Common Core of Data (CCD). This could be attributed to data on spending for school nutrition, operations and maintenance, and transportation being captured in the NPEFS but not collected in the CRDC. However, the CRDC data allow for comparisons of public charter and noncharter schools at the school level. In 2017–18, public noncharter schools spent about $6,500 per pupil, just under the national average. Like other schools in the U.S. public school system, charter schools do not charge tuition and instead receive district and state funding based on their enrollment. Public charter schools spent more than $26.83 billion in 2017–18, or just more than $8,900 per pupil, thus exceeding the national average.
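Per pupil spending is total expenditures divided by the number of pupils enrolled. The CRDC enrollment counts are not reproduced in this post, but the sketch below back-calculates the approximate enrollments implied by the totals and per pupil amounts quoted above:

```python
# Per pupil spending = total expenditures / pupils, so pupils ~= total / per pupil.
# Approximate back-calculation from the figures cited above (2017-18, CRDC).
all_public_total = 330.94e9     # dollars, all public schools
all_public_per_pupil = 6_600    # "more than $6,600 per pupil"

charter_total = 26.83e9         # dollars, public charter schools
charter_per_pupil = 8_900       # "just more than $8,900 per pupil"

print(f"Implied public school enrollment:  ~{all_public_total / all_public_per_pupil:,.0f}")
print(f"Implied charter school enrollment: ~{charter_total / charter_per_pupil:,.0f}")
# Roughly 50 million pupils overall and about 3 million in charter schools.
```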

The per pupil school expenditures of public charter schools across school levels1 are different from those of public noncharter schools. This analysis compares spending between public elementary, middle, and high schools in 2017–18. (Mixed-level and other schools are excluded because they have variable grade levels and other characteristics that can make expenditures incomparable across school types.) Across school levels, per pupil expenditures among public charter schools exceeded the national average, while per pupil expenditures among public noncharter schools were closer to the national average. Specifically, for public charter schools, per pupil expenditures were highest for elementary schools ($8,400), followed by high schools ($8,200) and middle schools ($8,100) (figure 3). However, for public noncharter schools, per pupil expenditures were highest for high schools ($6,600), followed by elementary schools ($6,400) and middle schools ($6,100).


Figure 3. Per pupil public school expenditures, by public charter school status and school level: School year 2017–18  

Horizontal stacked bar chart showing per pupil public school expenditures, by public charter school status and school level, for school year 2017–18

 

NOTE: Rounded to nearest multiple of 100. Analytical universe restricted to charter schools in both the CRDC and CCD that could be linked or matched using unique identification numbers.
SOURCE: U.S. Department of Education, National Center for Education Statistics, Civil Rights Data Collection (CRDC), 2017–18.


The CRDC splits school expenditures into personnel or staff expenditures (e.g., salaries of teachers and of instructional, support, and administrative staff) and nonpersonnel expenditures (e.g., the cost of books, computers, instructional supplies, and professional development for teachers). (Nonpersonnel expenditures do not include those for school nutrition, operations, maintenance, or transportation to and from school.) Figures 4 and 5 show that across school levels in 2017–18, both public charter and noncharter schools tended to spend more per pupil on salaries and less per pupil on nonpersonnel expenditures. The differences between public charter and noncharter schools are particularly noticeable in comparisons of nonpersonnel expenditures, where charter schools spent considerably more per pupil than noncharter schools, most prominently at the elementary school level ($3,400 vs. $800). The figures also show that among public charter schools, middle schools had higher salary expenditures but lower nonpersonnel expenditures than did elementary or high schools. These findings demonstrate the importance of considering school level when examining public charter school spending.


Figure 4. Per pupil public school salary expenditures, by public charter school status and school level: School year 2017–18

Horizontal stacked bar chart showing per pupil public school salary expenditures, by public charter school status and school level, for school year 2017–18

NOTE: Rounded to nearest multiple of 100. Analytical universe restricted to charter schools in both the CRDC and CCD that could be linked or matched using unique identification numbers.
SOURCE: U.S. Department of Education, National Center for Education Statistics, Civil Rights Data Collection (CRDC), 2017–18.


Figure 5. Per pupil public school nonpersonnel expenditures, by public charter school status and school level: School year 2017–18

Horizontal stacked bar chart showing per pupil public school nonpersonnel expenditures, by public charter school status and school level, for school year 2017–18

NOTE: Rounded to nearest multiple of 100. Analytical universe restricted to charter schools in both the CRDC and CCD that could be linked or matched using unique identification numbers.
SOURCE: U.S. Department of Education, National Center for Education Statistics, Civil Rights Data Collection (CRDC), 2017–18.


Thoughts for Future Research

Since 2009, the CRDC—a mandatory data collection—has collected school expenditure data from elementary and secondary public schools and school districts. The 2017–18 findings suggest that public charter schools spent nearly 200 to 300 percent more on nonpersonnel expenditures per pupil than did public noncharter schools. However, there are concerns about districts’ ability to accurately report school expenditure data, including those for public charter schools. While the CRDC is currently the only complete national database of school-level spending, the CCD has partial school-level fiscal data for about 30 states, and NCES is making an effort to increase this voluntary reporting. Future studies could include a more targeted analysis of spending among public charter schools by geographic settings, student enrollee characteristics, school size, and school type.

Civil Rights Data Collection

Since 1968, the U.S. Department of Education has collected data on key education and civil rights issues in our nation’s public schools. The CRDC collects a variety of information, including data on student enrollment and educational programs and services, most of which is disaggregated by race/ethnicity, sex, limited English proficiency, and disability. The CRDC informs the Office for Civil Rights’ overall strategy for administering and enforcing the civil rights statutes for which it is responsible. The CRDC collects data only from public schools (i.e., no data are collected from private schools). The CRDC data files can be found here: https://ocrdata.ed.gov/.

 

By Jennifer Hudson, Ph.D., and Jennifer Sable (AIR) and Christopher D. Hill, Ph.D. (NCES)


[1] For the purposes of this blog post, school-level categories are taken from the Documentation to NCES’ Common Core of Data for SY 2017–18: “Elementary” includes schools with students enrolled in grades K through 4 that offer more elementary grades than middle grades. “Middle” includes schools with students enrolled in grades 5 through 8 that offer more middle grades than elementary or secondary grades. “High” includes schools with students enrolled in grade 12 and other secondary grades that offer more high grades than middle grades. “Other” includes schools with both elementary and high grades or grades at all three levels (elementary, middle, and high).

Summer Learning During the COVID-19 Pandemic

As the school year comes to a close, many families are considering opportunities to continue learning over the summer months. Summer learning has often been seen as a way to supplement instruction during the regular school year. The U.S. Department of Education’s “COVID-19 Handbook” notes that summer learning “can offer another opportunity to accelerate learning, especially for those students most impacted by disruptions to learning during the school year.” Data from the Household Pulse Survey (HPS), which NCES developed in partnership with the U.S. Census Bureau and other federal statistical agencies, explore access to summer learning opportunities by school type, racial/ethnic group, household educational attainment level, and income level.

The HPS1 provides data on how people’s lives have been impacted by the coronavirus (COVID-19) pandemic. Phase 3.2 of the HPS introduced questions on the summer education activities of children enrolled in public or private school or homeschooled, following the end of the normal school year in spring 2021. Adults 18 years old and over who had children under 18 in the home enrolled in school were asked if any of the children had attended a traditional summer school program because of poor grades; attended a summer school program to help catch up with lost learning time during the pandemic; attended school-led summer camps for subjects like math, science, or reading; and/or worked with private tutors to help catch up with lost learning time during the pandemic. Adults were allowed to select all categories that applied. Data from Phase 3.2 of the HPS, covering September 15 to 27, 2021, are discussed in this blog post.

Among adults with children enrolled in public or private school or homeschooled, 26 percent reported children were enrolled in any summer education activities after the end of the normal school year in spring of 2021 (figure 1). The most commonly reported summer education activity was attending a summer school program to catch up on lost learning time during the pandemic (10 percent). Eight percent reported children attended school-led summer camps for subjects like math, science, or reading, and 7 percent each reported children attended a traditional summer school program because of poor grades or worked with private tutors to catch up with lost learning time during the pandemic.
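Note that the individual activity percentages (10, 8, 7, and 7 percent) add up to more than the 26 percent reporting any activity because respondents could select every category that applied. The toy example below, with made-up respondents, shows why a select-all-that-apply total is smaller than the sum of its categories:

```python
# Toy illustration: respondents can report more than one activity, so the
# share reporting "any" activity is less than the sum of the category shares.
respondents = [
    {"summer school (catch-up)", "private tutor"},  # two activities, one respondent
    {"school-led camp"},
    set(),                                          # no summer activities
]

share_any = sum(1 for r in respondents if r) / len(respondents)
sum_of_category_shares = sum(len(r) for r in respondents) / len(respondents)

print(round(share_any, 2))               # 0.67 -> share reporting any activity
print(round(sum_of_category_shares, 2))  # 1.0  -> categories summed, which double-counts
```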


Figure 1. Among adults 18 years old and over who had children under age 18 in the home enrolled in school, percentage reporting participation in summer education activities after the end of the normal school year in spring of 2021, by type of summer activity: September 15 to 27, 2021

Bar chart showing percentage of adults 18 years old and over who had children under age 18 in the home enrolled in school reporting participation in summer education activities after the end of the normal school year in Spring of 2021, by type of summer activity, from the September 15 to 27, 2021, phase of the Household Pulse Survey

1 Does not equal the total of the subcategories because respondents could report multiple types of summer education activities.
NOTE: Data in this figure are considered experimental and do not meet NCES standards for response rates. The 2021 Household Pulse Survey, an experimental data product, is an Interagency Federal Statistical Rapid Response Survey to Measure Household Experiences during the coronavirus pandemic, conducted by the U.S. Census Bureau in partnership with 16 other federal agencies and offices. The number of respondents and response rate for the period reported in this table were 59,833 and 5.6 percent. The final weights are designed to produce estimates for the total persons age 18 and older living within housing units. These weights were created by adjusting the household level sampling base weights by various factors to account for nonresponse, adults per household, and coverage. For more information, see https://www.census.gov/programs-surveys/household-pulse-survey/technical-documentation.html. Although rounded numbers are displayed, the figures are based on unrounded data.  
SOURCE: U.S. Department of Commerce, Census Bureau, Household Pulse Survey, September 15 to 27, 2021. See Digest of Education Statistics 2021, table 227.60.


There were no significant differences in the overall percentage of adults reporting any summer education activities for their children by school type (public school, private school, or homeschooled). However, there were differences in the most common type of summer education activity reported for those with children in public school versus private school. Among adults with children in public school, the most reported summer activity was attending a summer school program to catch up with lost learning during the pandemic (11 percent) (figure 2). Among adults with children in private school, higher percentages reported children attended school-led summer camps for subjects like math, science, or reading or worked with private tutors to catch up with lost learning time during the pandemic (11 percent, each), compared with the percentage who reported children attended a traditional summer school program because of poor grades (3 percent). There were no significant differences among adults with homeschooled children by type of summer education activity.


Figure 2. Among adults 18 years old and over who had children under age 18 in the home enrolled in school, percentage reporting participation in summer education activities after the end of the normal school year in spring of 2021, by control of school and type of summer activity: September 15 to 27, 2021

Bar chart showing percentage of adults 18 years old and over who had children under age 18 in the home enrolled in school reporting participation in summer education activities after the end of the normal school year in Spring of 2021, by control of school and type of summer activity, from the September 15 to 27, 2021, phase of the Household Pulse Survey

NOTE: Figure excludes percentage of adults reporting any summer education activities for their children or that their children did not participate in any summer activities. Data in this figure are considered experimental and do not meet NCES standards for response rates. The 2021 Household Pulse Survey, an experimental data product, is an Interagency Federal Statistical Rapid Response Survey to Measure Household Experiences during the coronavirus pandemic, conducted by the U.S. Census Bureau in partnership with 16 other federal agencies and offices. The number of respondents and response rate for the period reported in this table were 59,833 and 5.6 percent. The final weights are designed to produce estimates for the total persons age 18 and older living within housing units. These weights were created by adjusting the household level sampling base weights by various factors to account for nonresponse, adults per household, and coverage. For more information, see https://www.census.gov/programs-surveys/household-pulse-survey/technical-documentation.html. Although rounded numbers are displayed, the figures are based on unrounded data.
SOURCE: U.S. Department of Commerce, Census Bureau, Household Pulse Survey, September 15 to 27, 2021. See Digest of Education Statistics 2021, table 227.60.


Children’s participation in any summer education activities in the summer of 2021 varied across racial/ethnic groups. The percentage of adults reporting any summer activities for their children was higher for Black adults (44 percent) than for all other racial/ethnic groups (figure 3). While lower than the percentage of Black adults reporting any summer activities for their children, the percentages of Asian and Hispanic adults (33 and 32 percent, respectively) were both higher than the percentage of White adults (20 percent).


Figure 3. Among adults 18 years old and over who had children under age 18 in the home enrolled in school, percentage reporting participation in any summer education activities after the end of the normal school year in spring of 2021, by adult’s race/ethnicity: September 15 to 27, 2021

Bar chart showing percentage of adults 18 years old and over who had children under age 18 in the home enrolled in school reporting participation in summer education activities after the end of the normal school year in Spring of 2021, by adult’s race/ethnicity, from the September 15 to 27, 2021, phase of the Household Pulse Survey

1 Includes persons reporting Pacific Islander alone, persons reporting American Indian/Alaska Native alone, and persons of Two or more races.
NOTE: Data in this figure are considered experimental and do not meet NCES standards for response rates. The 2021 Household Pulse Survey, an experimental data product, is an Interagency Federal Statistical Rapid Response Survey to Measure Household Experiences during the coronavirus pandemic, conducted by the U.S. Census Bureau in partnership with 16 other federal agencies and offices. The number of respondents and response rate for the period reported in this table were 59,833 and 5.6 percent. The final weights are designed to produce estimates for the total persons age 18 and older living within housing units. These weights were created by adjusting the household level sampling base weights by various factors to account for nonresponse, adults per household, and coverage. For more information, see https://www.census.gov/programs-surveys/household-pulse-survey/technical-documentation.html. Race categories exclude persons of Hispanic ethnicity.
SOURCE: U.S. Department of Commerce, Census Bureau, Household Pulse Survey, September 15 to 27, 2021. See Digest of Education Statistics 2021, table 227.60.            


There were also some differences observed in reported participation rates in summer education activities by the responding adult’s highest level of educational attainment. Children in households where the responding adult had completed less than high school were more likely to participate in summer education activities (39 percent) than were those in households where the responding adult had completed some college or an associate’s degree (25 percent), a bachelor’s degree (22 percent), or a graduate degree (25 percent) (figure 4). Similarly, children in households where the responding adult had completed high school2 were more likely to participate in summer education activities (28 percent) than were those in households where the responding adult had completed a bachelor’s degree (22 percent). There were no significant differences in children’s participation rates between other adult educational attainment levels.


Figure 4. Among adults 18 years old and over who had children under age 18 in the home enrolled in school, percentage reporting participation in any summer education activities after the end of the normal school year in spring of 2021, by adult’s highest level of educational attainment: September 15 to 27, 2021

Bar chart showing percentage of adults 18 years old and over who had children under age 18 in the home enrolled in school reporting participation in summer education activities after the end of the normal school year in Spring of 2021, by adult’s highest level of educational attainment, from the September 15 to 27, 2021, phase of the Household Pulse Survey

1 High school completers include those with a high school diploma as well as those with an alternative credential, such as a GED.
NOTE: Data in this figure are considered experimental and do not meet NCES standards for response rates. The 2021 Household Pulse Survey, an experimental data product, is an Interagency Federal Statistical Rapid Response Survey to Measure Household Experiences during the coronavirus pandemic, conducted by the U.S. Census Bureau in partnership with 16 other federal agencies and offices. The number of respondents and response rate for the period reported in this table were 59,833 and 5.6 percent. The final weights are designed to produce estimates for the total persons age 18 and older living within housing units. These weights were created by adjusting the household level sampling base weights by various factors to account for nonresponse, adults per household, and coverage. For more information, see https://www.census.gov/programs-surveys/household-pulse-survey/technical-documentation.html. Although rounded numbers are displayed, the figures are based on unrounded data.  
SOURCE: U.S. Department of Commerce, Census Bureau, Household Pulse Survey, September 15 to 27, 2021. See Digest of Education Statistics 2021, table 227.60


The percentage of adults reporting that children participated in summer education activities also varied across households with different levels of income in 2020. The percentages of adults reporting that children participated in any summer education activities were higher for those with a 2020 household income of less than $25,000 (34 percent) and $25,000 to $49,999 (33 percent) than for all other higher household income levels. There were no significant differences in reported participation rates among adults with 2020 household income levels of $50,000 to $74,999, $75,000 to $99,999, $100,000 to $149,999, and $150,000 or more.

Learn more about the Household Pulse Survey and access data tables, public use files, and an interactive data tool. For more detailed data on the summer education activities discussed in this blog post, explore the Digest of Education Statistics, table 227.60. To access other data on how the COVID-19 pandemic has impacted education, explore our School Pulse Panel dashboard.

Be sure to follow us on Twitter, Facebook, LinkedIn, and YouTube to stay up-to-date on the latest findings and trends in education, including those on summer learning activities.

 

By Ashley Roberts, AIR


[1] The speed of the survey development and the pace of the data collection efforts led to policies and procedures for the experimental HPS that were not always consistent with traditional federal survey operations. For example, the timeline for the surveys meant that opportunities to follow up with nonrespondents were very limited. This has led to response rates of 1 to 10 percent, which are much lower than the typical target response rate set in most federal surveys. While the responses have been statistically adjusted so that they represent the nation and states in terms of geographic distribution, sex, race/ethnicity, age, and educational attainment, the impact of survey bias has not been fully explored.

[2] High school completers include those with a high school diploma as well as those with an alternative credential, such as a GED.

Measuring “Traditional” and “Non-Traditional” Student Success in IPEDS: Data Insights from the IPEDS Outcome Measures (OM) Survey Component

This blog post is the second in a series highlighting the Integrated Postsecondary Education Data System (IPEDS) Outcome Measures (OM) survey component. The first post introduced a new resource page that helps data reporters and users understand OM and how it compares to the Graduation Rates (GR) and Graduation Rates 200% (GR200) survey components. Using data from the OM survey component, this post provides key findings about the demographics and college outcomes of undergraduates in the United States and is designed to spark further study of student success using OM data.

What do Outcome Measures cohorts look like?

OM collects student outcomes for all entering degree/certificate-seeking undergraduates, including non-first-time (i.e., transfer-in) and part-time students. Students are separated into eight subcohorts by entering status (i.e., first-time or non-first-time), attendance status (i.e., full-time or part-time), and Pell Grant recipient status.1 Figure 1 shows the number and percentage distribution of degree/certificate-seeking undergraduates in each OM subcohort from 2009–10 to 2012–13, by institutional level.2
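Because the eight subcohorts are simply every combination of the three two-level statuses just described, they can be enumerated directly; a minimal sketch:

```python
# The eight OM subcohorts: entering status x attendance status x Pell status.
from itertools import product

entering_status = ["first-time", "non-first-time"]
attendance_status = ["full-time", "part-time"]
pell_status = ["Pell Grant recipient", "non-Pell Grant recipient"]

for combo in product(pell_status, entering_status, attendance_status):
    print(" / ".join(combo))
# 2 x 2 x 2 = 8 subcohorts, e.g., "Pell Grant recipient / first-time / full-time"
```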

Key takeaways:

  • Across all cohort years, the majority of students were not first-time, full-time (FTFT) students, a group typically referred to as “traditional” college students. At 2-year institutions, 36 percent of Pell Grant recipients and 16 percent of non-Pell Grant recipients were FTFT in 2012–13. At 4-year institutions, 43 percent of Pell Grant recipients and 44 percent of non-Pell Grant recipients were FTFT in 2012–13.
  • Pell Grant recipient cohorts have become less “traditional” over time. In 2012–13, some 36 percent of Pell Grant recipients at 2-year institutions were FTFT, down 5 percentage points from 2009–10 (41 percent). At 4-year institutions, 43 percent of Pell Grant recipients were FTFT in 2012–13, down 5 percentage points from 2009–10 (48 percent).

Figure 1. Number and percentage distribution of degree/certificate-seeking undergraduate students in the adjusted cohort, by Pell Grant recipient status, institutional level, and entering and attendance status: 2009–10 to 2012–13 adjusted cohorts

Stacked bar chart showing the number and percentage distribution of degree/certificate-seeking undergraduate students by Pell Grant recipient status (recipients and non-recipients), institutional level (2-year and 4-year), and entering and attendance status (first-time/full-time, first-time/part-time, non-first-time/full-time, and non-first-time/part-time) for 2009–10 to 2012–13 adjusted cohorts

NOTE: This figure presents data collected from Title IV degree-granting institutions in the United States. Percentages may not sum to 100 due to rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, Integrated Postsecondary Education Data System (IPEDS) Outcome Measures component final data (2017–19) and provisional data (2020).


What outcomes does Outcome Measures collect?

The OM survey component collects students’ highest credential earned (i.e., certificate, associate’s, or bachelor’s) at 4,3 6, and 8 years after entry. Additionally, for students who did not earn a credential by the 8-year status point, the survey component collects an enrollment status outcome (i.e., still enrolled at the institution, enrolled at another institution, or enrollment status unknown). Figure 2 shows these outcomes for the 2012–13 adjusted cohort.

Key takeaways:

  • The percentage of students earning an award (i.e., certificate, associate’s, or bachelor’s) was higher at each successive status point, with the greatest change occurring between the 4- and 6-year status points (a 7-percentage-point increase, from 32 percent to 39 percent).
  • At the 8-year status point, more than a quarter of students were still enrolled in higher education: 26 percent had “transferred-out” to enroll at another institution and 1 percent were still enrolled at their original institution. This enrollment status outcome fills an important gap left by the GR200 survey component, which does not collect information on students who do not earn an award 8 years after entry.

Figure 2. Number and percentage distribution of degree/certificate-seeking undergraduate students, by award and enrollment status and entry status point: 2012–13 adjusted cohort

Waffle chart showing award status (certificate, associate’s, bachelor’s, and did not receive award) and enrollment status (still enrolled at institution, enrolled at another institution, and enrollment status unknown) of degree/certificate-seeking undergraduate students, by status point (4-year, 6-year, and 8-year) for 2012–13 adjusted cohort

NOTE: One square represents 1 percent. This figure presents data collected from Title IV degree-granting institutions in the United States.

SOURCE: U.S. Department of Education, National Center for Education Statistics, Integrated Postsecondary Education Data System (IPEDS) Outcome Measures component provisional data (2020).


How do Outcome Measures outcomes vary across student subgroups?

Every data element collected by the OM survey component (e.g., cohort counts, outcomes by time after entry) can be broken down into eight subcohorts based on entering, attendance, and Pell Grant recipient statuses. In addition to these student characteristics, data users can also segment these data by key institutional characteristics such as sector, Carnegie Classification, special mission (e.g., Historically Black College or University), and region, among others.4 Figure 3 displays the status of degree/certificate-seeking undergraduates 8 years after entry by each student subcohort within the broader 2012–13 degree/certificate-seeking cohort.

Key takeaways:

  • Of the eight OM subcohorts, FTFT non-Pell Grant recipients had the highest rate of earning an award or still being enrolled 8 years after entry. Among this subcohort, 18 percent had an unknown enrollment status 8 years after entry.
  • Among both Pell Grant recipients and non-Pell Grant recipients, full-time students had a higher rate than did part-time students of earning an award or still being enrolled 8 years after entry.
  • First-time, part-time (FTPT) students had the lowest rate of the subcohorts of earning a bachelor’s degree. One percent of FTPT Pell Grant recipients and 2 percent of FTPT non-Pell Grant recipients had earned a bachelor’s degree by the 8-year status point.

Figure 3. Number and percentage distribution of degree/certificate-seeking undergraduate students 8 years after entry, by Pell Grant Recipient status, entering and attendance status, and award and enrollment status: 2012–13 adjusted cohort

Horizontal stacked bar chart showing award (certificate, associate’s, and bachelor’s) and enrollment statuses (still enrolled at institution, enrolled at another institution, and enrollment status unknown) of degree/certificate-seeking undergraduate students by Pell Grant recipient status (recipients and non-recipients), institutional level (2-year and 4-year), and entering and attendance status (first-time/full-time, first-time/part-time, non-first-time/full-time, and non-first-time/part-time) for 2012–13 adjusted cohort

 

NOTE: This figure presents data collected from Title IV degree-granting institutions in the United States. Percentages may not sum to 100 due to rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, Integrated Postsecondary Education Data System (IPEDS) Outcome Measures component provisional data (2020).


How do Outcome Measures outcomes vary over time?

OM data are comparable across 4 cohort years.5 Figure 4 shows outcomes of degree/certificate-seeking undergraduates 8 years after entry from the 2009–10 cohort through the 2012–13 cohort for so-called “traditional” (i.e., FTFT) and “non-traditional” (i.e., non-FTFT) students.

Key takeaways:

  • For both traditional and non-traditional students, the percentage of students earning an award was higher for the 2012–13 cohort than for the 2009–10 cohort, climbing from 47 percent to 51 percent for traditional students and from 32 percent to 35 percent for non-traditional students.
  • The growth in award attainment for traditional students was driven by the share of students earning bachelor’s degrees (30 percent for the 2009–10 cohort vs. 35 percent for the 2012–13 cohort).
  • The growth in award attainment for non-traditional students was driven by the share of students earning both associate’s degrees (15 percent for the 2009–10 cohort vs. 16 percent for the 2012–13 cohort) and bachelor’s degrees (13 percent for the 2009–10 cohort vs. 15 percent for the 2012–13 cohort).

Figure 4. Number and percentage distribution of degree/certificate-seeking undergraduate students 8 years after entry, by first-time, full-time (FTFT) status and award and enrollment status: 2009–10 to 2012–13 adjusted cohorts

Stacked bar chart showing award status (certificate, associate’s, and bachelor’s) and enrollment status (still enrolled at institution, enrolled at another institution, and enrollment status unknown) of degree/certificate-seeking undergraduate students 8 years after entry by first-time, full-time status (traditional or first-time, full-time students and non-traditional or non-first-time, full-time students) for 2009–10 to 2012–13 adjusted cohorts

NOTE: This figure presents data collected from Title IV degree-granting institutions in the United States. “Non-traditional” (i.e., non-first-time, full-time) students include first-time, part-time, non-first-time, full-time, and non-first-time, part-time subcohorts. Percentages may not sum to 100 due to rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, Integrated Postsecondary Education Data System (IPEDS) Outcome Measures component final data (2017–2019) and provisional data (2020).


To learn more about the IPEDS OM survey component, visit the Measuring Student Success in IPEDS: Graduation Rates (GR), Graduation Rates 200% (GR200), and Outcome Measures (OM) resource page and the OM survey component webpage. Go to the IPEDS Use the Data page to explore IPEDS data through easy-to-use web tools, access data files to conduct your own analyses like those presented in this blog post, or view OM web tables.  

By McCall Pitcher, AIR


[1] The Federal Pell Grant Program (Higher Education Act of 1965, Title IV, Part A, Subpart I, as amended) provides grant assistance to eligible undergraduate postsecondary students with demonstrated financial need to help meet education expenses.

[2] Due to the 8-year measurement lag between initial cohort enrollment and student outcome reporting for the Outcome Measures survey component, the most recent cohort for which data are publicly available is 2012–13. Prior to the 2009–10 cohort, OM did not collect cohort subgroups by Pell Grant recipient status. Therefore, this analysis includes data only for the four most recent cohorts.

[3] The 4-year status point was added in the 2017–18 collection.

[4] Data users can explore available institutional variables on the IPEDS Use the Data webpage.

[5] For comparability purposes, this analysis relies on data from the 2017–18 collection (reflecting the 2009–10 adjusted cohort) through the 2020–21 collection (reflecting the 2012–13 adjusted cohort). Prior to the 2017–18 collection, OM cohorts were based on a fall term for academic reporters and a full year for program reporters.