IES Blog

Institute of Education Sciences

National Spending for Public Schools Increases for the Sixth Consecutive Year in School Year 2018–19

NCES just released a finance tables report, Revenues and Expenditures for Public Elementary and Secondary Education: FY19 (NCES 2021-302), which draws from data in the National Public Education Financial Survey (NPEFS). The results show that spending[1] on elementary and secondary education increased in school year 2018–19 (fiscal year [FY] 2019), after adjusting for inflation. This marks the sixth consecutive year of year-over-year increases in education spending since 2012–13, following declines in each of the prior 4 years (2009–10 to 2012–13).

Current expenditures per pupil[2] for the day-to-day operation of public elementary and secondary schools rose to $13,187 in FY19, an increase of 2.1 percent from FY18, after adjusting for inflation (figure 1).[3] Current expenditures per pupil also increased over the previous year in FY18 (by 0.9 percent), in FY17 (by 1.7 percent), in FY16 (by 2.8 percent), in FY15 (by 2.7 percent), and in FY14 (by 1.2 percent). In FY19, education spending was 11.8 percent higher than at the lowest point of the Great Recession in FY13 and 6.1 percent higher than spending prior to the Great Recession in FY10.


Figure 1. National inflation-adjusted current expenditures per pupil for public elementary and secondary school districts: FY10 through FY19

NOTE: Spending is reported in constant FY19 dollars, based on the Consumer Price Index (CPI).
SOURCE: U.S. Department of Education, National Center for Education Statistics, Common Core of Data (CCD), "National Public Education Financial Survey," fiscal years 2010 through 2018 Final Version 2a; and fiscal year 2019, Provisional Version 1a; and Digest of Education Statistics 2019, retrieved January 8, 2021, from https://nces.ed.gov/programs/digest/d19/tables/dt19_106.70.asp.


Without adjusting for geographic cost differences, current expenditures per pupil ranged from $7,950 in Utah to $24,882 in New York (figure 2). In addition to New York, current expenditures per pupil were highest in the District of Columbia ($22,831), New Jersey ($21,331), Vermont ($21,217), and Connecticut ($21,140). In addition to Utah, current expenditures per pupil were lowest in Idaho ($8,043), Arizona ($8,773), Nevada ($9,126), and Oklahoma ($9,203).


Figure 2. Current expenditures per pupil for public elementary and secondary education, by state: FY19

NOTE: These data are not adjusted for geographic cost differences.
SOURCE: U.S. Department of Education, National Center for Education Statistics, Common Core of Data (CCD), “National Public Education Financial Survey (NPEFS),” FY19, Provisional Version 1a and “State Nonfiscal Survey of Public Elementary/Secondary Education,” school year 2018–19, Provisional Version 1a.


These new NPEFS data offer researchers extensive opportunities to investigate state and national patterns of revenues and expenditures. Explore the report and learn more.


[1] Spending refers to current expenditures. Current expenditures comprise expenditures for the day-to-day operation of schools and school districts for public elementary/secondary education, including expenditures for staff salaries and benefits, supplies, and purchased services. Current expenditures include instruction, instruction-related support services (e.g., social work, health, psychological services), and other elementary/secondary current expenditures but exclude expenditures on capital outlay, other programs, and interest on long-term debt.
[2] Per pupil expenditures are calculated using student membership derived from the State Nonfiscal Survey of Public Elementary/Secondary Education. In some states, adjustments are made to ensure consistency between membership and reported fiscal data. More information on these adjustments can be found in the data file documentation at https://nces.ed.gov/ccd/files.asp.
[3] In order to compare spending from one year to the next, expenditures are converted to constant dollars, which adjusts figures for inflation. Inflation adjustments utilize the Consumer Price Index (CPI) published by the U.S. Department of Labor, Bureau of Labor Statistics. For comparability to fiscal education data, NCES adjusts the CPI from a calendar year to a school fiscal year basis (July through June). See Digest of Education Statistics 2019, table 106.70, retrieved January 8, 2021, from https://nces.ed.gov/programs/digest/d19/tables/dt19_106.70.asp.
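For readers who want to reproduce the adjustment described in footnotes 2 and 3, the sketch below shows the basic arithmetic in Python: inflate a nominal expenditure total to constant FY19 dollars with a fiscal-year CPI ratio, then divide by student membership. The dollar, membership, and CPI figures are placeholders for illustration only; the actual fiscal-year CPI values are published in Digest of Education Statistics 2019, table 106.70.

```python
# Illustrative sketch (not NCES production code) of the constant-dollar,
# per-pupil calculation described in footnotes 2 and 3. All numbers below
# are placeholders, not published values.

def to_constant_dollars(nominal, cpi_year, cpi_base):
    """Inflate a nominal amount to base-year (here, FY19) dollars."""
    return nominal * (cpi_base / cpi_year)

def per_pupil(total_current_expenditures, membership):
    """Current expenditures per pupil, using student membership counts."""
    return total_current_expenditures / membership

# Hypothetical figures for one state in FY18, reported in nominal dollars.
nominal_fy18 = 10_500_000_000      # total current expenditures, FY18
membership_fy18 = 800_000          # student membership, FY18
cpi_fy18, cpi_fy19 = 249.0, 254.0  # placeholder fiscal-year CPI values

constant_fy18 = to_constant_dollars(nominal_fy18, cpi_fy18, cpi_fy19)
print(f"FY18 per-pupil spending in constant FY19 dollars: "
      f"${per_pupil(constant_fy18, membership_fy18):,.0f}")
```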

 

By Stephen Q. Cornman, NCES; Lei Zhou, Activate Research; and Malia Howell, U.S. Census Bureau

Students’ Access to the Internet and Digital Devices at Home

This blog continues a robust discussion about National Center for Education Statistics (NCES) data collected in the recent past that can illuminate the issue of students’ access to the internet and digital devices at home. A few years ago—well before the coronavirus pandemic and stay-at-home orders shone a bright light on the inequities across the nation—NCES began dedicating resources to improving its data collections on education technology and equity to better inform policymaking at the district, state, and national levels.

The 2019 National Assessment of Educational Progress (NAEP) reading questionnaire asked 4th- and 8th-grade students if they had internet access at home and if there was a computer or tablet at home that they could use (referred to in this blog as having “digital access”). These data provide a pre–coronavirus pandemic snapshot of students’ digital access. Across all public schools, 81 percent of 4th-grade students and 88 percent of 8th-grade students said that they had digital access (figures 1 and 2). Thus, 19 percent of 4th-grade students and 12 percent of 8th-grade students in public schools may not have had the internet access or the devices required to carry out distance learning.


Figure 1. Percentage of 4th-grade public school students in the NAEP reading assessment that reported having internet access and a computer or tablet at home, by state: 2019

* Significantly different from the National Public estimate at the .05 level of statistical significance.
NOTE: Statistical comparison tests are based on unrounded numbers. Not all apparent differences between estimates are statistically significant.
SOURCE: U.S. Department of Education, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2019 Reading Assessment.


Figure 2. Percentage of 8th-grade public school students in the NAEP reading assessment that reported having internet access and a computer or tablet at home, by state: 2019

* Significantly different from the National Public estimate at the .05 level of statistical significance.
NOTE: Statistical comparison tests are based on unrounded numbers. Not all apparent differences between estimates are statistically significant.
SOURCE: U.S. Department of Education, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2019 Reading Assessment.


There were also differences across states in 2019. For 4th-grade students, the percentages who had digital access varied by state, ranging from 70 percent in New Mexico to 88 percent in New Jersey (table 1). Arizona, Arkansas, Idaho, Kansas, Mississippi, Missouri, New Mexico, Oklahoma, Oregon, Tennessee, Texas, and Wyoming had lower percentages of students who had digital access than the national average (figure 1 and table 1). For 8th-grade students, the percentages who had access ranged from 81 percent in Oklahoma to 93 percent in Connecticut (table 1). Alabama, Arizona, Arkansas, Hawaii, Kentucky, Louisiana, Mississippi, Nevada, Oklahoma, Tennessee, Texas, and West Virginia had lower percentages of students who had access than the national average (figure 2 and table 1).


Table 1. Percentage of public school students in the NAEP reading assessment that reported having internet access and a computer or tablet at home, by grade and state: 2019

State                     Grade 4            Grade 8
                          Percent   s.e.     Percent   s.e.
   National public        81        (0.2)    88        (0.2)
Alabama                   79        (1.2)    86        (0.8)
Alaska
Arizona                   78        (0.9)    84        (0.9)
Arkansas                  73        (0.9)    83        (1.1)
California                81        (0.9)    88        (0.9)
Colorado
Connecticut               85        (0.8)    93        (0.6)
Delaware                  81        (0.9)    90        (0.6)
District of Columbia      83        (0.8)    90        (0.6)
DoDEA                     88        (0.7)    96        (0.4)
Florida                   85        (0.7)    89        (0.7)
Georgia                   83        (0.9)    90        (0.7)
Hawaii                    79        (1.0)    86        (0.8)
Idaho                     77        (0.9)    88        (0.8)
Illinois                  83        (0.8)    90        (0.6)
Indiana                   80        (0.9)    90        (1.1)
Iowa                      81        (0.9)    90        (0.7)
Kansas                    78        (0.9)    88        (0.7)
Kentucky                  81        (0.8)    87        (0.7)
Louisiana                 79        (1.0)    85        (0.9)
Maine                     82        (0.9)    89        (0.7)
Maryland                  82        (0.8)    91        (0.6)
Massachusetts             87        (0.8)    93        (0.7)
Michigan                  80        (1.0)    90        (0.8)
Minnesota                 83        (1.0)    92        (0.7)
Mississippi               77        (1.2)    84        (0.7)
Missouri                  78        (0.8)    89        (0.8)
Montana
Nebraska                  81        (0.9)    90        (0.7)
Nevada                    79        (1.0)    85        (0.7)
New Hampshire
New Jersey                88        (0.8)    93        (0.6)
New Mexico                70        (1.2)    82        (0.8)
New York                  84        (0.7)    91        (0.7)
North Carolina            81        (0.8)    89        (0.8)
North Dakota              81        (1.0)    90        (0.7)
Ohio                      82        (0.9)    91        (0.7)
Oklahoma                  73        (1.1)    81        (0.9)
Oregon                    77        (1.0)    87        (0.8)
Pennsylvania              85        (0.8)    91        (0.7)
Rhode Island              84        (0.8)    90        (0.6)
South Carolina            81        (1.0)    90        (0.9)
South Dakota
Tennessee                 77        (0.9)    86        (0.9)
Texas                     75        (0.9)    82        (1.0)
Utah
Vermont                   81        (0.9)    91        (0.7)
Virginia                  82        (0.8)    91        (0.8)
Washington                80        (1.0)    89        (0.8)
West Virginia             81        (1.0)    86        (0.7)
Wisconsin                 83        (0.9)    91        (0.7)
Wyoming                   78        (0.9)    88        (0.7)

 

↑ Significantly higher than the estimate for National Public at the .05 level of statistical significance.
↓ Significantly lower than the estimate for National Public at the .05 level of statistical significance.
‡ Reporting standards not met. Sample size insufficient to permit a reliable estimate.
† Not applicable.
NOTE: Statistical comparison tests are based on unrounded numbers. Not all apparent differences between estimates are statistically significant. “National public” refers to the results for all students in public schools.
SOURCE: U.S. Department of Education, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2019 Reading Assessment.
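The asterisks in figures 1 and 2 and the arrows in table 1 reflect NAEP’s significance tests against the National Public estimate. As a rough illustration of what such a comparison involves, the sketch below applies a simple two-sample z-test to a state’s estimate and standard error. It is only an approximation: NAEP’s published tests use unrounded estimates, design-based variance estimation, and adjustments for the overlap between state and national samples.

```python
# Rough illustration only: approximating a state-vs-national comparison with a
# two-sample z-test on the rounded estimates and standard errors from table 1.
# NAEP's actual procedure differs (unrounded data, design-based variance
# estimation, and dependence between the state and national samples).
import math

def z_test(p1, se1, p2, se2):
    """Return the z statistic and whether |z| exceeds the .05 critical value."""
    z = (p1 - p2) / math.sqrt(se1**2 + se2**2)
    return z, abs(z) > 1.96

# Grade 4 example from table 1: New Mexico (70, s.e. 1.2) vs. national public (81, s.e. 0.2).
z, significant = z_test(70, 1.2, 81, 0.2)
print(f"z = {z:.1f}; significantly different at the .05 level: {significant}")
```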


Looking at the results of NAEP’s 2019 Trial Urban District Assessment (TUDA), Miami-Dade, Florida, had the highest percentages of 4th- and 8th-grade students who had digital access (88 percent and 93 percent, respectively) (table 2). Fresno, California, had the lowest percentage of 4th-grade students who had access (67 percent), and Dallas, Texas, had the lowest percentage of 8th-grade students who had access (73 percent).


Table 2. Percentage of public school students in the NAEP reading assessment that reported having internet access and a computer or tablet at home, by grade and Trial Urban District Assessment (TUDA) district: 2019

Large city                     Grade 4       Grade 8
                               Percentage    Percentage
   All large cities            78            85
Albuquerque                    75            85
Atlanta                        82            86
Austin                         78            83
Baltimore City                 73            84
Boston                         81            89
Charlotte                      83            91
Chicago                        80            88
Clark County (NV)              78            84
Cleveland                      74            80
Dallas                         71            73
Denver
Detroit                        70            79
District of Columbia (DCPS)    83            90
Duval County (FL)              84            89
Fort Worth (TX)                72            88
Fresno                         67            77
Guilford County (NC)           78            85
Hillsborough County (FL)       81            87
Houston                        71            75
Jefferson County (KY)          82            88
Los Angeles                    76            85
Miami-Dade                     88            93
Milwaukee                      75            85
New York City                  81            89
Philadelphia                   78            86
San Diego                      81            90
Shelby County (TN)             78            86

 

↑ Significantly higher than the estimate for Large City at the .05 level of statistical significance.
↓ Significantly lower than the estimate for Large City at the .05 level of statistical significance.
‡ Reporting standards not met. Sample size insufficient to permit a reliable estimate.
NOTE: Statistical comparison tests are based on unrounded numbers. Not all apparent differences between estimates are statistically significant.
SOURCE: U.S. Department of Education, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2019 Reading Assessment.


In 2019, higher percentages of 8th-grade students than of 4th-grade students had digital access. This pattern was consistent across all states and TUDA jurisdictions. On average, in both 4th and 8th grades, a higher percentage of students in suburban areas had access than did students in cities, towns, and rural areas (table 3).


Table 3. Percentage of public school students in the NAEP reading assessment that reported having internet access and a computer or tablet at home, by grade and locale: 2019

Locale                Grade 4               Grade 8
                      Percentage   s.e.     Percentage   s.e.
   National public    81           (0.2)    88           (0.2)
City                  79           (0.4)    86           (0.4)
Suburban              84           (0.3)    92           (0.3)
Town                  77           (0.8)    86           (0.6)
Rural                 78           (0.4)    87           (0.4)

↓ Significantly lower than the estimate for Suburban at the .05 level of statistical significance.
NOTE: Statistical comparison tests are based on unrounded numbers. Not all apparent differences between estimates are statistically significant.
SOURCE: U.S. Department of Education, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2019 Reading Assessment.


While the NAEP data reveal state-level patterns in students’ digital access before the pandemic, the Household Pulse Survey (HPS) provides insight into the digital access of students across the country during the pandemic. The HPS is conducted by the Census Bureau and seven other federal statistical agency partners, including NCES. Since April 23, 2020, the HPS has provided weekly or biweekly estimates of the availability of computers and internet access to children for educational purposes.

In April 2020, 88 percent of adults who had children under 18 in the home enrolled in school reported that computers were always or usually available for educational purposes. By the end of March 2021, that percentage increased to 94 percent (table 4).

A similar pattern emerged in the HPS data for internet access. In April 2020, 91 percent of adults who had children under 18 in the home enrolled in school reported that the internet was always or usually available for educational purposes. In March 2021, that percentage had increased to 94 percent (table 4).


Table 4. Percentage of adults who had children under 18 in the home enrolled in school who reported that computers and internet access were always or usually available for educational purposes: 2020–21, selected time periods

Time period                     Computers available      Access to internet
                                Percentage   s.e.        Percentage   s.e.
April 23 to May 5, 2020         88           (0.5)       91           (0.4)
March 17 to March 29, 2021      94           (0.4)       94           (0.4)

↑ Significantly higher than the estimate for April 23 to May 5, 2020, at the .05 level of statistical significance.
NOTE: Statistical comparison tests are based on unrounded numbers. Not all apparent differences between estimates are statistically significant.
SOURCE: U.S. Department of Commerce, Census Bureau, Household Pulse Survey, selected periods, April 2020 through March 2021.


While these data provide a recent look into the technology landscape for students both before and during the pandemic, there is still a need to collect more and better data to understand digital inequities. For example, future NCES surveys could ask schools, students, and teachers about their technology use and access at home, what resources for learning and instruction they have at home, and the environment in which many students and teachers now find themselves learning and teaching.

 

Resources for more information:

 

By Cadelle Hemphill, AIR; Yan Wang, AIR; Diana Forster, AIR; Chad Scott, AIR; and Grady Wilburn, NCES

The Growing Reading Gap: IES Event to Link Knowledge to Action Through Literacy Data

On June 8 and 9, the Institute of Education Sciences (IES) and the Council of the Great City Schools (CGCS) will host a Reading Summit to address one of the most important issues confronting American education today: the declining reading performance of America’s lowest-performing students and the growing gap between low- and high-performing students.

At this 2-day virtual event, participants will explore the results of the National Assessment of Educational Progress (NAEP), as well as other IES data, and learn strategies to help educators and low-performing readers make progress.

Learn more about the summit’s agenda and speakers—including IES Director Mark Schneider, NCES Commissioner James L. Woodworth, and NCES Associate Commissioner Peggy Carr—and register to participate (registration is free).

In the meantime, explore some of the data NCES collects on K–12 literacy and reading achievement, which show that the scores of students in the lowest-performing groups are decreasing over time.

  • The National Assessment of Educational Progress (NAEP) administers reading assessments to 4th-, 8th-, and 12th-grade students. The most recent results from 2019 show that average reading scores for students in the 10th percentile (i.e., the lowest-performing students) decreased between 2017 and 2019 at grade 4 (from 171 to 168) and grade 8 (from 219 to 213) and decreased between 2015 and 2019 at grade 12 (from 233 to 228).
  • The Progress in International Reading Literacy Study (PIRLS) is an international comparative assessment that measures 4th-grade students’ reading knowledge and skills. The most recent findings from 2016 show that the overall U.S. average score (549) was higher than the PIRLS scale centerpoint (500), but at the 25th percentile, U.S. 4th-graders scored lower in 2016 (501) than in 2011 (510).
  • The Program for International Student Assessment (PISA) is a study of 15-year-old students’ performance in several subjects, including reading literacy. The 2018 results show that, although the overall U.S. average reading score (505) was higher than the OECD average score (487), at the 10th percentile, the U.S. average score in 2018 (361) was not measurably different from the score in 2015 and was lower than the score in 2012 (378).

NCES also collects data on young children’s literacy knowledge and activities as well as the literacy competencies of adults. Here are a few data collections and tools for you to explore:

This year, the Condition of Education includes a newly updated indicator on literacy activities that parents reported doing with young children at home. Here are some key findings from this indicator, which features data from the 2019 National Household Education Surveys (NHES) Early Childhood Program Participation Survey:

In the week before the parents were surveyed,

  • 85 percent of 3- to 5-year-olds were read to by a family member three or more times.
  • 87 percent of 3- to 5-year-olds were told a story by a family member at least once.
  • 96 percent of 3- to 5-year-olds were taught letters, words, or numbers by a family member at least once.

In the month before the parents were surveyed,

  • 37 percent of 3- to 5-year-olds visited a library with a family member at least once.

Be sure to read the full indicator in the 2021 Condition of Education, which was released in May, for more data on young children’s literacy activities, including analyses by race/ethnicity, mother’s educational attainment, and family income.

Don’t forget to follow NCES on Twitter, Facebook, and LinkedIn to stay up-to-date on the latest findings and trends in literacy and reading data and register for the IES Reading Summit to learn more about this topic from experts in the field. 

 

By Megan Barnett, AIR

An open letter to Superintendents, as summer begins

If the blockbuster attendance at last month’s Summer Learning and Enrichment Collaborative convening is any sign, many of you are in the midst of planning—or have already started to put in place—your plans for summer learning. As you take time to review resources from the Collaborative and see what’s been learned from the National Summer Learning Project, I’d like to add just one more consideration to your list: please use this summer as a chance to build evidence about “what works” to improve outcomes for your students. In a word: evaluate!

Given all the things that need to be put in place to even make summer learning happen, it’s fair to ask why evaluation merits even a passing thought.   

I’m urging you to consider building evidence about the outcomes of your program through evaluation because I can guarantee you that, in about a year, someone to whom you really want to give a thorough answer will ask “so, what did we accomplish last summer?” (Depending upon who they are, and what they care about, that question can vary. Twists can include “what did students learn” or business officers’ pragmatic “what bang did we get for that buck.”)

When that moment comes, I want you to be able to smile, take a deep breath, and rattle off the sort of polished elevator speech that good data, well-analyzed, can help you craft. The alternative—mild to moderate panic followed by an unsatisfying version of “well, you know, we had to implement quickly”—is avoidable. Here’s how.

  1. Get clear on outcomes. You probably have multiple goals for your summer learning programs, including those that are academic, social-emotional, and behavioral. Nonetheless, there’s probably a single word (or a short phrase) that completes the following sentence: “The thing we really want for our students this summer is …” It might be “to rebuild strong relationships between families and schools,” “to be physically and emotionally safe,” or “to get back on track in math.” Whatever it is, get clear on two things: (1) the primary outcome(s) of your program and (2) how you will measure that outcome once the summer comes to an end. Importantly, you should consider outcome measures that will be available for both program participants and non-participants so that you can tell the story about the “value add” of summer learning. (You can certainly also include measures relevant only to participants, especially ones that help you track whether you are implementing your program as designed.)
  2. Use a logic model. Logic models are the “storyboard” of your program, depicting exactly how its activities will come together to cause improvement in the student outcomes that matter most. Logic models force program designers to be explicit about each component of their program and its intended impact. Taking time to develop a logic model can expose potentially unreasonable assumptions and missing supports that, if added, would make it more likely that a program succeeds. If you don’t already have a favorite logic model tool, we have resources available for free!  
  3. Implement evidence-based practices aligned to program outcomes. A wise colleague (h/t Melissa Moritz) recently reminded me that a summer program is the “container” (for lack of a better word) in which other learning experiences and educationally purposeful content are packaged, and that there are evidence-based practices for the design and delivery of both. (Remember: “evidence-based practices” run the gamut from those that demonstrate a rationale to those supported by promising, moderate, or strong evidence.) As you are using the best available evidence to build a strong summer program, don’t forget to ensure you’re using evidence-based practices in service of the specific outcomes you want those programs to achieve. For example, if your primary goal for students is math catch-up, then the foundation of your summer program should be an evidence-based Tier I math curriculum. If it is truly important that students achieve the outcome you’ve set for them, then they’re deserving of evidence-based educational practices supported by an evidence-based program design!
  4. Monitor and support implementation. Once your summer program is up and running, it’s useful to understand just how well your plan—the logic model you developed earlier—is playing out in real life. If staff trainings were planned, did they occur and did everyone attend as scheduled? Are activities occurring as intended, with the level of quality that was hoped for? Are attendance and engagement high? Monitoring implementation alerts you to where things may be “off track,” flagging where more supports for your team might be helpful. And, importantly, it can provide useful context for the outcomes you observe at the end of the summer. If you don't already have an established protocol for using data as part of continuous improvement, free resources are available!
  5. Track student attendance. If you don’t know who—specifically—participated in summer learning activities, describing how well those activities “worked” can get tricky. Whether your program is full-day, half-day, in-person, hybrid, or something else, develop a system to track (1) who was present, (2) on what days, and (3) for how long. Then, store that information in your student information system (or another database) where it can be accessed later. 
  6. Analyze and report your data, with an explicit eye toward equity. Data and data analysis can help you tell the story of your summer programming. Given the disproportionate impact COVID has had on students that many education systems have underserved, placing equity at the center of your planned analyses is critical. For example:
    • Who did—and who did not—participate in summer programs? Data collected to monitor attendance should allow you to know who (specifically) participated in your summer programs. With that information, you can prepare simple tables that show the total number of participants and that total broken down by important student subgroups, such as gender, race/ethnicity, or socioeconomic status. Importantly, those data for your program should be compared with similar data for your school or district (as appropriate). Explore, for example, whether there are one or more populations disproportionately underrepresented in your program and the implications for the work both now and next summer.
    • How strong was attendance? Prior research has suggested that students benefit the most from summer programs when they are “high attenders” (20 or more days out of programs’ typical 25 to 30 total days). Using your daily, by-student attendance data, calculate attendance intensity for your program’s participants overall and by important student subgroups. For example, what percentage of students attended 0 to 24 percent, 25 to 49 percent, 50 to 74 percent, or 75 percent or more of program days? (A small worked sketch of this calculation follows this list.)
    • How did important outcomes vary between program participants and non-participants? At the outset of the planning process, you identified one or more outcomes you hoped students would achieve by participating in your program and how you’d measure them. In the case of a “math catch-up” program, for example, you might be hoping that more summer learning participants get a score of “on-grade level” at the outset of the school year than their non-participating peers, potentially promising evidence that the program might have offered a benefit. Disaggregating these results by student subgroups when possible highlights whether the program might have been more effective for some students than others, providing insight into potential changes for next year’s work.     
    • Remember that collecting and analyzing data is just a means to an end: learning to inform improvement. Consider how involving program designers and participants—including educators, parents, and students—in discussions about what was learned as a result of your analyses can be used to strengthen next year’s program.
  7. Ask for help. If you choose to take up the evaluation mantle to build evidence about your summer program, bravo! And know that you do not have to do it alone. First, think locally. Are you near a two-year or four-year college? Consider contacting their education faculty to see whether they’re up for a collaboration. Second, explore whether your state has a state-wide “research hub” for education issues (e.g., Delaware, Tennessee) that could point you in the direction of a state or local evaluation expert. Third, connect with your state’s Regional Educational Lab or Regional Comprehensive Center for guidance or a referral. Finally, consider joining the national conversation! If you would be interested in participating in an Evaluation Working Group, email my colleague Melissa Moritz at melissa.w.moritz@ed.gov.
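To make steps 5 and 6 concrete, here is a minimal sketch of the kind of attendance-intensity tally described above. It is illustrative only: the field layout, the 25-day program length, and the subgroup labels are assumptions made for this example, not a required format, so adapt them to whatever your student information system actually exports.

```python
# Illustrative sketch of the attendance analysis in steps 5 and 6.
# The 25-day program length, record layout, and subgroup labels below are
# assumptions for this example, not a prescribed format.
from collections import Counter

PROGRAM_DAYS = 25  # assumed total number of program days


def intensity_band(days_attended, program_days=PROGRAM_DAYS):
    """Bucket a student's attendance into the bands suggested in step 6."""
    share = days_attended / program_days
    if share < 0.25:
        return "0-24%"
    if share < 0.50:
        return "25-49%"
    if share < 0.75:
        return "50-74%"
    return "75%+"


# Hypothetical attendance records: (student_id, subgroup, days attended).
records = [
    ("A101", "economically disadvantaged", 22),
    ("A102", "economically disadvantaged", 9),
    ("A103", "not economically disadvantaged", 24),
    ("A104", "not economically disadvantaged", 14),
]

# Attendance intensity overall and by subgroup (step 6).
overall = Counter(intensity_band(days) for _, _, days in records)
by_group = Counter((group, intensity_band(days)) for _, group, days in records)

print("Overall:", dict(overall))
for (group, band), count in sorted(by_group.items()):
    print(f"{group}: {band} -> {count} student(s)")
```

The same disaggregation pattern extends naturally to the participation and outcome comparisons in step 6: join the attendance records to enrollment and assessment files on student ID, then report each measure overall and by subgroup.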

Summer 2021 is shaping up to be one for the record books. For many students, summer is a time for rest and relaxation. But this year, it will also be a time for reengaging students and families with their school communities and, we hope, a significant amount of learning. Spending time now thinking about measuring that reengagement and learning—even in simple ways—will pay dividends this summer and beyond.

My colleagues and I at the U.S. Department of Education are here to help, and we welcome your feedback. Please feel free to contact me directly at matthew.soldner@ed.gov.

Matthew Soldner
Commissioner, National Center for Education Evaluation and Regional Assistance

Announcing the Condition of Education 2021 Release

NCES is pleased to present the 2021 edition of the Condition of Education, an annual report mandated by the U.S. Congress that summarizes the latest data on education in the United States. This report draws on data from across NCES and from other sources and is designed to help policymakers and the public monitor educational progress.

Beginning in 2021, individual indicators can be accessed online on the newly redesigned Condition of Education Indicator System website. A synthesis of key findings from these indicators can be found in the Report on the Condition of Education, a more user-friendly PDF report.

A total of 86 indicators are included in this year’s Condition of Education, 55 of which were updated this year. As in prior years, these indicators present a range of topics from prekindergarten through postsecondary education, as well as labor force outcomes and international comparisons. Additionally, this year’s 55 updated indicators include 17 indicators on school crime and safety.

For the 2021 edition of the Condition of Education, most data were collected prior to 2020, either during the 2018–19 academic year or in fall 2019. Therefore, with some exceptions, this year’s report presents findings from prior to the coronavirus pandemic.

At the elementary and secondary level (prekindergarten through grade 12), the data show that 50.7 million students were enrolled in public schools in fall 2018, the most recent year for which data were available at the time this report was written. Public charter school enrollment accounted for 7 percent (3.3 million students) of these public school enrollments, more than doubling from 3 percent (1.6 million students) in 2009. In 2019, U.S. 4th- and 8th-grade students scored above the scale centerpoint (500 out of 1000) on both the math and science assessments in the Trends in International Mathematics and Science Study (TIMSS).

In 2020, 95 percent of 25- to 29-year-olds had at least a high school diploma or equivalent, while 39 percent had a bachelor’s or higher degree. These levels of educational attainment are associated with economic outcomes, such as employment and earnings. For example, among those working full time, year round, annual median earnings in 2019 were 59 percent higher for 25- to 34-year-olds with a bachelor’s or higher degree than for those with a high school diploma or equivalent.

In addition to regularly updated annual indicators, this year’s two spotlight indicators highlight early findings on the educational impact of the coronavirus pandemic from the Household Pulse Survey (HPS).

  • The first spotlight examines distance learning at the elementary and secondary level at the beginning of the 2020–21 academic year. Overall, among adults with children under 18 in the home enrolled in school, two-thirds reported in September 2020 that classes had been moved to a distance learning format using online resources. In order to participate in these remote learning settings, students must have access to computers and the internet. More than 90 percent of adults with children in their household reported that one or both of these resources were always or usually available to children for educational purposes in September 2020. At the same time, 59 percent of adults reported that computers were provided by the child’s school or district, while 4 percent reported that internet access was paid for by the child’s school or district. Although higher percentages of lower income adults reported such assistance, this did not eliminate inequalities in access to these resources by household income.
  • The second spotlight examines changes in postsecondary education plans for fall 2020 in response to the coronavirus pandemic. Among adults 18 years old and over who had household members planning to take classes in fall 2020 from a postsecondary institution, 45 percent reported that the classes at least one household member planned would be in different formats in the fall (e.g., formats would change from in-person to online), 31 percent reported that all plans to take classes in the fall had been canceled for at least one household member, and 12 percent reported that at least one household member would take fewer classes in the fall. Some 28 percent reported no change in fall plans to take postsecondary classes for at least one household member. The two most frequently cited reasons for the cancellation of plans were having the coronavirus or having concerns about getting the coronavirus (46 percent), followed by not being able to pay for classes/educational expenses because of changes to income from the pandemic (42 percent).

The Condition of Education also includes an At a Glance section, a Reader’s Guide, a Glossary, and a Guide to Sources, all of which provide additional background information. Each indicator includes references to the source data tables used to produce the indicator.

As new data are released throughout the year, indicators will be updated and made available online.

In addition to publishing the Condition of Education, NCES produces a wide range of other reports and datasets designed to help inform policymakers and the public about significant trends and topics in education. More information about the latest activities and releases at NCES may be found on our website or by following us on Twitter, Facebook, and LinkedIn.

 

By James L. Woodworth, NCES Commissioner