IES Blog

Institute of Education Sciences

New Report Highlights Progress and Challenges in U.S. High School Dropout and Completion Rates

A new NCES report has some good news about overall high school dropout and completion rates, but it also highlights some areas of concern.

Using a broad range of data, the recently released Trends in High School Dropout and Completion Rates in the United States report shows that the educational attainment of young adults has risen in recent decades. The public high school graduation rate is up, and the status dropout rate (the percentage of 16- to 24-year-olds who are not enrolled in school and have not completed high school) is down. Despite these encouraging trends, there are significant disparities in educational attainment among young adults in the United States. The report shines new light on these disparities by analyzing detailed data from the U.S. Census Bureau.

For large population groups, the report provides status dropout rates calculated using annual data from the American Community Survey (ACS), administered by the U.S. Census Bureau. For example, in 2017, some 5.4 percent of 16- to 24-year-olds were not enrolled in school and lacked a high school diploma or equivalent credential.
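
To make the definition concrete, here is a minimal sketch of how a status dropout rate could be computed from person-level survey records. The record layout, values, and weights below are hypothetical illustrations, not the actual ACS microdata format.

# Minimal sketch: a weighted status dropout rate from person-level records.
# Hypothetical layout: (age, enrolled_in_school, has_diploma, person_weight).
records = [
    (17, True,  False, 310.0),
    (19, False, True,  295.0),
    (22, False, False, 288.0),  # a status dropout
    (24, False, True,  301.0),
]

def status_dropout_rate(records):
    # Denominator: all 16- to 24-year-olds. Numerator: those who are not
    # enrolled and lack a diploma or equivalent. Both are weighted sums.
    in_scope = [(w, e, d) for age, e, d, w in records if 16 <= age <= 24]
    total = sum(w for w, _, _ in in_scope)
    dropouts = sum(w for w, e, d in in_scope if not e and not d)
    return 100 * dropouts / total

print(f"Status dropout rate: {status_dropout_rate(records):.1f}%")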

For smaller population groups, there are not enough ACS respondents during any given year to allow for precise and reliable estimates of the high school status dropout rate. For these demographic subgroups, NCES pools the data from 5 years of the ACS in order to obtain enough respondents to accurately describe patterns in the dropout rate.
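
The effect of pooling is easy to see in a small sketch. The yearly counts below are invented for illustration; in practice, the Census Bureau releases 5-year ACS files with weights already calibrated for multiyear estimation rather than relying on simple concatenation.

# Sketch of pooling multiple survey years to stabilize a subgroup estimate.
# Hypothetical data: (n_dropouts, n_in_subgroup) for one small subgroup.
yearly_samples = {
    2013: (4, 180), 2014: (7, 195), 2015: (3, 170),
    2016: (6, 205), 2017: (5, 190),
}

# A single year's estimate rests on very few cases...
for year, (x, n) in yearly_samples.items():
    print(f"{year}: {100 * x / n:.1f}% (n={n})")

# ...while pooling all five years roughly quintuples the sample size,
# shrinking the sampling error of the "average" 2013-2017 rate.
x_all = sum(x for x, n in yearly_samples.values())
n_all = sum(n for x, n in yearly_samples.values())
print(f"Pooled 2013-2017: {100 * x_all / n_all:.1f}% (n={n_all})")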

For example, while the overall status dropout rate for Asian 16- to 24-year-olds was below the national average in 2017, the rates for specific subgroups of Asian young adults varied widely. Based on 5 years of ACS data, high school status dropout rates among Asian 16- to 24-year-olds ranged from 1.1 percent for individuals of Korean descent to 23.2 percent for individuals of Burmese descent. These rates represent the “average” status dropout rate for the period from 2013 to 2017. They offer greater precision than the 1-year estimates, but the 5-year time span might make them difficult to interpret at first glance. 

Figure 1. Percentage of high school dropouts among persons 16 through 24 years old (status dropout rate), by selected Asian subgroups: 2013–2017

‡ Reporting standards not met. Either there are too few cases for a reliable estimate or the coefficient of variation (CV) is 50 percent or greater.
Error bars in the figure represent 95 percent confidence intervals: if the estimation procedure were repeated many times, 95 percent of the calculated confidence intervals would contain the true status dropout rate for the population group.
NOTE: “Status” dropouts are 16- to 24-year-olds who are not enrolled in school and who have not completed a high school program, regardless of when they left school. People who received an alternative credential such as a GED are counted as high school completers. This figure presents 5-year average status dropout rates for the period from 2013 to 2017. Use of a 5-year average increases the sample size, thereby reducing the sampling error and producing more stable estimates. Data are based on sample surveys of the entire population of 16- to 24-year-olds residing within the United States, including both noninstitutionalized persons (e.g., those living in households, college housing, or military housing located within the United States) and institutionalized persons (e.g., those living in prisons, nursing facilities, or other healthcare facilities). Estimates may differ from those based on the Current Population Survey (CPS) because of differences in survey design and target populations. Asian subgroups exclude persons of Hispanic ethnicity.
SOURCE: U.S. Department of Commerce, Census Bureau, American Community Survey (ACS), 2013–2017.
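
The reporting-standards rule in the figure note can be translated into a small check. The sketch below uses the simple-random-sampling formula for the standard error purely for illustration; actual ACS standard errors are design-based, and the minimum case counts NCES applies vary by survey.

import math

def summarize(p_hat, n, min_cases=30):
    """Illustrative reporting-standards check: suppress the estimate if the
    sample is too small or the coefficient of variation (SE / estimate)
    is 50 percent or greater; otherwise report it with a 95 percent CI."""
    se = math.sqrt(p_hat * (1 - p_hat) / n)
    cv = se / p_hat if p_hat > 0 else float("inf")
    if n < min_cases or cv >= 0.50:
        return "‡ reporting standards not met"
    low, high = p_hat - 1.96 * se, p_hat + 1.96 * se
    return f"{100*p_hat:.1f}% (95% CI: {100*low:.1f}% to {100*high:.1f}%)"

print(summarize(0.011, 2500))  # precise estimate: reported with a CI
print(summarize(0.02, 40))     # tiny sample: CV too large, suppressed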

The 5-year ACS data can also be used to describe status dropout rates for smaller geographic areas with more precision than the annual ACS data. For example, the average 2013–2017 status dropout rates ranged from 3.8 percent in Massachusetts to 9.6 percent in Louisiana. The 5-year ACS data allowed us to calculate more accurate status dropout rates for each state and, in many cases, for racial/ethnic subgroups within the state. The complete state-level dropout rates by race/ethnicity are available in the full report.

Figure 2. Percentage of high school dropouts among persons 16 through 24 years old (status dropout rate), by state: 2013–2017

NOTE: “Status” dropouts are 16- to 24-year-olds who are not enrolled in school and who have not completed a high school program, regardless of when they left school. People who received an alternative credential such as a GED are counted as high school completers. This figure presents 5-year average status dropout rates for the period from 2013 to 2017. Use of a 5-year average increases the sample size, thereby reducing the sampling error and producing more stable estimates. Data are based on sample surveys of the entire population of 16- to 24-year-olds residing within the United States, including both noninstitutionalized persons (e.g., those living in households, college housing, or military housing located within the United States) and institutionalized persons (e.g., those living in prisons, nursing facilities, or other healthcare facilities). Estimates may differ from those based on the Current Population Survey (CPS) because of differences in survey design and target populations.
SOURCE: U.S. Department of Commerce, Census Bureau, American Community Survey (ACS), 2013–2017. See table 2.3.

For more information about high school dropout and completion rates, check out the recently released Trends in High School Dropout and Completion Rates in the United States report. For more information about the 5-year ACS datasets, visit https://www.census.gov/programs-surveys/acs/guidance/estimates.html.

By Joel McFarland

NCES’s Top Hits of 2019

As 2019 comes to an end, we’re taking stock of NCES’s most downloaded reports; most viewed indicators, Fast Facts, and blog posts; and most engaging tweets over the past year. As you reflect on 2019 and kick off 2020, we encourage you to take a few minutes to explore the wide range of education data NCES produces.

Top Five Reports, by PDF downloads

1. Condition of Education 2019 (8,526)

2. Condition of Education 2018 (5,789)

3. Status and Trends in the Education of Racial and Ethnic Groups 2018 (4,743)

4. Student Reports of Bullying: Results From the 2015 School Crime Supplement to the National Crime Victimization Survey (4,587)

5. Digest of Education Statistics 2017 (4,554)

Top Five Indicators from the Condition of Education, by number of web sessions

1. Children and Youth With Disabilities (86,084)

2. Public High School Graduation Rates (68,977)

3. Undergraduate Enrollment (58,494)

4. English Language Learners in Public Schools (50,789)

5. Education Expenditures by Country (43,474)

Top Five Fast Facts, by number of web sessions

1. Back to School Statistics (227,510)

2. College Graduate Rates (109,617)

3. Tuition Costs of Colleges and Universities (107,895)

4. College Endowments (71,056)

5. High School Dropout Rates (67,408)

Top Five Blog Posts, by number of web sessions

1. Free or Reduced Price Lunch: A Proxy for Poverty? (5,522)

2. Explore Data on Mental Health Services in K–12 Public Schools for Mental Health Awareness Month (4,311)

3. Educational Attainment Differences by Students’ Socioeconomic Status (3,903)

4. Education and Training Opportunities in America’s Prisons (3,877)

5. Measuring Student Safety: Bullying Rates at School (3,706)

Top Five Tweets, by number of impressions

1. Condition of Education (45,408 impressions)

2. School Choice in the United States (44,097 impressions)

3. NAEP Music and Visual Arts Assessment (32,440 impressions)

4. International Education Week (29,997 impressions)

5. Pop Quiz (25,188 impressions)

Be sure to check our blog site and the NCES website in 2020 to keep up to date with NCES’s latest activities and releases. You can also follow NCES on Twitter, Facebook, and LinkedIn for daily updates and content.

By Thomas Snyder

Homeschooled Students Participated in Family Learning Activities at Higher Rates Than Enrolled Students in 2016

About 3 percent of the school-age population—around 1.7 million students—was homeschooled in 2016. We know that homeschooled students have different educational experiences than students who are enrolled in public or private schools, and recently released data explore some of those differences.

The Parent and Family Involvement in Education survey of the National Household Education Surveys Program (NHES) provides information on homeschooled and public and private school students based on a nationally representative sample. Parents provide information about their children’s formal education and learning activities outside of school.

The survey asks about six broad types of family learning activities that students experienced in the month prior to the survey. The 2016 results indicate that homeschooled students were more likely than their peers enrolled in public or private schools to participate in five of these six activities.

In 2016, higher percentages of homeschooled students than of students enrolled in public or private schools visited a library; a bookstore; an art gallery, museum, or historical site; and a zoo or aquarium in the month prior to completion of the survey (figure 1). A higher percentage of homeschooled students also attended an event sponsored by a community, religious, or ethnic group with their parents in the month prior to completion of the survey. The one activity for which there was no measurable difference between homeschooled students and students enrolled in public or private schools was going to a play, concert, or other live show.
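
A note on "no measurable difference": survey estimates carry sampling error, so two percentages are typically declared different only when the gap between them exceeds roughly 1.96 combined standard errors. The sketch below illustrates the idea with invented numbers, not actual NHES estimates.

def measurably_different(p1, se1, p2, se2, z_crit=1.96):
    """True if the gap between two independent estimates exceeds 1.96
    combined standard errors (a two-sided test at about the .05 level)."""
    z = abs(p1 - p2) / (se1**2 + se2**2) ** 0.5
    return z > z_crit

# Hypothetical: library visits (homeschooled vs. enrolled students)
print(measurably_different(0.55, 0.02, 0.40, 0.01))  # True: a real gap
# Hypothetical: attending a live show
print(measurably_different(0.42, 0.03, 0.44, 0.01))  # False: "no measurable difference"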

Figure 1. Percentage of 5- to 17-year-old students participating in selected family learning activities in the past month, by homeschool and enrollment status: 2016

NOTE: Includes 5- to 17-year-old students in grades or grade equivalents of kindergarten through grade 12. Homeschooled students are school-age children who receive instruction at home instead of at a public or private school either all or most of the time. Excludes students who were enrolled in public or private school more than 25 hours per week and students who were homeschooled only because of temporary illness. Selected activities with the child may have included any member of the household.

SOURCE: U.S. Department of Education, National Center for Education Statistics, Parent and Family Involvement in Education Survey of the National Household Education Surveys Program (PFI-NHES), 2016.

The NHES data do not tell us why these differences exist, although factors such as parents’ available time and parenting styles may play a role. More research is needed to understand these differences.

A recent report, Homeschooling in the United States: Results from the 2012 and 2016 Parent and Family Involvement Survey (PFI-NHES:2012 and 2016), provides the full complement of data from the NHES about homeschoolers’ experiences in 2016. In addition to family learning activities, the report provides information about the following:

  • Homeschooler demographics

  • Reasons for homeschooling

  • Providers of homeschool instruction

  • Amount of time homeschoolers spent attending public schools, private schools, or college

  • Participation in local homeschool group activities

  • Homeschool teaching styles

  • Sources of homeschool curriculum and books

  • Online coursetaking of homeschool students

  • Homeschool subject areas

  • Parent expectations of homeschooled students’ future education

For more information on the National Household Education Surveys Program, please go to https://nces.ed.gov/nhes/.

By Sarah Grady

New International Comparisons of Reading, Mathematics, and Science Literacy Assessments

The Program for International Student Assessment (PISA) is a study of 15-year-old students’ performance in reading, mathematics, and science literacy that is conducted every 3 years. The PISA 2018 results provide us with a global view of U.S. students’ performance compared with their peers in nearly 80 countries and education systems. In PISA 2018, the major domain was reading literacy, although mathematics and science literacy were also assessed.

In 2018, the U.S. average score of 15-year-olds in reading literacy (505) was higher than the average score of the Organization for Economic Cooperation and Development (OECD) countries (487). Compared with the 76 other education systems with PISA 2018 reading literacy data, including both OECD and non-OECD countries, the U.S. average reading literacy score was lower than in 8 education systems, higher than in 57 education systems, and not measurably different in 11 education systems. The U.S. percentage of top performers in reading was larger than in 63 education systems, smaller than in 2 education systems, and not measurably different in 11 education systems. The average reading literacy score in 2018 (505) was not measurably different from the average score in 2000 (504), the first year PISA was administered. Among the 36 education systems that participated in both years, 10 education systems reported higher average reading literacy scores in 2018 compared with 2000, and 11 education systems reported lower scores.

The U.S. average score of 15-year-olds in mathematics literacy in 2018 (478) was lower than the OECD average score (489). Compared with the 77 other education systems with PISA 2018 mathematics literacy data, the U.S. average mathematics literacy score was lower than in 30 education systems, higher than in 39 education systems, and not measurably different in 8 education systems. The average mathematics literacy score in 2018 (478) was not measurably different from the average score in 2003 (483), the earliest year with comparable data. Among the 36 education systems that participated in both years, 10 systems reported higher mathematics literacy scores in 2018 compared with 2003, 13 education systems reported lower scores, and 13 education systems reported no measurable changes in scores.  

The U.S. average score of 15-year-olds in science literacy (502) was higher than the OECD average score (489). Compared with the 77 other education systems with PISA 2018 science literacy data, the U.S. average science literacy score was lower than in 11 education systems, higher than in 55 education systems, and not measurably different in 11 education systems. The average science literacy score in 2018 (502) was higher than the average score in 2006 (489), the earliest year with comparable data. Among the 52 education systems that participated in both years, 7 education systems reported higher average science literacy scores in 2018 compared with 2006, 22 education systems reported lower scores, and 23 education systems reported no measurable changes in scores.

PISA is conducted in the United States by NCES and is coordinated by OECD, an intergovernmental organization of industrialized countries. Further information about PISA can be found in the technical notes, questionnaires, list of participating OECD and non-OECD countries, released assessment items, and FAQs.

By Thomas Snyder

From Data Collection to Data Release: What Happens?

In today’s world, much scientific data is collected automatically from sensors and processed by computers in real time to produce instant analytic results. People have grown accustomed to instant data and expect to get information quickly.

At the National Center for Education Statistics (NCES), we are frequently asked why, in a world of instant data, it takes so long to produce and publish data from surveys. Although improvements in the timeliness of federal data releases have been made, there are fundamental differences between data compiled by automated systems and data specifically requested from federal survey respondents. Federal statistical surveys are designed to capture policy-related and research data from a range of targeted respondents across the country, who may not always be willing participants.

This blog is designed to provide a brief overview of the survey data processing framework, but it’s important to understand that the survey design phase is, in itself, a highly complex and technical process. In contrast to a management information system, in which an organization has complete control over data production processes, federal education surveys are designed to represent the entire country and require coordination with other federal, state, and local agencies. After the necessary coordination activities have been concluded, and the response periods for surveys have ended, much work remains to be done before the survey data can be released.

Survey Response

One of the first sources of potential delays is that some jurisdictions or individuals are unable to complete their surveys on time. Unlike opinion polls and online quizzes, which accept responses from anyone who chooses to participate (convenience samples), NCES surveys use rigorously formulated samples meant to properly represent specific populations, such as states or the nation as a whole. To ensure proper representation, NCES follows up with nonresponding sampled individuals, education institutions, school districts, and states to secure the maximum possible participation within the sample. Some large jurisdictions also have their own extensive survey operations to complete before they can provide information to NCES. For example, the New York City school district, which is larger than about two-thirds of all state education systems, must first gather information from all of its schools before it can respond to NCES surveys. Receipt of data from New York City and other large districts is essential to compiling nationally representative data.

Editing and Quality Reviews

Waiting for final survey responses does not mean that survey processing comes to a halt. One of the most important roles NCES plays in survey operations is editing and conducting quality reviews of incoming data, which take place on an ongoing basis. In these quality reviews, a variety of strategies are used to make cost-effective and time-sensitive edits to the incoming data. For example, in the Integrated Postsecondary Education Data System (IPEDS), individual higher education institutions upload their survey responses and receive real-time feedback flagging values that are out of range compared with prior submissions or that do not align logically with other responses. All NCES surveys use similar logic checks, in addition to a range of other editing checks appropriate to the specific survey. These checks typically look for responses that are out of range for a certain type of respondent.
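
The sketch below illustrates the flavor of such edit checks: a logic check that component counts do not exceed their total, and a range check against the prior year's submission. The field names, thresholds, and rules are illustrative assumptions, not IPEDS's actual edit specifications.

# Illustrative edit checks on an incoming survey submission.
def edit_checks(current, prior, max_change=0.25):
    flags = []
    # Logic check: components should not exceed their reported total.
    if current["part_time"] + current["full_time"] > current["total_enrollment"]:
        flags.append("full-time + part-time exceeds total enrollment")
    # Range check: large swings versus the prior submission are flagged
    # for the institution to confirm or correct.
    for field in ("total_enrollment", "completions"):
        old, new = prior[field], current[field]
        if old and abs(new - old) / old > max_change:
            flags.append(f"{field} changed {100*(new-old)/old:+.0f}% vs. prior year")
    return flags

prior   = {"total_enrollment": 12000, "completions": 2400}
current = {"total_enrollment": 17000, "completions": 2450,
           "full_time": 9000, "part_time": 8500}
for flag in edit_checks(current, prior):
    print("REVIEW:", flag)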

Although most checks are automated, some particularly complicated or large responses may require individual review. For IPEDS, the real-time feedback described above is followed by quality review checks conducted after the full dataset has been collected. These can result in individualized follow-up with institutions whose data still raise substantive questions.

Sample Weighting

In order to lessen the burden on the public and reduce costs, NCES collects data from selected samples of the population rather than taking a full census of the entire population for every study. In all sample surveys, a range of additional analytic tasks must be completed before data can be released. One of the more complicated tasks is constructing weights based on the original sample design and survey responses so that the collected data can properly represent the nation and/or states, depending on the survey. These sample weights are designed so that analyses can be conducted across a range of demographic or geographic characteristics and properly reflect the experiences of individuals with those characteristics in the population.
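
As a rough illustration of the first two steps, the sketch below computes a base weight as the inverse of each unit's selection probability and then adjusts it for nonresponse within weighting classes. Real NCES weighting involves further steps, such as calibration to population totals, weight trimming, and replicate weights for variance estimation; the data here are invented.

sample = [
    # (weighting_class, selection_probability, responded)
    ("urban", 1/500, True), ("urban", 1/500, False), ("urban", 1/500, True),
    ("rural", 1/200, True), ("rural", 1/200, True),
]

# Step 1: base weight = 1 / P(selection).
base = [(cls, 1/p, resp) for cls, p, resp in sample]

# Step 2: within each class, respondents absorb the weight of
# nonrespondents so the class still represents its full share
# of the population.
final_weights = []
for cls in {c for c, _, _ in base}:
    rows = [(w, resp) for c, w, resp in base if c == cls]
    adjustment = sum(w for w, _ in rows) / sum(w for w, r in rows if r)
    final_weights += [(cls, w * adjustment) for w, r in rows if r]

for cls, w in sorted(final_weights):
    print(f"{cls}: final weight {w:.0f}")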

If the survey response rate is too low, a “survey bias analysis” must be completed to ensure that the results will be sufficiently reliable for public use. For longitudinal surveys, such as the Early Childhood Longitudinal Study, multiple sets of weights must be constructed so that researchers using the data will be able to appropriately account for respondents who answered some but not all of the survey waves.
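
One common form of such a bias analysis compares respondents with the full sample on characteristics known for everyone from the sampling frame; a large gap signals potential bias that the weighting adjustments must address. A minimal sketch, with made-up data:

# Frame records: (sector, responded). Sector is known for every sampled
# unit, so respondent and full-sample distributions can be compared.
full_sample = [("public", True), ("public", True), ("public", False),
               ("private", True), ("private", False), ("private", False)]

def share_public(units):
    return sum(1 for sector, _ in units if sector == "public") / len(units)

respondents = [u for u in full_sample if u[1]]
print(f"Public share, full sample: {share_public(full_sample):.0%}")
print(f"Public share, respondents: {share_public(respondents):.0%}")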

NCES surveys also include “constructed variables” to facilitate more convenient and systematic use of the survey data. Examples of constructed variables include socioeconomic status and family type. Other types of survey data also require special analytic considerations before they can be released. Student assessment data, such as those from the National Assessment of Educational Progress (NAEP), require a number of highly complex processes to ensure proper estimation for the various populations represented in the results. For example, the standardized scoring of multiple-choice and open-ended items alone can take thousands of hours of design and analysis work.
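
As an illustration of a constructed variable, the sketch below derives a simple socioeconomic-status composite by standardizing several components and averaging them. The components and the equal weighting are assumptions for illustration; each NCES study documents its own construction in its codebook.

from statistics import mean, stdev

households = [
    {"parent_ed_years": 12, "income": 38000, "occ_prestige": 35},
    {"parent_ed_years": 16, "income": 82000, "occ_prestige": 60},
    {"parent_ed_years": 18, "income": 120000, "occ_prestige": 75},
]

def zscores(values):
    m, s = mean(values), stdev(values)
    return [(v - m) / s for v in values]

# Standardize each component, then average them into one composite.
components = ["parent_ed_years", "income", "occ_prestige"]
z_by_component = [zscores([h[c] for h in households]) for c in components]
for i, h in enumerate(households):
    h["ses"] = mean(z[i] for z in z_by_component)
    print(f"Household {i}: SES composite {h['ses']:+.2f}")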

Privacy Protection

Release of data by NCES carries a legal requirement to protect the privacy of our nation’s children. Each NCES public-use dataset undergoes a thorough evaluation to ensure that it cannot be used to identify responses of individuals, whether they are students, parents, teachers, or principals. The datasets must be protected through item suppression, statistical swapping, or other techniques to ensure that multiple datasets cannot be combined in such a way as to identify any individual. This is a time-consuming process, but it is incredibly important to protect the privacy of respondents.
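
Two of the techniques mentioned, cell suppression and data swapping, can be sketched in a few lines. The threshold and swap rate below are illustrative only; the parameters used in actual disclosure limitation are deliberately not part of public documentation.

import random

def suppress_small_cells(table, min_count=3):
    """Replace counts below a threshold so that rare combinations of
    characteristics cannot single anyone out."""
    return {cell: (n if n >= min_count else "‡") for cell, n in table.items()}

def swap_attribute(records, key, rate=0.05, rng=random.Random(0)):
    """Swap one attribute between random pairs of records, preserving the
    overall distribution while breaking exact record linkage."""
    records = [dict(r) for r in records]  # don't mutate the caller's data
    k = max(1, int(len(records) * rate))
    for _ in range(k):
        i, j = rng.sample(range(len(records)), 2)
        records[i][key], records[j][key] = records[j][key], records[i][key]
    return records

print(suppress_small_cells({("rural", "age 16"): 2, ("urban", "age 16"): 48}))
students = [{"id": n, "district": d} for n, d in enumerate("AABBBCCDDD")]
print(swap_attribute(students, "district"))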

Data and Report Release

When the final data have been received and edited, the necessary variables have been constructed, and the privacy protections have been implemented, there is still more that must be done to release the data. The data must be put in appropriate formats with the necessary documentation for data users. NCES reports with basic analyses or tabulations of the data must be prepared. These products are independently reviewed within the NCES Chief Statistician’s office.

Depending on the nature of the report, the Institute of Education Sciences Standards and Review Office may conduct an additional review. After all internal reviews have been conducted, revisions have been made, and the final survey products have been approved, the U.S. Secretary of Education’s office is notified 2 weeks in advance of the pending release. During this notification period, appropriate press release materials and social media announcements are finalized.

Although NCES can expedite some product releases, the work of preparing survey data for release often takes a year or more. NCES strives to balance timeliness with the reliable, high-quality information expected of a federal statistical agency, while also protecting the privacy of our respondents.

By Thomas Snyder