IES Blog

Institute of Education Sciences

Measuring Student Safety: New Data on Bullying Rates at School

Bullying remains a serious issue for students and their families, as well as for policymakers, administrators, and educators. NCES is committed to providing reliable and timely data on bullying to measure the extent of the problem and track progress toward reducing its prevalence. To that end, NCES just released a new set of web tables focusing on bullying rates at school. These tables use data from the School Crime Supplement to the National Crime Victimization Survey, which measures bullying by asking a nationally representative sample of students ages 12–18 whether they had been bullied at school. This blog post highlights data from these newly released web tables.

In 2019, about 22 percent of students reported being bullied at school during the school year (figure 1). This percentage was lower than a decade earlier (2009), when 28 percent of students reported being bullied at school.

Students’ reports of being bullied varied based on student and school characteristics in 2019. For instance, a higher percentage of female students than of male students reported being bullied at school during the school year (25 vs. 19 percent). The percentage of students who reported being bullied at school was higher for students of Two or more races (37 percent) than for White students (25 percent) and Black students (22 percent), which were in turn higher than the percentage of Asian students (13 percent). Higher percentages of 6th-, 7th-, and 8th-graders reported being bullied at school (ranging from 27 to 28 percent), compared with 9th-, 10th-, and 12th-graders (ranging from 16 to 19 percent). A higher percentage of students enrolled in schools in rural areas (28 percent) than in schools in other locales (ranging from 21 to 22 percent) reported being bullied at school.


Figure 1. Percentage of students ages 12–18 who reported being bullied at school during the school year, by selected student and school characteristics: 2019

Horizontal bar chart showing the percentage of students ages 12–18 who reported being bullied at school during the school year in 2019, by selected student characteristics (sex, race/ethnicity, and grade) and school characteristics (locale and control of school)

1 Total includes race categories not separately shown.
2 Race categories exclude persons of Hispanic ethnicity. Data for Pacific Islander and American Indian/Alaska Native students did not meet reporting standards in 2019; therefore, data for these two groups are not shown.
3 Excludes students with missing information about the school characteristic.
NOTE: “At school” includes in the school building, on school property, on a school bus, and going to and from school. Although rounded numbers are displayed, the figures are based on unrounded data.
SOURCE: U.S. Department of Justice, Bureau of Justice Statistics, School Crime Supplement (SCS) to the National Crime Victimization Survey, 2019. See Digest of Education Statistics 2020, table 230.40.


Not all students chose to report the bullying to adults at school. Among students ages 12–18 who reported being bullied at school during the school year in 2019, about 46 percent reported notifying an adult at school about the incident. This percentage was higher for Black students than for White students (61 vs. 47 percent), and both percentages were higher than the percentage for Hispanic students (35 percent).

For more details on these data, see the web tables from “Student Reports of Bullying: Results from the 2019 School Crime Supplement to the National Crime Victimization Survey.” For additional information on this topic, see the Condition of Education indicator Bullying at School and Electronic Bullying. For indicators on other topics related to school crime and safety, select “School Crime and Safety” on the Explore by Indicator Topics page.

 

By Ke Wang, AIR

Rescaled Data Files for Analyses of Trends in Adult Skills

In January 2022, NCES released the rescaled data files for three adult literacy assessments conducted several decades earlier: the 1992 National Adult Literacy Survey (NALS), the 1994 International Adult Literacy Survey (IALS), and the 2003 Adult Literacy and Lifeskills Survey (ALL). By connecting the rescaled data from these assessments with data from the current adult literacy assessment, the Program for the International Assessment of Adult Competencies (PIAAC), researchers can examine trends in adult skills in the United States going back to 1992. This blog post traces the history of each of these adult literacy assessments, describes the files and explains what “rescaling” means, and discusses how these files can be used in analyses in conjunction with the PIAAC files. The last section of the post offers several example analyses of the data.

A Brief History of International and National Adult Literacy Assessments Conducted in the United States

The rescaled data files highlighted in this blog post update and combine historical data from national and international adult literacy studies that have been conducted in the United States.

NALS was conducted in 1992 by NCES and assessed U.S. adults in households, as well as adults in prisons. IALS—developed by Statistics Canada and ETS in collaboration with 22 participating countries, including the United States—assessed adults in households and was administered in three waves between 1994 and 1998. ALL was administered in 11 countries, including the United States, and assessed adults in two waves between 2003 and 2008.

PIAAC seeks to ensure continuity with these previous surveys, but it also expands on their quality assurance standards, extends the definitions of literacy and numeracy, and provides more information about adults with low levels of literacy by assessing reading component skills. It also, for the first time, includes a problem-solving domain to emphasize the skills used in digital (originally called “technology-rich”) environments.

How Do the Released Data Files From the Earlier Studies of Adult Skills Relate to PIAAC?

All three of the released restricted-use data files (for NALS, IALS, and ALL) relate to PIAAC, the latest adult skills assessment, in different ways.

The NALS data file contains literacy estimates and background characteristics of U.S. adults in households and in prisons in 1992. It is comparable to the PIAAC data files for 2012/14 and 2017 through rescaling of the assessment scores and matching of the background variables to those of PIAAC.

The IALS and ALL data files contain literacy (IALS and ALL) and numeracy (ALL) estimates and background characteristics of U.S. adults in 1994 (IALS) and 2003 (ALL). Like NALS, they are comparable to the PIAAC restricted-use data (2012/14) through rescaling of the literacy and numeracy assessment scores and matching of the background variables to those of PIAAC. These estimates are also comparable to the international estimates of adult skills in several other countries, including Canada, Hungary, Italy, Norway, the Netherlands, and New Zealand (see the recently released Data Point International Comparisons of Adult Literacy and Numeracy Skills Over Time). While the NCES datasets contain only the U.S. respondents, IALS and ALL are international studies, and data from other participating countries can be requested from Statistics Canada (see the IALS Data Files/Publications and ALL Data pages for more detail). See the History of International and National Adult Literacy Assessments page for additional background on these studies.

Table 1 provides an overview of the rescaled NALS, IALS, and ALL data files.


Table 1. Overview of the rescaled data files for the National Adult Literacy Survey (NALS), International Adult Literacy Survey (IALS), and Adult Literacy and Lifeskills Survey (ALL) 

Table showing overview of the rescaled data files for the National Adult Literacy Survey (NALS), International Adult Literacy Survey (IALS), and Adult Literacy and Lifeskills Survey


What Does “Rescaled” Mean?

“Rescaling” the literacy (NALS, IALS, ALL) and numeracy (ALL) domains from these three previous studies means that they were put on the same scale as the PIAAC domains: updated proficiency estimates were derived using the same statistical models used to create the PIAAC skills proficiencies. Rescaling was possible because PIAAC administered a sufficient number of the same test questions used in NALS, IALS, and ALL.1 These rescaled proficiency estimates allow for trend analysis of adult skills across the time points provided by each study.
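NCES derived the rescaled estimates with the same item response theory (IRT) models used for PIAAC, which cannot be reproduced in a few lines. As a much simplified illustration of what putting scores “on the same scale” involves, the sketch below linearly links a set of scores to a reference scale by matching its mean and standard deviation; the function name and all numbers are invented for illustration and are not the NCES procedure.

```python
from statistics import mean, pstdev

def linear_link(scores, target_mean, target_sd):
    """Map scores onto a reference scale by matching mean and SD.

    A toy linear-linking transform, NOT the IRT-based procedure
    NCES used; it only illustrates the idea of expressing two
    assessments on a common scale.
    """
    m, s = mean(scores), pstdev(scores)
    return [target_mean + target_sd * (x - m) / s for x in scores]

# Hypothetical scores from an older assessment, linked to a
# reference scale with mean 272 and SD 50 (numbers invented).
old_scores = [250, 270, 290]
linked = linear_link(old_scores, target_mean=272, target_sd=50)
```

After linking, the transformed scores have the reference scale's mean and standard deviation, so estimates from the two assessments can be compared on one metric; the real rescaling instead re-estimates proficiencies from the common test items.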

What Can These Different Files Be Used For?

While mixing the national and international trend lines isn’t recommended, both sets of files have their own distinct advantages and purposes for analysis.

National files

The rescaled NALS 1992 files can be used for national trend analyses with the PIAAC national trend points in 2012/2014 and 2017. Some potential analytic uses of the NALS trend files are to

  • Provide a picture of the skills of adults only in the United States;
  • Examine the skills of adults in prison and compare their skills with those of adults in households over time, given that NALS and PIAAC include prison studies conducted in 1992 and 2014, respectively;
  • Conduct analyses on subgroups of the population (such as those ages 16–24 or those with less than a high school education) because the larger sample size of NALS allows for more detailed breakdowns along with the U.S. PIAAC sample;
  • Focus on the subgroup of older adults (ages 66–74), given that NALS sampled adults over the age of 65, similar to PIAAC, which sampled adults ages 16–74; and
  • Analyze U.S.-specific background questions (such as those on race/ethnicity or health-related practices).

International files

The rescaled IALS 1994 and ALL 2003 files can be used for international trend analyses among six countries with the U.S. PIAAC international trend point in 2012/2014: Canada, Hungary, Italy, Norway, the Netherlands, and New Zealand. Some potential analytic uses of the IALS and ALL trend files are to

  • Compare literacy proficiency results internationally and over time using the results from IALS, ALL, and PIAAC; and
  • Compare numeracy proficiency results internationally and over time using the results from ALL and PIAAC.

Example Analyses Using the U.S. Trend Data on Adult Literacy

Below are examples of a national trend analysis and an international trend analysis conducted using the rescaled NALS, IALS, and ALL data in conjunction with the PIAAC data.

National trend estimates

The literacy scores of U.S. adults increased from 269 in NALS 1992 to 272 in PIAAC 2012/2014. However, the PIAAC 2017 score of 270 was not significantly different from the 1992 or 2012/2014 scores.


Figure 1. Literacy scores of U.S. adults (ages 16–65) along national trend line: Selected years, 1992–2017

Line graph showing literacy scores of U.S. adults (ages 16–65) along national trend line for NALS 1992, PIAAC 2012/2014, and PIAAC 2017

* Significantly different (p < .05) from NALS 1992 estimate.
SOURCE: U.S. Department of Education, National Center for Education Statistics, National Adult Literacy Survey (NALS), NALS 1992; and Program for the International Assessment of Adult Competencies (PIAAC), PIAAC 2012–17.


International trend estimates

The literacy scores of U.S. adults decreased from 273 in IALS 1994 to 268 in ALL 2003 before increasing to 272 in PIAAC 2012/2014. However, the PIAAC 2012/2014 score was not significantly different from the IALS 1994 score.


Figure 2. Literacy scores of U.S. adults (ages 16–65) along international trend line: Selected years, 1994–2012/14

Line graph showing literacy scores of U.S. adults (ages 16–65) along international trend line for IALS 1994, ALL 2003, and PIAAC 2012/2014

* Significantly different (p < .05) from IALS 1994 estimate.
SOURCE: U.S. Department of Education, National Center for Education Statistics, Statistics Canada and Organization for Economic Cooperation and Development (OECD), International Adult Literacy Survey (IALS), 1994–98; Adult Literacy and Lifeskills Survey (ALL), 2003–08; and Program for the International Assessment of Adult Competencies (PIAAC), PIAAC 2012/14. See figure 1 in the International Comparisons of Adult Literacy and Numeracy Skills Over Time Data Point.


How to Access the Rescaled Data Files

More complex analyses can be conducted with the NALS, IALS, and ALL rescaled data files. These are restricted-use files, and researchers must obtain a restricted-use license to access them. Further information about these files is available on the PIAAC Data Files page (see the “International Trend Data Files and Data Resources” and “National Trend Data Files and Data Resources” sections at the bottom of the page).

By Emily Pawlowski, AIR, and Holly Xie, NCES


[1] In contrast, the 2003 National Assessment of Adult Literacy (NAAL), another assessment of adult literacy conducted in the United States, was not rescaled for trend analyses with PIAAC. For various reasons, including the lack of overlap between the NAAL and PIAAC literacy items, NAAL and PIAAC are thought to be the least comparable of the adult literacy assessments.

New Report on the Effects of the Coronavirus Pandemic on Undergraduate Student Experiences in Spring 2020

NCES recently released a report on the experiences of undergraduate students early in the COVID-19 pandemic. The report uses early release data from the 2019–20 National Postsecondary Student Aid Study (NPSAS:20) to describe how the pandemic disrupted students’ enrollment, housing, and finances in the spring of 2020. It also discusses how institutions helped students with these issues.

NPSAS:20 student surveys started in March 2020, and items about the COVID-19 pandemic were added in April 2020 to collect information about the effects of the pandemic on students’ educational experiences between January 1 and June 30, 2020. These early release data do not include students who answered before the pandemic questions were added. The data show that 87 percent of students had their enrollment disrupted or changed during this time. Students who experienced disruptions may have withdrawn or taken a leave of absence, had an extended school break, had changes made to their study-abroad program, or had classes cancelled or moved online.

Twenty-eight percent of students had a housing disruption or change, and 40 percent had a financial disruption or change. Students who had a housing disruption had to move or had difficulty finding safe and stable housing. Students who had a financial disruption may have lost a job or income or may have had difficulty getting food; they may have also received financial help from their postsecondary institutions.

The report also provides information about the experiences of students with different characteristics. For example, students with Pell Grants had a similar rate of enrollment disruption (87 percent) as those without them (88 percent). Those with Pell Grants had a lower rate of housing disruption (23 percent) than those without them (31 percent). However, they had a higher rate of financial disruption (48 percent) than their peers without Pell Grants (34 percent).


Figure 1. Percentage of undergraduates who experienced enrollment, housing, or financial disruptions or changes at their institution due to the COVID-19 pandemic, by Pell Grant recipient status: Spring 2020

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2019–20 National Postsecondary Student Aid Study (NPSAS:20, preliminary data).


The final NPSAS:20 raw data will be available in late 2022. Sign up for the IES Newsflash to receive announcements about NCES data products.

 

By Tracy Hunt-White

New Data Reveal Public School Enrollment Decreased 3 Percent in 2020–21 School Year

NCES recently released revised Common Core of Data (CCD) Preliminary Files, which are the product of the school year (SY) 2020–21 CCD data collection. CCD, the Department of Education’s primary database on public elementary and secondary education in the United States, provides comprehensive annual data on enrollment, school finances, and student graduation rates.

Here are a few key takeaways from the newly released data files:

Public school enrollment in SY 2020–21 was lower than it was in SY 2019–20.

Overall, the number of students enrolled in public schools decreased by 3 percent from SY 2019–20 to SY 2020–21. Note that Illinois did not submit data in time to be included in this preliminary report. The SY 2019–20 and SY 2020–21 total enrollment counts for California, Oregon, American Samoa, and the Bureau of Indian Education do not include prekindergarten counts.

The rate of decline in public school enrollment in SY 2020–21 was not consistent across all states.

Among states, the largest decreases were in Mississippi and Vermont (5 percent each), followed by Washington, New Mexico, Kentucky, New Hampshire, and Maine (each between 4 and 5 percent) (figure 1). Eighteen states had decreases of 3 percent or more; 29 states had decreases between 1 and 3 percent; and the District of Columbia, South Dakota, and Utah had changes of less than 1 percent.



Lower grade levels experienced a greater rate of decline in public school enrollment than did higher grade levels in SY 2020–21.

Public school enrollment decreased by 13 percent for prekindergarten and kindergarten and by 3 percent for grades 1–8. Public school enrollment increased by 0.4 percent for grades 9–12.
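The grade-band figures above are simple percent changes between the two school years' enrollment counts. A minimal sketch of that arithmetic follows; the counts are invented for illustration and are not CCD data.

```python
def pct_change(old_count, new_count):
    """Percent change from old_count to new_count."""
    return (new_count - old_count) / old_count * 100

# Invented enrollment counts for illustration only.
assert pct_change(100_000, 87_000) == -13.0                # 13 percent decrease
assert round(pct_change(1_000_000, 1_004_000), 1) == 0.4   # 0.4 percent increase
```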

Most other jurisdictions experienced declines in public school enrollment in SY 2020–21.

Public school enrollment decreased in Puerto Rico (6 percent), Guam (5 percent), and American Samoa (2 percent). The Virgin Islands, however, experienced an increase of less than 1 percent.

To access the CCD preliminary data files and learn more about public school enrollment in SY 2020–21, visit the CCD data files webpage.

Online Training for the 2019 NHES Early Childhood Program Participation Survey Data and Parent and Family Involvement in Education Survey Data

The NCES National Household Education Surveys (NHES) program administered two national surveys in 2019—the Early Childhood Program Participation (ECPP) survey and the Parent and Family Involvement in Education (PFI) survey. The ECPP survey collects information on young children’s care and education, including home-based care provided by relatives and nonrelatives as well as center-based care and education. The survey also examines how well these care arrangements cover parents’ work hours, the cost and location of care, how families select care, and the factors that make finding care difficult. The PFI survey collects information on a range of issues related to how families connect to schools, including family involvement with schools, school choice, homeschooling, virtual education, and homework practices.

NCES released data from the 2019 NHES administration on January 28, 2021. For each of the two surveys, this release includes the following:

  • Public-use data files, in ASCII, CSV, SAS, SPSS, Stata, and R
  • Restricted-use data files (in formats listed above and with codebook)
  • Public-Use Data File Codebook
  • Data File User’s Manual (for both public-use and restricted-use files)

That’s a lot of information! How should you use it? We suggest you start by viewing the NHES online Distance Learning Dataset Training modules. The modules provide a high-level overview of the NHES program and the data it collects. They also cover important considerations for ensuring that your analysis accounts for the NHES’s complex sample design (such as applying weights and estimating standard errors).
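As a sketch of what “applying weights and estimating standard errors” involves with replicate weights (the usual approach for NCES complex samples), the example below computes a weighted mean with a full-sample weight and a replicate-based standard error. The toy data, the variable layout, and the variance formula's multiplier are placeholders; the training modules and the Data File User's Manual document the actual NHES weight variables and jackknife formula.

```python
def weighted_mean(values, weights):
    """Estimate a mean using survey weights."""
    return sum(v * w for v, w in zip(values, weights)) / sum(weights)

def replicate_se(values, full_weights, replicate_weights):
    """Replicate-weight standard error: re-estimate with each set of
    replicate weights and combine squared deviations from the
    full-sample estimate. The multiplier used here (1.0) is a
    placeholder; the correct one depends on the survey's design."""
    full = weighted_mean(values, full_weights)
    sq_dev = sum((weighted_mean(values, rw) - full) ** 2
                 for rw in replicate_weights)
    return sq_dev ** 0.5

# Toy data: 3 respondents, 2 replicate-weight columns (all invented).
y = [1.0, 2.0, 3.0]
w_full = [1.0, 1.0, 1.0]
w_reps = [[1.0, 1.0, 0.0], [0.0, 1.0, 1.0]]
est = weighted_mean(y, w_full)        # full-sample estimate
se = replicate_se(y, w_full, w_reps)  # replicate-based SE
```

Ignoring the weights and the replicate structure (e.g., computing a simple mean and a simple-random-sample SE) generally produces biased estimates and understated standard errors, which is why the modules stress these steps.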

You should first view the five general NHES modules, which were developed for the 2012 NHES data. These modules are:

  • Introduction to the NHES
  • Getting Started with the NHES Data
  • Data Collected Through the NHES
  • NHES Sample Design, Weights, Variance, and Missing Data
  • Considerations for Analysis of NHES Data

A sixth module explains key changes in the 2019 ECPP and PFI surveys compared with their 2012 counterparts:

  • Introduction to the 2019 NHES Data Collection

The sixth module also provides links to the 2019 ECPP and PFI data, restricted-use licensing information, and other helpful resources.

Now you are ready to go! If you have any questions, please contact us at NHES@ed.gov.

By Lisa Hudson, NCES