IES Blog

Institute of Education Sciences

New Data Show Growth in Online Bullying

The vast majority of middle and high school students have an online presence, heightening awareness of and concern about cyberbullying. A new report from the National Center for Education Statistics (NCES) shows that reports of students being bullied online or by text are growing.

According to results from Student Reports of Bullying: Results From the 2017 School Crime Supplement to the National Crime Victimization Survey, 20 percent of students reported being bullied during the 2016–17 school year. Of those students, 15 percent reported being bullied online or by text, which is an increase from 11.5 percent during the 2014–15 school year.

During the 2016–17 school year, students’ reports of bullying online or by text differed by sex, race, and school level. For instance, the percentage of female students who reported being bullied online or by text (21 percent) was three times that of male students (7 percent), and about 17 percent of White students reported being bullied online or by text, compared with 12 percent of students of other races. Also, a higher percentage of high school students (19 percent) than middle school students (12 percent) reported being bullied online or by text.

In the 2017 School Crime Supplement (SCS), students who were bullied online or by text reported three key types of bullying at higher rates than students who were bullied only in person.

  • A higher percentage of students bullied online reported being made fun of, called names, or insulted (74 percent) than students bullied in person only (63 percent).
  • Ninety percent of students bullied online reported that rumors were spread about them, compared with 62 percent of those bullied in person only.
  • Thirty-nine percent of students bullied online reported being excluded from activities on purpose, compared with 23 percent of students bullied in person only.


Browse the full report for more bullying estimates from the 2016–17 school year.

 

By Rachel Hansen

 

References

Lessne, D., and Yanez, C. (2016). Student Reports of Bullying: Results From the 2015 School Crime Supplement to the National Crime Victimization Survey (NCES 2017-015). U.S. Department of Education. Washington, DC: National Center for Education Statistics. Retrieved March 28, 2019, from https://nces.ed.gov/pubs2017/2017015.pdf.

Yanez, C., and Seldin, M. (2019). Student Reports of Bullying: Results From the 2017 School Crime Supplement to the National Crime Victimization Survey (NCES 2019-054). U.S. Department of Education. Washington, DC: National Center for Education Statistics. Forthcoming.

Differences in Postsecondary Enrollment and Employment by Socioeconomic Status

New data suggest that the socioeconomic status of high school freshmen plays a role in their future education and employment.  

The data come from the NCES High School Longitudinal Study of 2009 (HSLS:09), which follows a nationally representative group of ninth-graders. In 2009, NCES measured the socioeconomic status (SES) of these students by collecting data on the income, occupation, and educational attainment of their parents or guardians. In 2016, NCES conducted a follow-up survey with the 2009 ninth-graders, gathering data on their educational and employment status.   

Data show that 2009 ninth-graders who were in the lowest-SES category were 20 percentage points more likely to be neither enrolled in postsecondary education nor working in 2016 than those in the highest-SES category (figure 1). These students were also 50 percentage points less likely to be enrolled in postsecondary institutions than those in the highest-SES category (figure 2).


These findings are just a glimpse into the insights on socioeconomic mobility that HSLS:09 can generate by linking data on parent and child educational attainment and employment.

Check out our recent spotlight indicator in the Condition of Education for more information on how the educational and employment outcomes of young adults varied in relation to family socioeconomic status.

 

By Joel McFarland


Taking Discovery to Scale

Along with my NCEE colleagues, I was excited to read the recent Notice Inviting Applications for the next cycle of Comprehensive Centers, administered by the Department’s Office of Elementary and Secondary Education.

As you can see in the notice, Regional Comprehensive Centers will “provide high-quality intensive capacity-building services to State clients and recipients to identify, implement, and sustain effective evidence-based programs, practices, and interventions that support improved educator and student outcomes,” with a special emphasis on benefitting disadvantaged students, students from low-income families, and rural populations.

With this focus on supporting implementation, Regional Comprehensive Centers (RCCs) can amplify the work of NCEE’s Regional Educational Laboratories (RELs) and What Works Clearinghouse (WWC). Learning from states, districts, and schools to understand their unique needs, and then being able to support high-quality implementation of evidence-based practices that align with those needs, has the potential to dramatically accelerate the process of improving outcomes for students.

RELs and the WWC already collaborate with today’s Comprehensive Centers, of course. But it’s easy to see how stronger and more intentional relationships between them could increase each program’s impact.

True to its name, the REL program has worked with educators to design and evaluate innovative practices – or identify, implement, and refine existing ones – to meet regional and local needs for more than 50 years. And since its inception in 2002, the WWC has systematically identified and synthesized high-quality evidence about the effectiveness of education programs, policies, and practices so that educators and other instructional leaders can put that information to use improving outcomes for students. But with more than 3.6 million teachers spread across more than 132,000 public and private schools nationwide, making sure discoveries from education science are implemented at scale and with fidelity is no small feat. RCCs are welcome partners in that work.

Figure: How RELs, the What Works Clearinghouse, and Regional Comprehensive Centers could most effectively collaborate across a continuum from discovery to scale.

RELs, the WWC, and Comprehensive Centers can play critical, complementary roles in taking discovery to scale (see Figure). With their analysis, design, and evaluation expertise, RELs – in partnership with states and districts, postsecondary institutions, and other stakeholders – can begin the process by designing and rigorously evaluating best practices that meet local or regional needs. (Or, as I will discuss in future messages, by developing and rigorously testing materials that support adoption of evidence-based practices.) The WWC follows, vetting causal impact studies and synthesizing their findings to better understand the strength of evidence that supports a practice and identify its likely impact. Partners in the Comprehensive Centers can then “pick up” those WWC-vetted practices, aligning them to the needs of State and other clients and supporting and sustaining implementation at scale. Finally, lessons learned from RCCs’ implementation efforts about what worked – and what didn’t – can be fed back to RELs, refining the practice and fueling the next cycle of discovery.

Those who follow the REL-WWC-RCC process know that what I’ve just described isn’t quite how these programs operate today. Sometimes, out of necessity, roles are more “fluid” and efforts are somewhat less well-aligned. The approach of “taking discovery to scale” described above provides one way of thinking about how each program can play a unique, but interdependent, role with the other two.

I have every confidence this is possible. After all, the North Star of each program is the same: improving outcomes for students. And that means we have a unique opportunity. One we’d be remiss not to seize.

 

Matthew Soldner
Commissioner, National Center for Education Evaluation and Regional Assistance
Institute of Education Sciences
U.S. Department of Education

 

As always, your feedback is welcome. You can email the Commissioner at matthew.soldner@ed.gov.


NCEE is hiring!

The U.S. Department of Education’s Institute of Education Sciences (IES) is seeking professionals in education-related fields to apply for an open position in the National Center for Education Evaluation and Regional Assistance (NCEE). Located in NCEE’s Evaluation Division, this position would support impact evaluations and policy implementation studies. Learn more about our work here: https://ies.ed.gov/ncee.

If you are even potentially interested in this sort of position, you are strongly encouraged to set up a profile in USAJobs (https://www.usajobs.gov/) and to upload your information now. As you build your profile, include all relevant research experience on your resume, whether acquired in a paid or unpaid position. The position will open in USAJobs on July 15, 2019, and will close as soon as 50 applications are received, or on July 29, 2019, whichever is earlier. Getting everything in can take longer than you might expect, so please apply as soon as the position opens in USAJobs (look for vacancy number IES-2019-0023).

 

Introducing the 2020 Classification of Instructional Programs (CIP) and Its Website

The National Center for Education Statistics (NCES) is pleased to announce the release of the 2020 Classification of Instructional Programs (CIP), which reflects the various programs of study being offered at postsecondary institutions around the country. This is the sixth edition of the CIP and contains more than 300 new programs of study, which can be searched on the new 2020 CIP website.

The CIP is updated about every 10 years to reflect changes in instructional program structures and the introduction of new fields of study. Beginning next year, postsecondary institutions will use the 2020 CIP when they report the degrees and certificates awarded for the 2020 Integrated Postsecondary Education Data System (IPEDS) Completions Survey.

The CIP is a taxonomy of instructional programs that provides a classification system for the thousands of different programs offered by postsecondary institutions. Its purpose is to facilitate the organization, collection, and reporting of fields of study and program completions. CIP codes and IPEDS Completions Survey data are used by many different groups of people for many different reasons. For instance, economists use the data to study emerging labor pools and identify people with specific training and skills. The business community uses IPEDS Completions Survey data to help recruit minority and female candidates in specialized fields by identifying the numbers of these students who are graduating from specific institutions. Prospective college students can use the data to look for institutions offering specific programs of postsecondary study at all levels, from certificates to doctoral degrees.
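Because the CIP is hierarchical, a six-digit code can be decomposed into the broader groupings it belongs to: a two-digit series and a four-digit subseries. The short Python sketch below illustrates that structure; the example code and the idea of splitting it this way are for illustration only and are not drawn from the official CIP files.

```python
def cip_levels(code: str) -> dict:
    """Split a six-digit CIP code ("XX.XXXX") into its hierarchy levels.

    CIP codes nest: the first two digits identify a broad series,
    the first four a subseries, and all six a specific program.
    """
    series, detail = code.split(".")
    return {
        "series": series,                       # 2-digit, e.g. "13"
        "subseries": f"{series}.{detail[:2]}",  # 4-digit, e.g. "13.01"
        "program": code,                        # 6-digit, e.g. "13.0101"
    }

# Example (hypothetical usage): decompose one code into its levels.
print(cip_levels("13.0101"))
```

This kind of decomposition is what lets completions data be aggregated at whichever level of the taxonomy a user needs.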

To allow sufficient time for institutions to update their reporting systems, NCES is releasing the 2020 CIP and the new website approximately one year before it will be implemented.


The 2020 CIP website has many features, including multiple search options, an FAQ section, resources, a help page, and contact information. Users can search the 2020 CIP by code or keyword, and the resources page contains lists of new, moved, and deleted CIP codes as well as Word and Excel versions of the 2020 CIP and 2010 CIP. The website also contains an online data tool called the CIP Wizard, which enables users to focus on changes at a specific institution between the 2010 and 2020 CIPs.


The CIP Wizard requires users to specify an institution by either name or IPEDS ID, a unique identification number assigned by NCES. The Wizard then searches the last 3 years of the IPEDS Completions Survey and compiles the CIP codes used by that institution. The Wizard also crosswalks an institution’s 2010 CIP codes to its 2020 CIP codes and generates a report that sorts the codes into the following categories:

  • No substantive changes—codes that did not change from the previous version of the CIP
  • New codes—codes that were added to this version of the CIP
  • Moved codes—codes that were relocated and have two references: one in the former location and one in the current location
  • Deleted codes—codes that were removed from the previous version of the CIP
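The sorting step amounts to comparing an institution’s 2010 codes against a 2010-to-2020 crosswalk. The Python sketch below illustrates that logic; the crosswalk entries, code values, and function names are hypothetical and are not taken from the actual CIP crosswalk files.

```python
# Hypothetical 2010 -> 2020 crosswalk (not real CIP data): each 2010
# code maps to its 2020 code, or to None if it was deleted.
crosswalk = {
    "11.0101": "11.0101",   # unchanged between editions
    "13.1320": "13.1329",   # moved to a new location
    "08.0503": None,        # deleted in the 2020 edition
}

def categorize(institution_codes, crosswalk, new_2020_codes):
    """Bucket an institution's 2010 CIP codes the way the Wizard's report does."""
    report = {
        "unchanged": [],
        "moved": [],      # moved codes keep two references: old and new
        "deleted": [],
        "new": sorted(new_2020_codes),
    }
    for code in institution_codes:
        target = crosswalk.get(code)
        if target is None:
            report["deleted"].append(code)
        elif target == code:
            report["unchanged"].append(code)
        else:
            report["moved"].append((code, target))
    return report

# Example (hypothetical usage): classify three reported codes.
print(categorize(["11.0101", "13.1320", "08.0503"], crosswalk, {"30.7101"}))
```

In this sketch, a moved code is recorded as a pair so the report can show both its former and current locations, mirroring the two references the Wizard keeps.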

By looking through the CIP Wizard report, an institution can see exactly what changes have been made to the CIP codes it used in the last 3 years of Completions Survey data.


The CIP Wizard also suggests new CIP codes that might be of interest to the user, allows the user to export a report as either a Word or Excel file, and creates a file of CIP codes that can be uploaded to an institution’s reporting system.

Over the next several months, NCES will be preparing web-based tutorials on how to use the CIP website and the CIP Wizard. Until then, users can reference a list of frequently asked questions and a detailed help document, and can also submit questions by email to CIP2020@ed.gov.


By Michelle Coon