IES Blog

Institute of Education Sciences

New Report on Crime and Safety in Schools and on College Campuses

Crime in our nation’s schools and on college campuses has generally declined over the past two decades, according to Indicators of School Crime and Safety 2019, a recently released NCES report. This report highlights new analyses of mental health services provided by public schools and of the prevalence of school and school neighborhood problems. The report also covers topics such as victimization, school conditions, safety and security measures at school, and criminal incidents at postsecondary institutions.

In 2018, students ages 12–18 experienced 836,100 total victimizations (i.e., thefts and nonfatal violent victimizations) at school and 410,200 total victimizations away from school. These figures represent a rate of 33 victimizations per 1,000 students at school and 16 victimizations per 1,000 students away from school. From 1992 to 2018, the total victimization rate and the rates of specific crimes—thefts and violent victimizations—declined for students ages 12–18, both at school and away from school.
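For readers who want to see how the counts and rates fit together, here is a minimal sketch of the arithmetic in Python. The student population figure is an assumption back-calculated from the published rate of 33 per 1,000; it is not a number reported in this post.

```python
# Minimal sketch: converting victimization counts to rates per 1,000 students.
# STUDENT_POPULATION is an assumption implied by the published figures
# (836,100 victimizations at a rate of 33 per 1,000), not a reported number.

AT_SCHOOL_VICTIMIZATIONS = 836_100   # total victimizations at school, 2018
AWAY_VICTIMIZATIONS = 410_200        # total victimizations away from school, 2018
STUDENT_POPULATION = 25_300_000      # assumed students ages 12-18

def rate_per_thousand(count: int, population: int) -> float:
    """Victimizations per 1,000 students."""
    return count / population * 1_000

print(f"At school: {rate_per_thousand(AT_SCHOOL_VICTIMIZATIONS, STUDENT_POPULATION):.0f} per 1,000")
print(f"Away:      {rate_per_thousand(AWAY_VICTIMIZATIONS, STUDENT_POPULATION):.0f} per 1,000")
# Prints roughly 33 and 16, matching the report's figures.
```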

This edition of Indicators of School Crime and Safety examines new data on school shootings. While such events represent a small subset of the violent incidents that occur at schools, they are of high concern to those interested in the safety of our nation’s students. In school year 2018–19, there were 66 reported school shootings with casualties at public and private elementary and secondary schools (29 school shootings with deaths and 37 school shootings with injuries only). Between 2000–01 and 2018–19, the number of school shootings with casualties per year ranged from 11 to 66.

Student bullying was the most commonly reported discipline problem among public schools over the past two decades. In school year 2017–18, about 14 percent of public schools reported that bullying occurred among students at least once a week, representing a decrease from the 29 percent of schools that reported student bullying in 1999–2000. In 2017–18, about 15 percent of public schools reported that cyberbullying had occurred among students at least once a week either at school or away from school.

This edition of the report also contains an analysis of new survey items that asked administrators at schools serving fifth-graders about issues in neighborhoods around their schools. In spring 2016, “crime in the neighborhood” and “selling or using drugs or excessive drinking in public” were the two most commonly reported school neighborhood problems. Thirty-four percent of fifth-graders attended schools where crime in the neighborhood was a problem, and 31 percent attended schools where selling or using drugs or excessive drinking in public was a problem. For the five school neighborhood problems examined in the report, fifth-graders attending schools where these were a big problem or somewhat of a problem consistently had lower scores in reading, mathematics, and science than did those attending schools where these were not a problem.



In addition to reporting data on student victimizations and school safety conditions, Indicators of School Crime and Safety 2019 also includes information on the programs and practices that schools had in place to promote a safe school. The new report includes a special analysis of mental health services provided by public schools. During the 2017–18 school year, 51 percent of public schools reported providing diagnostic mental health assessments to evaluate students for mental health disorders. Thirty-eight percent of public schools reported providing treatment to students for mental health disorders. When asked whether certain factors limited their efforts to provide mental health services in a major way, 52 percent of public schools reported that inadequate funding was a major limitation, and 41 percent reported that inadequate access to licensed mental health professionals was a major limitation.



The report also looked at safety and security practices. In school year 2017–18, about 92 percent of public schools had a written plan in place for procedures to be performed in the event of an active shooter. Forty-six percent had a plan for procedures in the event of a pandemic disease. Between 2005–06 and 2017–18, the percentage of public schools that reported having one or more security staff present at school at least once a week increased from 42 to 61 percent.

Shedding light on postsecondary campus safety and security, the report shows that the number of reported forcible sex offenses on college campuses increased greatly while the overall number of reported criminal incidents at postsecondary institutions fell. Between 2001 and 2017, the number of reported forcible sex offenses on college campuses increased 372 percent (from 2,200 to 10,400 offenses), while the overall number of criminal incidents reported on postsecondary campuses decreased by 31 percent (from 41,600 to 28,900 incidents). However, in the most recent data (between 2016 and 2017), the overall number of criminal incidents reported on postsecondary campuses increased by 2 percent. In 2017, a total of 958 hate crimes were reported on college campuses, of which the most common types were destruction, damage, and vandalism (437 incidents) and intimidation (385 incidents). Race, religion, and sexual orientation were the categories of motivating bias most frequently associated with these hate crimes.
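The percentage changes are simple arithmetic on the counts the report publishes. A minimal sketch follows; small differences from the report’s 372 percent reflect rounding in the published counts.

```python
# Minimal sketch: the percent-change arithmetic behind the campus crime figures.
def percent_change(old: float, new: float) -> float:
    """Percent change from old to new; positive means an increase."""
    return (new - old) / old * 100

# Reported forcible sex offenses on campus, 2001 vs. 2017.
print(f"Forcible sex offenses:  {percent_change(2_200, 10_400):+.0f}%")   # about +373%
# All reported criminal incidents on campus, 2001 vs. 2017.
print(f"All criminal incidents: {percent_change(41_600, 28_900):+.0f}%")  # about -31%
```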

To view the full Indicators of School Crime and Safety 2019 report, please visit https://nces.ed.gov/pubsearch/pubsinfo.asp?pubid=2020063.

 

By Ke Wang, AIR

New Education Data from the Household Pulse Survey

Recognizing the extraordinary information needs of policymakers during the coronavirus pandemic, NCES partnered with the Census Bureau and four other federal statistical agencies to quickly develop a survey gathering key indicators of our nation’s response to the global pandemic. The experimental 2020 Household Pulse Survey began development on March 23, 2020, and data collection began on April 23, 2020. This new survey provides weekly national and state estimates, which are released to the public in tabular formats one week after the end of data collection.

The Household Pulse Survey gathers information from adults about employment status, spending patterns, food security, housing, physical and mental health, access to health care, and educational disruption. The education component includes questions about the following:

  • The weekly time spent on educational activities by students in public and private elementary and secondary schools
  • The availability of computer equipment and the Internet for instructional purposes
  • The extent to which computer equipment and the Internet for students were provided or subsidized

Because the survey is designed to represent adults ages 18 and over, responses to the education questions describe students living in those adults’ households; the estimates are percentages of adults, not percentages of students.

In the Household Pulse Survey during the weeks of April 23 through May 5, adults reported that their average weekly time spent on teaching activities with elementary and secondary students in their household was 13.1 hours. These results differed by educational attainment: adults who had not completed high school reported a weekly average of 9.9 hours in teaching activities with children, whereas adults with a bachelor’s or higher degree reported 13.9 hours (figure 1). Adults reported a lower average, 4.1 hours, of weekly time spent on live virtual contact between students in their household and their teachers.



Adults’ reports about the school instruction model need to be interpreted carefully because respondents could choose multiple types of approaches. A higher percentage of adults with a bachelor’s or higher degree (84 percent) reported that classes for elementary and secondary students in their household had moved to a format using online resources than did adults who had completed some college or an associate’s degree (74 percent), adults who had completed only high school (64 percent), or adults who had not completed high school (57 percent).

Adults with higher levels of education were more likely than adults with lower levels of education to report that computers and the Internet were always available for educational purposes for elementary and secondary students in their households (figure 2).



The percentage of adults who reported that the school district provided a computer or digital device for children in their households to use at home for educational purposes was higher for adults who had not completed high school (44 percent) than for adults with a bachelor’s or higher degree (33 percent). Likewise, a higher percentage of adults who had not completed high school than of adults with higher levels of educational attainment reported receiving financial assistance for students’ Internet access.

It is important to note that the speed of the survey development and the pace of the data collection efforts have led to policies and procedures for the experimental Household Pulse Survey that are not always consistent with traditional federal survey operations. Data should be interpreted with proper caution.  

More information on the Household Pulse Survey, detailed statistical tables, and microdata sets are available at https://www.census.gov/householdpulsedata. The Household Pulse Survey site includes breakouts of the data by other characteristics, such as race/ethnicity. In addition to participating in the development of this new survey, NCES has also generated new analyses based on existing data that respond to new needs for policy information, such as the availability of the Internet for student learning.

 

By Xiaolei Wang, AIR

Using Mistakes as a Vehicle for Learning in Mathematics: From Research to Practice at Scale in Education

Every student makes mistakes. But not every student is given the opportunity to learn from mistakes. Left unaddressed, the mathematical misconceptions that underlie many mistakes can keep students from progressing in mathematics.

 

At the request of districts in the Minority Student Achievement Network (MSAN), a Strategic Education Research Partnership (SERP) team was convened in 2007 to address a widening achievement gap in Algebra I. The team was charged with identifying an intervention strategy, subject to several district constraints:

  1. The solution would need to be applied to all students in the regular classroom to avoid the stereotype threat associated with separating students based on performance and to protect the intervention from budget cuts that target supplemental, after-school, and summer programs first.
  2. A new curriculum was off the table because it would create upheaval for a time and would be followed by a decline in student performance during the period of adjustment.
  3. Extensive teacher training was considered undesirable because it would be costly and because algebra teachers consider themselves more expert in mathematics teaching than central office staff who would be requiring the training.

 

Julie Booth joined the partnership, and with funding from IES, led the iterative development and testing of worked example assignments that, with the input of teachers and administrators, fit within the routines of the classroom. The result—AlgebraByExample—consists of 42 uniquely designed assignments that address misconceptions, harness the power of explanation, and use mistakes as a vehicle for learning.

Typical math assignments require students to solve problems on their own. If a student’s work is incorrect, the student may never focus on what went wrong. ByExample assignments also give students problems to solve, but they first provide a solution to a similar problem that is marked right or wrong. Students are prompted with questions that target common misconceptions and errors before solving a similar problem on their own. Each assignment contains several strategically designed item pairs.

 

 

Designed in collaboration with teachers from districts in several states, the assignments can be easily incorporated into any Algebra I curriculum, and teachers can choose how and in what order to use them. The assignments were tested in randomized trials in classrooms in eight districts with more than 6,000 students. Not only did students using AlgebraByExample improve by an average of 7 percentage points on an assessment of standardized test items, but students at the lower end of the distribution improved the most. The PDF downloads of the assignments are freely available for anyone to use.

The success of AlgebraByExample led to further IES funding of MathByExample for grades 4 and 5 and GeometryByExample for high school geometry.

 

Resources:

AlgebraByExample website

MathByExample website

Booth et al., 2015

NSF STEM for All Video Submission 2019

 

Interview with Dr. Suzanne Donovan (SERP), Dr. Julie Booth (Temple University), and Allie Huyghe (SERP), the developers of the ByExample interventions.

 

 

Was it part of the original plan to develop an intervention that could one day be used at scale in schools?

Yes. SERP partnerships begin with problems of practice nominated by district partners, but the partnership agreement distinguishes SERP from a consultant. The intention from the start is to frame the problem and design a solution that can be used at scale. SERP has developed in-house, user-centered design expertise so that resources (such as the ByExample products) developed through partnerships meet the needs of teachers and students. Products scale when they improve the experience of teachers and students. Both the model and the internal design capacity allow SERP to move from problem framing through research, development, and dissemination of a product with IES grant funding.

 

Describe the initial research and development that occurred.

Dr. Julie Booth drafted initial assignments drawing on the mathematics misconceptions literature. SERP held regular partnership meetings with teachers and administrators at which assignments were reviewed and additional misconceptions were nominated for attention in the assignments. Administrators agreed to randomization of the assignments across classrooms and within teachers. Assignments were first tested in individual topic blocks and revised in accordance with student performance data, observations, and teacher feedback. A year-long pilot study was then conducted using the full set of assignments.
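For readers unfamiliar with the design, the sketch below illustrates what within-teacher randomization of classrooms to condition can look like. The teacher and classroom labels are hypothetical; the study’s actual procedures are documented in Booth et al., 2015.

```python
# Hypothetical sketch of within-teacher randomization: each teacher's
# classrooms are split at random between the ByExample and control
# conditions, so comparisons can be made within the same teacher.
import random

classrooms = [
    ("Teacher A", "Period 1"), ("Teacher A", "Period 3"),
    ("Teacher B", "Period 2"), ("Teacher B", "Period 4"),
]

# Group each teacher's classrooms together.
by_teacher: dict[str, list[str]] = {}
for teacher, period in classrooms:
    by_teacher.setdefault(teacher, []).append(period)

# Randomly assign half of each teacher's classrooms to each condition.
assignments = {}
for teacher, periods in by_teacher.items():
    random.shuffle(periods)
    half = len(periods) // 2
    for period in periods[:half]:
        assignments[(teacher, period)] = "ByExample"
    for period in periods[half:]:
        assignments[(teacher, period)] = "Control"

for (teacher, period), condition in sorted(assignments.items()):
    print(teacher, period, "->", condition)
```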

 

Beyond IES or ED grants, what additional funding was needed to develop the intervention?

For the ByExample work, additional funding was provided by the Goldman Sachs Foundation in the initial phase to support partnership formation, problem framing, and solution generation. IES grants funded the research and development, along with initial dissemination activities to make the materials available to the public. The IES grant funds also allowed us to develop an online platform for digital use.

 

What model was used for dissemination and sustainability?

The assignments are available as free downloads on SERP’s website and as printed workbooks through SERP’s partner print-on-demand company. They have been publicized through online communications, journal articles, presentations at conferences of various types, social media, and word of mouth. There will be a small fee for use of the digital platform to support its maintenance, but the PDFs will remain free downloads. We have been able to sustain the collaboration of the partnership team by responding to requests from educators to expand the approach to other grade levels and by submitting additional proposals to IES that have been funded.

 

What advice would you provide to researchers who are looking to move their research from the lab to market? What steps should they take? What resources should they look for?

First, I would note that it is difficult to persuade educators to use a product that solves a problem they don’t believe they have. Listen to educators and apply research expertise to address the challenges that they experience on a day-to-day basis. Design for ease of use by teachers. No matter how good your strategy or marketing is, if it’s too much work for an already busy teacher to use, you may get uptake by a few committed teachers, but not at scale. Finally, pay attention to where teachers get their information. For AlgebraByExample, we got a big boost from the Marshall Report, produced by a teacher for other teachers to call attention to usable research.  

 

In one sentence, what would you say is most needed for gaining traction and wide scale use by educators?

Design for the routines of the classroom.

 


Suzanne Donovan, PhD, is the founding Executive Director of the SERP Institute, an education research, development, and implementation organization incubated at the National Academies. SERP leads collaborations of educators, researchers, and designers to generate research-based, scalable, and sustainable solutions to critical problems of practice. 

Julie Booth, PhD, is a Professor of STEM Education and Psychology and the Deputy Dean of Academic and Faculty Affairs at Temple University’s College of Education and Human Development. Her work focuses on translating between cognitive science and education to better understand students’ learning and improve instruction, primarily in mathematics education. She is currently an Executive Editor for the Journal of Experimental Education.

Allie Huyghe is the Assistant Director of the SERP Institute, where she manages several projects, including the IES-funded MathByExample and GeometryByExample projects. She is also closely involved with other SERP areas of work, participating in the design of materials from early development through release to the public.

 

This interview was produced by Christina Chhin (Christina.Chhin@ed.gov) and Edward Metz (Edward.Metz@ed.gov) of the Institute of Education Sciences. This is the fifth in an ongoing series of blog posts examining the move from university research to practice at scale in education.

 

 

Introducing REL 2022

As I write this, my colleagues and I at the Regional Educational Laboratory (REL) Program are thinking about a single number: 535. No, we’re not concerned about 535 because it represents the number of voting members of Congress, though that would be a good guess. We’re also not thinking about Interstate 535, the “2.78-mile-long Auxiliary Interstate Highway spur of I-35 in the U.S. states of Minnesota and Wisconsin,” though now I’m intensely interested in why it might be that, at least according to Wikipedia, this road is “known locally as the ‘Can of Worms’ interchange.” Instead, my colleagues and I are excited about 535 because it represents the number of days between now and the start of the next cycle of the REL program, affectionately known as REL 2022.

Over a year ago, we began a process that culminates in the awarding of contracts to run each of our regional labs. We are excited to share our preliminary thoughts about the contours of REL 2022 through a Request for Information, or RFI, which we have posted here. I hope you will take time to read the RFI. If you have questions or suggestions after doing so, I hope you are moved to comment. Details on how to offer your feedback can be found in the RFI.

Importantly, we aren’t proposing to radically restructure the REL program. Instead, we are retooling some existing expectations and adding a few new features. Below, I’ve highlighted a few proposed changes that merit special attention.

The purpose of RELs is to improve student outcomes. Not to put too fine a point on it, but everything that takes place in REL 2022 should be in service of improving student outcomes. This does not mean that every REL project will, by itself, have a directly observable impact on achievement. But the work of any given REL, in concert with the efforts of those with whom it works, should be trained on a singular focus: bettering the lives of students through education. There is no other, better, or higher calling.

We accomplish our purpose by working in partnership with stakeholders to support their use of evidence-based practices. Evidence-based practice is “baked in” to the statute that authorizes the REL program, and the importance of building and using evidence in education—and government more generally—is reiterated throughout federal law. (See, for example, the Every Student Succeeds Act of 2015 and the Foundations for Evidence-Based Policymaking Act of 2018.) However, our emphasis on evidence isn’t rooted in a statutory imperative. Instead, it’s based on a set of core beliefs about our work: that researchers and educators can strengthen education via the rigorous application of the scientific method; that resources, including money and time, are constrained and that efforts with demonstrated effectiveness should be prioritized; and that each and every student deserves the best of “what works” in education, no matter their circumstance.

Nothing changes if nothing changes. In the REL 2022 cycle, we are explicitly asking RELs to think of themselves as “change agents.” This expectation is, I believe, entirely new to the REL Program and is likely to be uncomfortable to some. For that reason, it is helpful to be clear about what we’re expecting and why. Here goes.

I daresay that, no matter how proud they might be of their students and their educators, there is not a state chief, a district superintendent, or a building principal who would report they are serving each of their students as well as they wish they could. (If you’re the one who does, please stop reading this blog and call me. I want to share your successes!) Each of those leaders has something they want to do better on behalf of their students and is contemplating, if not actively pursuing, change. It is our hope that RELs can join them in making change, with evidence in hand and research tools at the ready. REL reports, resources, and trainings are not ends unto themselves. They are means to enable the change efforts of local, state, and regional education leaders, working on behalf of students to improve important outcomes.

RELs work in partnership. Education research and technical assistance must be done in partnership with those it is meant to inform. Absent that, it is likely to fail to achieve its goals. At best, potentially positive impacts will be blunted. At worst, harm will be done. There’s a simple solution: collaboration that authentically engages stakeholders in all phases of project design and execution. That isn’t, I realize, as simple to do as it is to write.

As vendors consider the REL 2022 cycle, we ask that they keep two things in mind about what we’ve traditionally called partnerships. First, there are no necessary restrictions on whom RELs can partner with when working with stakeholders to achieve stakeholder goals. Does it make sense to partner across levels of education within a state? Do it. Is there a state or national advocacy association that would accelerate a partner’s progress? Engage it. Is there a role for business or industry? Leverage it. A second and closely related concept is that there are no restrictions on partnerships’ functional forms. In general, it does not matter one whit to IES whether you prefer NICs, DBIR, or any other particular form of research partnership. What does matter? That RELs build projects in partnership—however and with whomever—intentionally, with the goal of supporting partners’ change efforts to achieve the goals they have identified.

We encourage deeper, not broader, work. We believe RELs are more likely to achieve success when they focus partnerships on clearly defined problems of policy or practice in specific geographies. A “Six-State Research Alliance on High School Graduation” can do important and meaningful work—but the process of agreeing on the work to be done and the targets to be met, seeing that work through to completion, and then achieving pre-specified goals is likely to be exceptionally difficult. The “South-Central Kansas Partnership for Kindergarten Readiness” or the “Maricopa County Alliance for Reducing Chronic Absenteeism in High Schools” may be more likely to achieve impact. This is not to say that lessons learned locally should not be shared regionally or nationally, or that groups with common interests might not form “communities of practice” or other networks for the purpose of sharing information or building connection. Rather, we ask RELs be strategic in scoping their highest-intensity work.

We define success as achieving measurable stakeholder goals. Evaluating the impact of research and technical assistance projects is notoriously hard. Often, program managers and the evaluators with whom they work are forced to satisfice, relying upon end-user self-reports of the quality, relevance, and usefulness of a provider’s work. Counts of outputs, such as report downloads and attendees served, are particularly common metrics reported in evaluation studies. Satisfaction is the coin of the realm. Lest I be accused of throwing stones inside my own glass house, let me be clear that we currently use these very measures to characterize the effectiveness of the current REL program.

In REL 2022, it is our intention to shift focus beyond outputs to emphasize outcomes. We will ask RELs to demonstrate, on a regular basis, that they are making progress toward the goals stakeholders set for important student outcomes at the outset of their work, with the acknowledgment that outputs are often critical to achieving a long-term goal and that satisfaction can be an important leading indicator. In 2027, the mark of success won’t be a glowing narrative from a state superintendent or school superintendent about the REL cycle just passed. Instead, it’ll be seeing that the quantifiable goals those leaders set for their work with the REL program were achieved.   

Putting RELs’ capacity for rigorous R&D to work. Finally, there is one manifestly new requirement for RELs as part of the 2022 cycle, one that I am particularly excited about because it brings together the best of two NCEE programs: the RELs and the What Works Clearinghouse™ (WWC). As part of the 2022 cycle, each REL will be required to develop—and then evaluate—a comprehensive toolkit based on a WWC Practice Guide, helping educators instantiate evidence-based practices in the classroom. RELs already have experience transforming the content of Practice Guides into tools for educators. Two examples include Professional Learning Community guides for both foundational reading and English learners. Similarly, North Carolina State University’s Friday Institute has looked to Practice Guides for inspiration to develop massive open online courses (MOOCs), including courses on foundational reading and fractions. None have been evaluated for efficacy. Of course, the development and testing of these new toolkits will follow the expectations set above, including the expectation that strong and inclusive partnerships are at the root of all high-leverage work.

My NCEE colleagues and I are excited about the possibilities that REL 2022 represents. The REL program has a proud history and a strong track record of service to local, state, and regional stakeholders. We hope that, as you review the REL 2022 RFI, you’ll find the next iteration of the program continues in that tradition. As always, I welcome your feedback.

Matthew Soldner

Commissioner, National Center for Education Evaluation and Regional Assistance

 

NHES Data Files Provide Researchers Supplemental Information on Survey Respondents’ Communities

Increasingly, researchers are merging survey data with data from external sources, such as administrative data or different surveys, to enhance analyses. Combining data across sources increases the usefulness of the data while minimizing the burden on survey respondents.

In September, the National Household Education Surveys Program (NHES) released restricted-use supplemental geocode data files that use sample respondents’ addresses to integrate the 2016 NHES Parent and Family Involvement in Education (PFI), Early Childhood Program Participation (ECPP), and Adult Training and Education (ATES) survey data with data from other collections. The supplemental geocode files include additional geographic identifiers, characteristics of respondents’ neighborhoods and local labor markets, radius-based measures of household proximity to job search assistance and educational opportunities, and, for surveys focused on children, school district identifiers based on home addresses and school district characteristics.

The new data can complement researchers’ analyses of data from all three surveys. Researchers can expand their analyses of school choice and access to K–12 schooling options using the PFI survey data. Those interested in analyses of decisions about children’s early education can use the ECPP survey data to look at the availability of Head Start programs, preschools in private schools near children’s homes, and the prevalence of prekindergarten programs in local school districts. Researchers interested in nondegree credential attainment and training for work can use data from the ATES to find information on local labor markets and the number of American Job Centers near respondents’ homes.
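As a purely hypothetical illustration of how a license holder might combine the files, the sketch below joins a survey file to its geocode supplement with pandas. The file names and variable names (CASE_ID, JOB_CENTERS_NEARBY, and so on) are illustrative assumptions, not the actual NHES file layout; consult the restricted-use documentation for the real identifiers and formats.

```python
# Hypothetical sketch: merging an NHES:2016 survey file with its supplemental
# geocode file. All file names and column names here are illustrative
# assumptions; see the restricted-use documentation for actual variables.
import pandas as pd

ates = pd.read_csv("nhes2016_ates.csv")          # ATES survey responses (assumed path)
geo = pd.read_csv("nhes2016_ates_geocode.csv")   # supplemental geocode file (assumed path)

# Join on a common case identifier so each respondent record gains
# neighborhood and labor-market characteristics.
merged = ates.merge(geo, on="CASE_ID", how="left", validate="one_to_one")

# Example: mean credential attainment by number of nearby American Job
# Centers (both column names assumed for illustration).
print(merged.groupby("JOB_CENTERS_NEARBY")["HAS_CERTIFICATION"].mean())
```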

The NHES:2016 supplemental geocode files are restricted-use data, available to restricted-use license holders for use in conjunction with the NHES:2016 survey data files. To access the full set of files, apply for a restricted-use license. You can also browse the list of variables in the supplemental geocode files.

 

By Emily Isenberg and Sarah Grady, NCES