IES Blog

Institute of Education Sciences

Closing the Opportunity Gap Through Instructional Alternatives to Exclusionary Discipline

According to the most recent GAO analysis of the U.S. Department of Education Civil Rights Data Collection, Black students, boys, and students with disabilities are disproportionately suspended or expelled in K-12 public schools. The reasons for these disparities may not always be clear, but the consequences are stark—suspended or expelled students miss out on opportunities to learn. What can be done to minimize this opportunity gap?

In 2018, researchers at the University of Oregon received a grant to develop an alternative to exclusionary discipline for middle schools. The Inclusive Skill Building Learning Approach (ISLA) will function as a Tier I universal intervention in middle schools that use Positive Behavioral Interventions and Supports (PBIS). ISLA systems and practices will give teachers other options for dealing with misbehaving students along with strategies to support students when they return to the classroom following a trip to the principal’s office. I recently spoke with Dr. Rhonda Nese, the principal investigator for Project ISLA, about how she became interested in this work and how she and her colleagues are tackling this challenge of narrowing the opportunity gap in middle school classrooms.

A photo of the University of Oregon research team

How did you become interested in the issue of disproportionate discipline?

I had been deeply interested in the school-to-prison pipeline research for many years, but the light switch went on for me when I was spending time in a middle school through my work on another project. I started noticing a pattern of students, mostly boys and students of color, sitting in the office every time I walked into this school. And I'm talking about lots of students! This office would be flooded with kids: not learning, not speaking with anyone, just sitting and looking downcast. And it was disturbing.

When I asked the assistant principal what the students were doing in the office, she shared that, for whatever reason, the students were sent out of class and needed to meet with an administrator. So, I became curious. On average, how much class time were they missing? I was floored to learn that the average was three days of missed instruction, which is the equivalent of over 1200 minutes of learning. And the deeper I dug, the more I realized how pervasive the problem was. In addition to the racial disparities I saw in the kids being excluded, it was also clear that the students who were missing instruction were those who needed to be in class the most: students living in poverty, students struggling academically, and students receiving special education services. And the process of sitting and waiting was doing the students a tremendous disservice academically, behaviorally, and emotionally. I saw firsthand the issues I needed to begin addressing immediately, and I knew I had found my passion.

How does Project ISLA extend or build on your earlier research?

I started developing ISLA during my postdoc years when I was deep in the PBIS literature, examining predictors of sustained implementation of evidence-based practices, and beginning to explore interventions to address implicit biases in discipline disproportionality. So, I was able to combine what I was learning from practitioners and from scientific findings to craft an intervention that was rooted in behavioral theory, embedded in preventative practices, and informed by teacher and student voice.

I also became clearer with myself and others that ISLA is not about “fixing” kids: it is about changing adult behavior to improve student outcomes and relationships. Now through our iterative development process, our team is learning so much about what it takes to support school staff with making this work their own, how we get buy-in from the school community, and how we braid the ISLA work with other preventative practices they already have in place.

What are the core components of the ISLA intervention? What are its essential practices? What have you learned so far about what it takes to implement ISLA in middle schools?

One of our greatest goals is to help educators make a philosophical shift: sending a student out of class is a really big deal and should be reserved for situations in which teachers and students need support with problem solving, skill building, and making amends. To accomplish this, we begin by spending a lot of time with our educators developing and revisiting preventative practices to improve the classroom environment and, in turn, reduce the need for exclusion. This includes working with educators to develop and implement universal relationship-building strategies, graduated discipline practices within the classroom, neutralizing routines to reduce the impact of implicit biases on their decision making, and mechanisms for supporting students in effective and respectful ways. We then layer on a systematized process for students and teachers to request breaks. On top of this, we have processes for students who are sent out of class that help them get back to class faster and with the skills to make amends with their teacher. This includes a debrief, skills coaching, and reconnection supports with a front office staff member, as well as a process for their teacher to listen reflectively and agree on how they will problem solve with the student if there's an issue in the future.

Getting folks to move away from exclusionary discipline practices takes a lot of time and a lot of patience, because suspensions and other forms of exclusion are deeply tied to systems of oppression that have been prevalent in the United States. And especially in middle school, there’s this pervasive myth that students should know how to behave by this point, and so anything to the contrary is seen as willful defiance as opposed to a skill gap. Unfortunately, there’s no quick fix, and ISLA is certainly not a silver bullet. In fact, we call ISLA Tier I+ because it starts with universal preventative practices and then adds supports for students and teachers who need more. Because of all the myth busting and support layering we’re doing, working with a team of educators in each school has been critical for buy-in and implementation. They help guide our iterative changes, give us strategies to consider, and are the voice to their colleagues. They are invested in the work because they are helping to develop it for their schools. And our work is so much more meaningful because of them.

 

Dr. Nese and her team are mid-way through their project. Now that they have completed the iterative development process, they are testing the usability and feasibility of ISLA in new middle schools this year. In their pilot study of promise next year, they will see if ISLA increases instructional time for students and improves student-teacher relationships and school climate. In addition to creating ISLA user guides and materials, the team plans to develop technical reports, video tutorials, trainings, and webinars that will be available through the Office of Special Education Programs (OSEP) Center on Positive Behavioral Interventions and Supports website.


Written by Emily Doolittle, National Center for Education Research Team Lead for Social Behavioral Research. This is the third in a series of blog posts stemming from the 2020 Annual Principal Investigators Meeting. The meeting's theme, Closing the Gaps for All Learners, focused on IES's objective to support research that improves equity in access to education and education outcomes. Other posts in this series include Why I Want to Become an Education Researcher and Diversify Education Sciences? Yes, We Can!

 

ASSISTments: From Research to Practice at Scale in Education

ASSISTments is a free, web-based formative assessment platform for teachers and students in Grades 3 through 12. The tool is designed for teachers to easily assign students math problems from OER textbooks such as Illustrative Math and EngageNY, from existing item banks, or from items they have developed on their own. ASSISTments continually assesses students as they solve math problems and provides immediate feedback and the chance to try again. The computer-generated reports provide teachers with information to make real-time adjustments to their instruction. Teachers can use it with their school's existing learning management systems, such as Google Classroom and Canvas. Watch a video here.
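To make that workflow concrete, here is a minimal Python sketch of the check-answer-and-retry loop that a formative assessment tool of this kind implements. It is purely illustrative: the names (Problem, grade, attempt) are hypothetical and do not represent ASSISTments' actual API or internals.

```python
# Hypothetical sketch of a formative-assessment loop: a student submits an
# answer, gets immediate right/wrong feedback with the chance to try again,
# and wrong answers are tallied so a teacher report can surface the most
# common mistakes. Names are illustrative only, not ASSISTments code.
from collections import Counter
from dataclasses import dataclass, field


@dataclass
class Problem:
    prompt: str
    answer: str
    wrong_answers: Counter = field(default_factory=Counter)  # feeds the teacher report

    def grade(self, submission: str) -> bool:
        """Return immediate feedback and record wrong answers for the class report."""
        correct = submission.strip() == self.answer
        if not correct:
            self.wrong_answers[submission.strip()] += 1
        return correct


def attempt(problem: Problem, submissions: list) -> bool:
    """Let a student keep trying until an answer is correct or attempts run out."""
    return any(problem.grade(s) for s in submissions)


if __name__ == "__main__":
    p = Problem(prompt="3/4 + 1/8 = ?", answer="7/8")
    attempt(p, ["4/12", "7/8"])  # first try wrong, second try correct
    attempt(p, ["4/12"])         # another student, still stuck
    # Teacher-facing summary: the most commonly submitted wrong answer.
    print(p.wrong_answers.most_common(1))  # [('4/12', 2)]
```

A tally like this is also what lets a teacher-facing report highlight the most commonly submitted wrong answers for the next class discussion.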

 

 

Over the past 13 years, ASSISTments has been developed and evaluated with the support of a series of IES and National Science Foundation awards. With a 2003 IES award to Carnegie Mellon University and Worcester Polytechnic Institute (WPI), researchers created the first version of ASSISTments. The system was populated with Massachusetts high-stakes mathematics test questions, and the tutoring for the questions was authored by WPI staff with assistance from local teachers. After students completed problems assigned by the teacher, reports provided teachers with information about question difficulty and the most commonly submitted wrong answers, initiating class discussions around the completed assignments. In 2007, researchers at WPI received an award to build additional functionality into the ASSISTments program so that teachers could assign supports (called "skill builders") to students to help them master content. An additional eight grants allowed the researchers to create other features.

With a 2012 IES research grant award, SRI evaluated the efficacy of the ASSISTments program as a homework tool for academic learning. In the study, the researchers took all 7th grade textbooks in the State of Maine and added answers to homework problems into ASSISTments. The results of the efficacy trial demonstrated that teachers changed their homework reviewing behavior, that mathematical learning improved by the equivalent of an extra three quarters of a year of schooling, and that using ASSISTments reliably closed achievement gaps between students at different achievement levels. ASSISTments is currently being evaluated again through two IES studies, with over 120 schools, to attempt to replicate this result. To view all publications related to ASSISTments, see here.

As of 2020, ASSISTments has been used by approximately 60,000 students with over 12 million problems solved.

 

Interview with Neil Heffernan and Cristina Heffernan

From the start of the project, was it always a goal that ASSISTments would one day be used on a wide scale?

We created ASSISTments to help as many teachers and students as possible. After we learned that the ASSISTments intervention was effective, we set the goal of having every middle school student in the country get immediate feedback on their homework. We created ASSISTments to be used by real teachers and have been improving it with each grant. Because of the effectiveness of ASSISTments, we kept getting funded to make improvements, allowing our user base to grow.

At what point was ASSISTments ready to be used at a large scale in schools?

We were ready in year one because of the simplicity of our software. Now that we have integrated seamlessly with Google Classroom, most teachers can use the system without training!

ASSISTments is backed by a lot of research, which would make some think that it would be easy for many schools to adopt. What were (or are) the biggest obstacles to ASSISTments being used in more schools?

A big obstacle has been access to technology for all students. The current environment in schools is making that less and less of a barrier. Now, teachers are looking for effective ways to use the computers they have.      

What options did you consider to begin distributing ASSISTments?

We had major companies try to buy us out, but we turned them all down. We knew the value was in being in control so we could run research studies, let others run research studies, and A/B test new ideas. It was important to us to keep ASSISTments free to teachers. It is also a necessity since we crowdsource from teachers.

How do you do marketing?

Our biggest obstacle is marketing. But we are lucky to have just received $1 million in funding from a philanthropy to create a nonprofit to support the work of making our product accessible. Foundation funding has allowed us to hire staff members to write marketing materials, including a new website, op-eds, blog posts, and press releases. In addition to our internal marketing staff member, we work extensively with The Learning Agency to get press and foundation support for ASSISTments.

What costs are associated with the launch and distribution of ASSISTments, including marketing? Will a revenue model be needed to sustain ASSISTments over time?

When creating ASSISTments, we didn’t want a traditional business model based on schools paying. Our vision for future growth, instead, focused on crowdsourcing ideas from teachers and testing them. We are trying to replicate the Wikimedia platform idea created by Jimmy Wales. He crowdsources the content that makes up the encyclopedia, so it must be free. We envision using ASSISTments to help us crowdsource hints and explanations for all the commonly used questions in middle school mathematics.  

Do you have any agreement about the IP with the universities where ASSISTments was developed?

The ASSISTments Foundation was founded in 2019 and supports our project work in tandem with Worcester Polytechnic Institute because of our close integration with research. The close relationship takes care of any issues that would arise with intellectual property. Additionally, the fact that we are a nonprofit helps address these issues.

How do you describe the experience of commercializing ASSISTments? What would you say is most needed for gaining traction in the marketplace?

Even though we are free, we do have several competitors. To gain traction, we have found that word of mouth and our positive efficacy trial result are effective disseminators. Currently, there are many teachers on Facebook sharing how much they like ASSISTments. We also attend conferences and are working on an email campaign to get new users onboard.

Do you have advice for university researchers seeking to move their laboratory research into widespread practice?

Make sure your work is accessible and meaningful! We are solving a super-pervasive problem of homework in schools. Everyone finds meaning in making homework better.


Neil Heffernan (@NeilTHeffernan) is a professor of computer science and Director of Learning Sciences and Technologies at Worcester Polytechnic Institute.  He developed ASSISTments not only to help teachers be more effective in the classroom but also so that he could use the platform to conduct studies to improve the quality of education.  

Cristina Heffernan (@CristinaHeff) is the Lead Strategist for the ASSISTments Project at WPI. She began her career in education as a math teacher in the Peace Corps and afterward went on to teach middle school math in the US. She began working with teachers while a graduate student at the University of Pittsburgh. As one of the co-founders of ASSISTments, Cristina has nurtured the system to be a tool for teachers to improve what they already do well in the classroom.


This interview was produced by Edward Metz of the Institute of Education Sciences. This is the fourth in an ongoing series of blog posts examining the move from university research to practice at scale in education.

Education Technology Platforms to Enable Efficient Education Research

Education research is often a slow and costly process. An even more difficult challenge is replicating research findings in a timely and cost-effective way to ensure that they are meaningful for the wide range of contexts and populations that make up our nation’s school system.

In a recent op-ed, IES Director Mark Schneider and Schmidt Futures Senior Director for Technology and Society Kumar Garg pitched the idea that digital learning platforms may be a way to accelerate the research enterprise. These platforms will enable researchers to try new ideas and replicate interventions quickly across many sites and with a wide range of student populations. They could also open the door for educators to get more involved in the research process. For example, Learn Platform supports districts as they make decisions about purchasing and implementing products in their schools, and ASSISTments provides infrastructure for researchers to conduct cheaper and faster studies than they would be able to do on their own.
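As a rough illustration of what embedding research in a platform can look like, here is a short, hypothetical Python sketch of randomized assignment stratified by school, so a new feature or intervention can be compared against business as usual at every participating site. The function and field names are assumptions for this example and do not refer to any real platform's API.

```python
# Hypothetical sketch: randomize students to conditions within each school so
# every participating site contributes to the comparison, supporting quick
# replication across many sites. Not the API of any actual platform.
import random
from collections import defaultdict


def assign_conditions(students, conditions=("control", "variant"), seed=0):
    """students is a list of (student_id, school_id) pairs."""
    rng = random.Random(seed)
    by_school = defaultdict(list)
    for student_id, school_id in students:
        by_school[school_id].append(student_id)

    assignment = {}
    for roster in by_school.values():
        rng.shuffle(roster)  # randomize order, then alternate conditions
        for i, student_id in enumerate(roster):
            assignment[student_id] = conditions[i % len(conditions)]
    return assignment


if __name__ == "__main__":
    roster = [("s1", "schoolA"), ("s2", "schoolA"), ("s3", "schoolB"), ("s4", "schoolB")]
    print(assign_conditions(roster))  # each school gets both conditions
```

In a platform that already delivers content and logs outcomes for every student, a study mainly needs to add an assignment step like this and then analyze the logged results.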

IES Director Mark Schneider and NCER Commissioner Liz Albro recently attended a meeting sponsored by Schmidt Futures focused on these issues. Two major takeaways from the meeting: first, there is already progress on building and using platforms for testing interventions, and, second, platform developers are enthusiastic about integrating research capabilities into their work.

As we consider how we can support platform developers, researchers, and education personnel to co-design tools to enable more efficient, large scale research on digital learning platforms, several questions have arisen:  

  1. What digital learning platforms already have a large enough user base to support large scale research studies?
  2. Are there content areas or grade levels that are not well supported through digital learning platforms?
  3. What are the key features that a platform needs to have to support rigorous tests and rapid replication of research findings? 
  4. What are the barriers and challenges for companies interested in participating in this effort?
  5. What kinds of research questions can best be answered in this research environment?
  6. What kind of infrastructure needs to be developed around the platform to enable seamless collaborations between education stakeholders, researchers, and product developers?

We know that some of you have already given these questions some thought. In addition, there are other questions and issues that we haven't considered. We welcome your thoughts. Feel free to email us at Erin.Higgins@ed.gov and Elizabeth.Albro@ed.gov. And join NCER's Virtual Learning Lab in their virtual workshop "Designing Online Learning Platforms to Enable Research" on April 17th, 3:00pm-5:00pm Eastern Time. Learn more about the workshop here.

Learning from CTE Research Partnerships: Using Data to Address Access and Equity Barriers in Massachusetts

As part of our ongoing blog series aimed at increasing state research on career and technical education (CTE), Corinne Alfeld, Research Analyst at the Institute of Education Sciences (IES), and Austin Estes, Senior Policy Associate at Advance CTE, are conducting interviews with individuals who are part of successful CTE State Director research partnerships. The third interview was with Cliff Chuang at the Massachusetts Department of Education and Shaun Dougherty of Vanderbilt University. Note: this interview, from February 5, 2020, has been edited for length and clarity.

Photograph of Cliff Chuang and Shaun Dougherty

Could you start by talking about the projects that you’ve worked on, your research questions, and how you settled on those research questions?

Shaun - It grew out of my dissertation work that was using some of the school data and then some of the statewide data from Massachusetts. It started pretty narrowly but the director of research was happy enough with what I was able to do that she talked about whether we could address some additional questions, and more data was becoming available. That more or less triggered the expansion, and then with Cliff coming into the role it became a two-way conversation that was more explicitly about what’s of academic interest and what’s of interest or of need on the practice and policy side for CVTE (career/vocational technical education).

Cliff – I would say that the particular catalyst for our most recent partnership was our desire as an agency to understand the waitlist demand issues related to chapter 74 CVTE in Massachusetts. If I recall correctly, we put out an RFR (request for responses)[1] for a research partner to help us analyze different aspects of who is and is not getting access to CVTE programs in Massachusetts. Shaun and his partner Isabel, a grad student at Harvard, submitted the bid that was selected. From that project there have been a lot of offshoots through the CTEx exchange collaboration that Shaun and others have established. We've been engaged in a lot of informal research inquiry as well as additional formal research that uses that data.

Could you talk a little bit about the findings from that project and their implications in the academic space as well as on the policy front? How are you using those findings to change policy in Massachusetts?

Shaun – The basic findings were that in fact there is much more interest in these high-quality CTE programs, these chapter 74-approved programs in these standalone technical high schools, than can be met by current supply. This was more confirmatory evidence with a little more granularity and maybe confidence in the figures than was possible previously.

Cliff – Shaun’s team also helped us look at just the straight enrollment data comparisons, which is still not as ideal as looking at applicant data. It was helpful to have a more rigorous definition of what data protocols are needed around application and admissions. We have now made the decision to collect waitlist data systematically at the state level to allow researchers like Shaun to more rigorously analyze across the board the attributes of who’s interested in voc tech, who’s getting in, who’s applying, etc.

I think it also stimulated a variety of program initiatives on the part of state government in Massachusetts to increase access to CVTE programs through collaborative partnerships like After Dark, which is an initiative that seeks to utilize shop space in our technical schools after the regular school day paired with academics provided by a partner academic school to get more kids the technical training that we are unable to do in the standard day program structures.

I would also add that Shaun is continuing other aspects of the research now that we’re very excited about, based in part on some of the research they did do to look at longer term trends of students and their outcomes post high school.

Shaun – The first order concern is that lots of people want [access to CVTE programs] and there’s a limited amount of it, so should we have more?

The second order concern – but certainly not secondary question – is one about equity and whether or not the students who were applying and the students who were getting access look like a representative cross-section of the community at large.  We know that students who choose CTE or select a lot of it are maybe different than those who don’t, but we don’t know a ton about whether and how we expect students who are making those investments to look like the overall population or whether or not access concerns lead to equity concerns.

Cliff – We would like to look more closely at whether the gaps are simply due to application gaps – which is still an issue in terms of kids not applying – or whether there are actual gaps related to who is applying and getting in. That was the data gap that we haven’t quite been able to close yet. But Shaun was able to create some comparative data that is just based on enrollment that has allowed us to engage in these conversations. We’re having the conversation about trying to expand the number of seats available so there’s less of a waitlist, but also to ensure that access into the existing seats is equitable and doesn’t disadvantage certain subgroups over others.

Over the course of the partnership, what have been some of the major challenges and hurdles that you've faced? What are some of the speed bumps that you've hit getting things formalized up front?

Shaun – Fortunately, one thing that we didn’t face, although I know it’s an obstacle in many places, is processes related to how one gets permissions and access to the data. In fact, as the process has evolved, having those structures in place has made it really easy, so that if Cliff and I say “hey, we’d like to add this,” it’s a pretty easy amendment of the MOU (memorandum of understanding). And then the people who deliver the data get approval and then they deliver it through a secure portal.

Cliff – I would also say that researchers left on their own probably would have had much less success in getting district participation in the survey study we did together. I, on the other hand, am someone with positional authority at the state level and established relationships that I can leverage to get that participation. And then I can pass it off to the research team that actually has the expertise and bandwidth to execute on the very labor-intensive data collection, both quantitatively and qualitatively.

It seems like you have a good partnership and a good synergy between the state office and the research team. If you were talking to CTE leaders and other researchers, what are some strategies and practices to make sure that partnership runs effectively and can be as impactful as possible?

Cliff – I think it's important to have someone in a research director type of role whose job it is to facilitate these partnerships and to do some of the nitty gritty around data sharing, MOUs, etc. The other thing I would say is to have a commitment to an evidence base in terms of policymaking, and to have people in the programmatic leadership who see the value of that and have enough knowledge of how research functions to parlay whatever policy or relational capital they have to support the research agenda.

Shaun – I think the incentives related to purely academic publishing sometimes restrict the willingness of some academic researchers to invest in or think about important questions in practice and policy. It's being willing to realize that strong partnerships with local and state agencies mean that more and better work can be done, and the work can have impact in real time. There is something very fulfilling and useful and practical about taking that approach from a research standpoint, and then, if you come from practice like I did, it helps ground the work.

Other blog posts in this series can be viewed here.

 

[1] Cliff explained that this is a formal process by which they solicited proposals for pay. “What’s been nice is that because it’s a partnership, Shaun has secured funding from other sources so there’s not an explicit contractual arrangement always. Aspects of the research that are ongoing are follow-ons from the original study. We have an interest in continuing to mine the data long-term to inform practice and policy.”

Activities for Students and Families Stuck at Home due to COVID-19 (Coronavirus)

As I write this blog post, my 4-year-old is spraying me with a water sprayer while I am desperately protecting my computer from a direct hit. Earlier, while I was listening in on a meeting, she yelled out “hi!” anytime I took myself off mute. Balancing work and raising kids in this bizarre situation we find ourselves in is an overwhelming experience. When schools started closing, some parents resorted to posting suggested schedules for kids to keep up a routine and deliver academic content during the day. These were wonderful suggestions. As someone whose dissertation focused on how people learn, I should be applauding such posts, but instead, they filled me with a sense of anxiety and guilt. How am I supposed to balance getting my work done while also designing a rigorous curriculum of reading, writing, and math instruction for a kid whose attention span lasts about 10-20 minutes and who needs guidance and adult interaction to learn effectively? Let’s take a step back and recognize that this situation is not normal. We adults are filled with anxiety for the future. We are trying to manage an ever-growing list of things—do we have enough food? Do we need to restock medications? What deadlines do we need to hit at work?

So here is my message to you, parents, who are managing so much and trying desperately to keep your kids happy, healthy, and engaged: recognize that learning experiences exist in even the simplest of interactions between you and your kids. For example—

  • When doing laundry, have your child help! Have them sort the laundry into categories, find the matching socks, name colors. Create patterns with colors or clothing types (for example, red sock, then blue, then red, which comes next?).
  • Find patterns in your environment, in language (for example, nursery rhymes), and when playing with blocks or Legos. Researchers have shown that patterning is strongly related to early math skills.
  • Talk about numbers when baking. I did this with my daughter yesterday morning. We made muffins and had a blast talking about measuring cups, the number of eggs in the recipe, and even turning the dial on the oven to the correct numbers. Older kids might be interested in learning the science behind baking.
  • Take a walk down your street (practicing good social distancing of course!) and look for different things in your environment to count or talk about.
  • Bring out the scissors and paper and learn to make origami along with your kids, both for its benefits for spatial thinking and as a fun, relaxing activity! In this project, researchers developed and pilot tested Think 3d!, an origami and pop-up paper engineering curriculum designed to teach spatial skills to students. The program showed promise in improving spatial thinking skills.
  • If you choose to use screen time, choose apps that promote active, engaged, meaningful, socially interactive learning.
  • If you choose to use television programs, there is evidence showing that high quality educational programs can improve students’ vocabulary knowledge.

Hopefully these examples show that you can turn even the most mundane tasks into fun learning experiences and interactions with your kids. They may not become experts in calculus at the end of all of this, but maybe they will look back fondly on this period of their life as a time when they were able to spend more time with their parents. At the end of the day, having positive experiences with our kids is going to be valuable for us and for them. If you have time to infuse some formal learning into this time, great, but if that feels like an overwhelmingly hard thing to do, be kind to yourself and recognize the value of even the simplest positive interaction with your kids.

Written by Erin Higgins, PhD, who oversees the National Center for Education Research (NCER)'s Cognition and Student Learning portfolio.