
IES Grant

Title: Improving Student Behavior Supports in Elementary Schools Through a Team Communication and Coaching Fidelity Tool
Center: NCSER Year: 2023
Principal Investigator: Davis, Carol Awardee: University of Washington, Seattle
Program: Social, Emotional, and Behavioral Competence
Award Period: 5 years (09/01/2023 - 08/31/2028) Award Amount: $3,799,513
Type: Initial Efficacy Award Number: R324A230014

Co-Principal Investigators: Pullmann, Michael; Spaulding, Scott

Purpose: This study will test the efficacy of a technology tool designed to improve the use and effectiveness of Tier 3 function-based behavior supports for students with chronic challenging behaviors. Over the past two decades, there has been an increased focus on systems-wide positive behavioral supports to promote student learning. However, educators often struggle to provide appropriate intervention options for students with complex needs. Barriers to implementing such Tier 3 interventions within a multi-tiered system of supports (MTSS) framework include time to communicate and collaborate, beliefs about how children learn, and lack of behavioral expertise. The technology tool ibestt (integrating behavior support and team technology) was developed to remediate these barriers by guiding intervention teams through the process of providing supports. This project will evaluate the efficacy of ibestt for improving the behavioral and educational outcomes of elementary school students with challenging behavior, explore potential mediators and moderators, and determine the intervention's cost-effectiveness.

Project Activities: This project will use a stepped-wedge design to test the effects of ibestt on coach, teacher, and student outcomes. The research team will also explore factors that mediate the effects of ibestt; examine whether school, coach, and student factors moderate the relationships among ibestt, predicted mediators, and outcomes; evaluate implementation outcomes; and determine the cost and cost-effectiveness of ibestt.

Products: This project will result in evidence of ibestt's efficacy for improving outcomes for students in need of Tier 3 behavioral supports along with revised training manuals for each role in Tier 3 teams and school-wide training materials. Products will also include a final dataset to be shared, peer-reviewed publications and presentations, and additional dissemination products that reach education stakeholders such as practitioners and policymakers.

Structured Abstract

Setting: This project will take place in elementary schools implementing behavioral MTSS in urban and rural parts of Washington, Illinois, and Kentucky.

Sample: The project participants will include 3-5 coaches (school-based personnel who serve on the school's behavior support team) per school in each of 33 elementary schools, for a total of about 132 coaches. Approximately 5 teachers in each school will also participate, each working with one student and their family, leading to an anticipated total of approximately 165 teachers and 165 families/students. Students will qualify if they are in general and/or special education classrooms, experience chronic behavioral challenges, and are referred by a teacher or MTSS school team for Tier 3 services.

Intervention: The intervention, ibestt, is a technology tool that addresses challenges of implementing Tier 3 supports. Following a teacher's request for help with an individual student, the intervention is used by teams made up of "coaches" (behavior specialists) and teachers, as well as others such as psychologists, counselors, and parents. ibestt guides these teams through each step of the process of developing and implementing positive behavior supports, such as conducting functional behavioral assessments and creating quality behavior improvement plans while avoiding common pitfalls that can interfere with effective implementation (including lack of coordination, documentation, or coaching). The ibestt tool provides easy access to data collection within the application for efficient data-based decision making among team members; provides teams with access to documents, coaching resources, and features embedded within the technology tool; and makes professional development materials easily accessible for self-directed training and ongoing support.

Research Design and Methods: The research team will use a stepped-wedge design in which all cohorts begin in the control condition in the first year and are then randomly assigned to cross over to the intervention condition at different starting points. The first cohort will begin the intervention condition at the beginning of year 2, the second cohort will start in year 3, and the third cohort in year 4. Each cohort will remain in the intervention condition after its crossover point. Random assignment will be stratified to balance school- and district-level factors. Coaches and teachers will complete surveys at the beginning, middle, and end of each school year to assess potential mechanisms of change (such as knowledge of behavioral supports, quality of communication and collaboration, and self-efficacy to use functional behavioral assessment), implementation variables (ibestt usability, feasibility, acceptability, appropriateness, adoption, and use), and potential moderators (school characteristics, school leadership support, school implementation climate, and coach behavioral expertise). Teachers and parents will complete surveys and interviews pre- and post-intervention (14 weeks after baseline) to assess student behavioral outcomes and the teacher-student relationship. Additional data collection will include academic records for selected students at the end of each school year as well as time diaries and interviews with a random sample of participants (school leaders, teachers, and coaches) to assess costs.
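The stepped-wedge schedule described above can be sketched in a few lines of Python. This is an illustrative sketch only: the cohort labels and a four-year observation window are assumptions for the example, not details taken from the study protocol.

```python
# Illustrative stepped-wedge schedule: all cohorts start in control, each
# crosses over to the intervention at a different (assumed) year, and each
# stays in the intervention condition thereafter.
YEARS = [1, 2, 3, 4]
CROSSOVER_YEAR = {"cohort_1": 2, "cohort_2": 3, "cohort_3": 4}  # assumed labels

def condition(cohort: str, year: int) -> str:
    """Return the condition a cohort is in during a given study year."""
    return "intervention" if year >= CROSSOVER_YEAR[cohort] else "control"

schedule = {c: [condition(c, y) for y in YEARS] for c in CROSSOVER_YEAR}
for cohort, conditions in schedule.items():
    print(cohort, conditions)
```

The resulting matrix makes the design's key property visible: by the final year every cohort is in the intervention condition, and each cohort serves as its own control during its pre-crossover years.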

Control Condition: Within the stepped-wedge design, all schools will begin in the control condition before moving into the intervention phase. In the control condition, schools will engage in their usual Tier 3 individualized behavior supports.

Key Measures: Key student outcome measures include the Behavior Assessment System for Children, the Student-Teacher Relationship Scale-Short Form, and academic records (attendance, disciplinary actions, grades). ibestt implementation outcomes will be measured through coach and teacher ratings on the Acceptability of Implementation Measure, Implementation Appropriateness Measure, Feasibility of Implementation Measure, and System Usability Scale, along with use of the website/app with an enrolled student and participant reports on the amount of ibestt use. Open-ended items will probe for explanations of these scores. Teacher and coach mechanisms of change (potential mediators) will be assessed by researcher-developed measures of the quality of team communication, collaboration, and efficiency and the quality of coaching practices; the ibestt Self-Efficacy Measure; a survey item on the degree to which coaches and teachers use data-based decision making; the Beliefs About Behavior Survey; and the percentage of teachers in each school who refer a student. Behavior plan implementation and fidelity (teacher and coach outcomes) will be assessed using the Technical Adequacy Tool for Evaluation and degree of plan completion. Moderators will be assessed using data on school characteristics (school size, demographics, percent of students receiving free or reduced-price meals, percent of students with disabilities, attendance rate, and disciplinary rates), the SWPBIS Tiered Fidelity Inventory, the School Implementation Leadership Scale, the School Implementation Climate Scale, and information on coach behavioral expertise (experience and training).

Data Analytic Strategy: The research team will analyze the quantitative data to determine ibestt's impact on outcomes and mechanisms of change using mixed effects models that account for nesting of time, student, teacher, coach, and school, and will include two types of time trends—cohort time and study time. Implementation measures will be analyzed with mixed methods, using qualitative data to expand on findings from the quantitative analyses to help explain participant scores. Qualitative analysis will include a conventional content analysis to derive meaning without identifying a priori codes. Potential mechanisms of change will be tested using multi-level mediation that extends path analysis to a multi-level framework. Moderators and moderated mediation will also be tested.
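The distinction between the two time trends in the analytic plan can be made concrete with a small sketch. Here "study time" is read as years elapsed since the study began (shared by all cohorts) and "cohort time" as years elapsed since a cohort's crossover into the intervention; both interpretations, and the crossover years used, are assumptions for illustration, not definitions from the analysis plan.

```python
# Hedged sketch of the two time trends: study time advances identically for
# all cohorts, while cohort time stays at 0 until a cohort crosses over and
# then counts years in the intervention condition. Crossover years assumed.
CROSSOVER_YEAR = {"cohort_1": 2, "cohort_2": 3, "cohort_3": 4}

def time_trends(cohort: str, year: int) -> tuple[int, int]:
    """Return (study_time, cohort_time) for one yearly observation."""
    study_time = year - 1                                # years since study start
    cohort_time = max(0, year - CROSSOVER_YEAR[cohort])  # years since crossover
    return study_time, cohort_time

# Example: in year 4, cohort_2 has been in the study 3 years but in the
# intervention condition only 1 year.
print(time_trends("cohort_2", 4))
```

Separating these two clocks is what lets a stepped-wedge analysis disentangle the intervention effect from secular change: a trend shared across all cohorts loads on study time, while an effect that appears only after crossover loads on cohort time.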

Cost Analysis: The ingredients method will be used to estimate costs by identifying the quantity and quality of all resources required to implement the intervention and assigning prices to each. The intervention's cost effectiveness will be analyzed to determine the annual cost per student for each unit of effectiveness, focusing on behavior and academic outcomes.
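The arithmetic behind the ingredients method is straightforward and can be sketched as follows. Every resource name, quantity, price, caseload, and effect size below is a hypothetical placeholder, not a figure from this project.

```python
# Minimal sketch of the ingredients method: enumerate each resource
# ("ingredient") the intervention requires, assign it a price, sum to a
# total cost, then express cost per student and per unit of effectiveness.
# All values are illustrative assumptions, not project data.
ingredients = [
    # (resource, quantity, unit price in dollars)
    ("coach hours", 40, 55.0),
    ("teacher hours", 25, 45.0),
    ("software license", 1, 300.0),
]

total_cost = sum(qty * price for _, qty, price in ingredients)
students_served = 5    # assumed per-school Tier 3 caseload
effect_size = 0.4      # hypothetical standardized effect on a behavior outcome

cost_per_student = total_cost / students_served
cost_effectiveness = cost_per_student / effect_size
print(f"total ${total_cost:.2f}, per student ${cost_per_student:.2f}")
```

The cost-effectiveness ratio (cost per student divided by the effect size) is what allows comparison across interventions: a cheaper intervention with a smaller effect can still be the less cost-effective option.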

Related IES Projects: Development of a Web-based Integrated Behavior Support and Teacher Coaching System for Early Childhood Settings (R324A180061)