Project Activities
Structured Abstract
Setting
Sample
Research design and methods
Data analytic strategy
All analyses will be completed separately for each of the four BRCs, but the same statistical procedures will be followed across the BRCs. To test for equivalence between the randomly assigned intervention and comparison groups, statistical significance will be calculated for each BRC using two-sample t-tests for continuous and ordinal variables and Fisher's exact test for dichotomous variables. Statistical significance of the treatment effect will be obtained either from hierarchical linear modeling (HLM) regressions or from robust regressions. Effect sizes will be calculated by dividing the estimated treatment effect by the standard deviation of the baseline measurements. In addition, key differences among the four interventions will be discussed for each BRC, and descriptive comparisons will address the fourth research question posed for the NBRCC: How do these effects vary across the examined interventions?
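As an illustrative sketch only (not the BRCs' actual analysis code), the following Python fragment shows how these steps could be carried out; the column names (treatment, baseline_score, post_score, frl_status, classroom_id) are hypothetical.

import pandas as pd
from scipy import stats
import statsmodels.formula.api as smf

def baseline_equivalence(df: pd.DataFrame):
    """Test baseline equivalence of the intervention and comparison groups."""
    treat = df.loc[df["treatment"] == 1, "baseline_score"]
    comp = df.loc[df["treatment"] == 0, "baseline_score"]
    t_stat, t_p = stats.ttest_ind(treat, comp)    # continuous/ordinal measure
    table = pd.crosstab(df["treatment"], df["frl_status"])
    odds_ratio, f_p = stats.fisher_exact(table)   # dichotomous measure
    return t_p, f_p

def treatment_effect(df: pd.DataFrame):
    """Estimate the treatment effect with a two-level (HLM-style) model
    and convert it to an effect size."""
    # Students nested within classrooms: random intercept for each classroom.
    model = smf.mixedlm("post_score ~ treatment + baseline_score",
                        data=df, groups=df["classroom_id"])
    result = model.fit()
    effect = result.params["treatment"]
    # Effect size = estimated treatment effect / SD of the baseline measurements.
    effect_size = effect / df["baseline_score"].std()
    return effect, result.pvalues["treatment"], effect_size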
People and institutions involved
IES program contact(s)
Products and publications
Products: The expected products from this contract include:
- Annual Reports by the NBRCC on information learned about effective practices, and
- Presentations of NBRCC findings to consumers, practitioners, and policymakers.
Journal article, monograph, or newsletter
Wagner, M.M., Sumi, W.C., Woodbridge, M.W., Javitz, H.S., and Thornton, S.P. (2009). The National Behavior Research Coordination Center: Coordinating Research and Implementation of Evidence-Based School Interventions for Children With Serious Behavior Problems. Journal of Emotional and Behavioral Disorders, 17(4): 244-249. doi:10.1177/1063426609343593
Nongovernment report, issue brief, or practice guide
Woodbridge, M., Sumi, W.C., Thornton, P., Javitz, H., Wagner, M., and Shaver, D. (2009). Evaluation Results for Four Behavior Interventions. Menlo Park, CA: National Behavior Research Coordination Center, SRI International.
Proceeding
Sumi, W.C., Woodbridge, M., and Wagner, M. (2009). The National Behavior Research Coordination Center: Overview and Final Findings. In Proceedings of the 22nd Annual Research Conference (pp. 120-121). Tampa, FL: Research and Training Center for Children's Mental Health, University of South Florida.
Wagner, M., Sumi, W.C., and Woodbridge, M. (2007). Measuring the Effectiveness of School-Based Interventions for Children With Serious Behavior Problems. In Proceedings of the 20th Annual Research Conference (pp. 205-210). Tampa, FL: Research and Training Center for Children's Mental Health, University of South Florida.
Supplemental information
Co-Principal Directors: Michelle Woodbridge and Carl Sumi
Specific Research Questions:
- Do the examined interventions improve the school behavior of students with severe behavior problems?
- Do the examined interventions improve the academic performance and participation of students with severe behavior problems?
- Are the effects of the examined interventions sustained for 1 year?
- How do these effects vary across the examined interventions?
- For whom (e.g., student grade level, gender, severity of behavior problems) do the examined interventions work best? Least well?
- In what classroom and school contexts (e.g., schools with behavior support systems, more highly qualified teachers) do the examined interventions work best? Least well?
- How do fidelity (i.e., procedural adherence, quality, and intensity), social validity from the teacher's perspective, and the degree of teacher/consultant alliance vary across the examined interventions?
- What variations in context (e.g., school context, teacher characteristics) relate to implementation of the intervention?
Duration: 5 years (September 30, 2004–September 30, 2009)
A common set of key instruments is used across the four BRCs, covering the student, classroom, school, and implementation domains. Student-level information is collected from school records (IEP/504 plan status, services received, discipline referrals, absences, suspensions, expulsions); a student demographic survey (age, grade, race/ethnicity, free/reduced-price lunch status, English-language learner status); observations of student academic engaged time; direct assessments of reading skills; and teacher ratings of social skills, problem behaviors, and academic competence. Classroom-level information is collected from teacher surveys of classroom characteristics (e.g., instructional practices, student composition), teacher characteristics (e.g., education, experience, credentials), teacher supports received, and self-reported skills in working with students with behavior problems; an observation of the classroom climate is also conducted. School-level information is collected from an administrator report (e.g., mobility, incidents of violence) and from the Common Core of Data, a national database on school characteristics (e.g., grade levels served, student/teacher ratio, enrollment); observations and interviews are also completed to assess the school environment relative to positive behavior supports. Data on intervention implementation are collected through measures of fidelity, social validity, and alliance.
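A minimal sketch of how records in these four domains might be organized for analysis is shown below; the field names are hypothetical illustrations drawn from the instruments listed above, not the NBRCC's actual data dictionary.

from dataclasses import dataclass

@dataclass
class StudentRecord:
    iep_504_status: bool            # school records
    discipline_referrals: int
    absences: int
    academic_engaged_time: float    # classroom observation
    reading_score: float            # direct assessment
    social_skills_rating: float     # teacher rating

@dataclass
class ClassroomRecord:
    teacher_experience_years: int   # teacher survey
    teacher_credentialed: bool
    classroom_climate_score: float  # classroom observation

@dataclass
class SchoolRecord:
    enrollment: int                 # Common Core of Data
    student_teacher_ratio: float
    pbs_rating: float               # positive behavior supports observation/interview

@dataclass
class ImplementationRecord:
    fidelity: float                 # procedural adherence, quality, intensity
    social_validity: float          # teacher's perspective
    alliance: float                 # teacher/consultant alliance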
BRC specific information:
The Oregon Research Institute BRC is evaluating the First Steps to Success intervention, a 3-month intervention incorporating three components (screening, school intervention, and home intervention) in an effort to improve the behavior and academic performance of students with severe behavior problems. The intervention is based on the theory that a preventive, rather than reactive, approach to early signs of poor social adjustment, one that pursues secondary prevention goals and involves both teachers and families in supporting students' behavior change, will more effectively transform emerging severe behavior problems.
The University of Washington BRC is evaluating the Check, Connect, and Expect (CC&E) program. CC&E is based on the theory that relationships with school staff, reinforcement of clear expectations and social behavior, and engagement in school activities contribute to improved academic and social outcomes of students. Therefore, the intervention focuses on improving students' positive relationships and prosocial behavior via increased school staff reinforcement and feedback. Students not completely successful with CC&E will receive an additional intensive, functionally based, individualized intervention developed by the district behavior specialist, a behavior coach, and the classroom teacher.
The Vanderbilt BRC is evaluating the Classroom Management and Academic Tutoring (CMAT) program. The program instructs teachers in classroom management techniques designed to improve classroom behavior, including the effective use of a peer contingency game; in addition, an academic tutoring component consists of tutoring in reading. These interventions are based on the theory that student behavior is directly affected by the classroom environment and practices, so training and motivating teachers to engage in practices known to improve the classroom environment will result in improved student behavior and learning. Because academic success hinges on reading skills, it will be enhanced directly by reading instruction and indirectly by improved student behavior.
The University of South Florida BRC is evaluating the Prevent-Teach-Reinforce (PTR) intervention. PTR is modeled after a positive behavior supports approach and is a team process through which an individualized intervention is developed and implemented. PTR is based on the theory that well-conducted functional behavioral assessments and sound positive behavior support plans for children with severe behavior problems will: (a) decrease the occurrence of maladaptive target behaviors, (b) increase the occurrence of appropriate prosocial behaviors, and (c) consequently produce positive outcomes in the areas of behavior, academics, and lifestyle changes for the child and family.
Current Status: As of December 2006, the NBRCC has provided technical assistance and has coordinated and troubleshot the BRCs' data collection efforts (e.g., addressing concerns about data collection with Spanish-speaking participants). Although data collection began in the fall of 2005, variability in the duration of the interventions and in the research designs across the four BRCs means that baseline, posttest, and follow-up measurements are staggered over time. Some data to address questions of effects are currently available for some projects, but data for the full sample will not be available until summer 2007.
Questions about this project?
To answer additional questions about this project or provide feedback, please contact the program officer.