
IES Grant

Title: Project VIABLE-II: Unified Validation of Direct Behavior Rating (DBR) in a Problem-solving Model
Center: NCSER Year: 2011
Principal Investigator: Chafouleas, Sandra Awardee: University of Connecticut
Program: Social, Emotional, and Behavioral Competence
Award Period: 7/1/11–6/30/15 Award Amount: $2,332,829
Type: Measurement Award Number: R324A110017

Purpose: Despite an increased emphasis on prevention and early intervention to improve students' social, emotional, and behavioral skills, there is a substantial gap in the availability of behavioral assessments to identify students in need of additional support (screening) and to monitor their response (progress monitoring). Previous work by this research team led to the development of Direct Behavior Rating (DBR) scales, an assessment method that combines the strengths of systematic direct observation and behavior rating scales. That work reported recommended instrumentation and procedures, along with evidence of psychometric adequacy, for using DBR scales to assess academic engagement, respectful behavior, and disruptive behavior. It also revealed the need for a unified screening and progress monitoring tool, and the research team will now extend this systematic line of research to evaluate DBR scales for use in supporting problem-solving models of service delivery for both screening and progress monitoring.

Project Activities: The research activities include: a) validation of DBR for use in screening; b) validation of DBR for progress monitoring; and c) examination of foundational psychometric properties. With regard to screening assessment, the researchers will establish appropriate cut-points for current and predictive student risk in both elementary and middle school student samples located in districts across three states. Concurrently, the team will examine traditional psychometric indicators (e.g., construct validity, criterion-related validity, reliability) along with other forms of information relevant to score interpretation and use (e.g., social and educational consequences, relevance, and utility).

Products: Products of this project include a validated DBR tool for screening and progress monitoring, and published reports on the psychometric indicators and information relevant to score interpretation and use.

Structured Abstract

Setting: The project will take place in elementary and middle schools in Connecticut, North Carolina, and New York.

Population: Approximately 2,000 elementary and middle school students and their teachers across 120 classrooms will participate.

Intervention: There is no intervention. The assessment method, Direct Behavior Rating (DBR) scales, combines the strengths of a rating scale (e.g., efficient data recording) with the benefits of systematic direct observation (e.g., data recording at the time and place of the behavior). In brief, a rater (e.g., a teacher) completes the DBR by quantifying his or her perception of a well-defined student behavior, rating it in close proximity (in time and place) to the observation. For example, a teacher might use DBR to estimate the proportion of time Student A was academically engaged during science instruction on Tuesday. DBR was developed for rating academic engagement, respectful behavior, and disruptive behavior.

Research Design and Methods: The study will be conducted in four phases: validation for use in screening, validation for use in progress monitoring, examination of foundational psychometric properties, and unified validation in a problem-solving model (e.g., utility). Data from DBR and criterion measures will be collected from teachers three times per year to allow for descriptive analyses of DBR data trends over time. A stratified random subsample of students in the screening group will be tracked over four years (assessed two times per year) to determine whether DBR measures are predictive of long-term behavioral challenges. Single-case methodology will be used to investigate the impact of DBR use on teacher problem-solving behavior and track DBR sensitivity to change. In addition, multi-trait, multi-method analyses will be used to determine how much of the variation in obtained DBR scores is attributable to the traits (behaviors) relative to the method. A portion of teachers in this phase will also participate in a verbal protocol analysis study where they will "think aloud" as they watch videotapes of students and rate student behavior.

Control Condition: There is no control condition.

Key Measures: Key measures include: DBR scales; behavior rating scales, including the Social Skills Improvement System (SSIS) and the Behavior Assessment System for Children-2 (BASC-2) Behavioral and Emotional Screening System (BESS) and Teacher Report Scale (TRS); systematic direct observation; student record review; Curriculum-Based Measurement–Reading (CBM–R); the Problem-Solving-Behavior (P-S-B) Checklist developed by the research team; and the Usage Rating Profile (a measure of perceived utility).

Data Analytic Strategy: A variety of analytic methods will be used to assess the DBR. Receiver operating characteristic (ROC) analysis will be used to set DBR cut scores. Consistency and consensus estimates and hierarchical linear modeling (HLM) will be used to examine trends over time. Multi-trait, multi-method analyses will be conducted using confirmatory factor analysis. For progress monitoring, descriptive data will be summarized and formative data will be examined using visual analysis strategies. Qualitative data collected through the "think aloud" study will be reviewed and coded for key topics of interest to the study (e.g., teacher ability to differentiate between engagement and disruptive behavior).
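The ROC step described above can be illustrated with a minimal sketch. Everything below is hypothetical and not the project's actual procedure: the 0–10 risk ratings, the criterion outcomes, and the use of Youden's J (sensitivity + specificity − 1) as the criterion for choosing the cut score are all illustrative assumptions.

```python
# Illustrative sketch only: selecting a screening cut score from rating data
# via ROC analysis, using Youden's J to balance sensitivity and specificity.
# All data and scale choices below are hypothetical.

def roc_cut_score(scores, at_risk):
    """Return (cut, J) for the candidate cut score maximizing Youden's J.

    scores  -- ratings on a hypothetical 0-10 scale (higher = more risk)
    at_risk -- parallel booleans from a criterion measure
    """
    positives = sum(at_risk)
    negatives = len(at_risk) - positives
    best_j, best_cut = -1.0, None
    for cut in sorted(set(scores)):
        # Classify "at risk" when the rating meets or exceeds the cut score.
        tp = sum(1 for s, r in zip(scores, at_risk) if r and s >= cut)
        fp = sum(1 for s, r in zip(scores, at_risk) if not r and s >= cut)
        sensitivity = tp / positives
        specificity = 1 - fp / negatives
        j = sensitivity + specificity - 1
        if j > best_j:
            best_j, best_cut = j, cut
    return best_cut, best_j

# Hypothetical ratings and criterion outcomes for ten students
scores = [1, 2, 2, 3, 5, 6, 7, 8, 9, 9]
at_risk = [False, False, False, False, False, True, True, True, True, True]
cut, j = roc_cut_score(scores, at_risk)
print(cut, j)
```

In practice, a cut score might instead be chosen to favor sensitivity over specificity (screening tools often tolerate false positives to avoid missing students in need), so Youden's J is only one of several defensible criteria.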

Related IES Projects: Enhancing Ci3T: Building Professional Capacity for High Fidelity Implementation to Support Students' Educational Outcomes (Project ENHANCE) (R324N190002); Exploring the Status and Impact of School-Based Behavior Screening Practices in a National Sample: Implications for Systems, Policy, and Research (R305A140543); Project VIABLE: Validation of Instruments for Assessing Behavior Longitudinally and Efficiently (R324B060014); Project EASS-E: Expanding Approaches to School Screening With Equity (R305A220249).


Journal article, monograph, or newsletter

Johnson, A.H., Miller, F., Chafouleas, S.M., and Kooken, J. (2011). Direct Behavior Rating: Updates and Current Research Directions. The Connecticut School Psychologist, 17(3): 5–6.

Johnson, A.H., Miller, F.G., Chafouleas, S.M., Riley-Tillman, T.C., Fabiano, G.A., and Welsh, M.E. (2016). Evaluating the Technical Adequacy of DBR-SIS in Tri-Annual Behavioral Screening: A Multisite Investigation. Journal of School Psychology, 54: 39–57. doi:10.1016/j.jsp.2015.10.001

Kooken, J., Welsh, M.E., McCoach, D.B., Miller, F.G., Chafouleas, S.M., Riley-Tillman, T.C., and Fabiano, G. (in press). Test Order in Teacher-Rated Behavior Assessments: Is Counterbalancing Necessary? Psychological Assessment. doi:10.1037/pas0000314

Miller, F.G., Crovello, N., and Chafouleas, S.M. (2017). Progress Monitoring the Effects of Daily Report Cards Across Elementary and Secondary Settings Using Direct Behavior Rating–Single Item Scales. Assessment for Effective Intervention. doi:10.1177/1534508417691019

Miller, F.G., Chafouleas, S.M., Riley-Tillman, T.C., and Fabiano, G.A. (2014). Teacher Perceptions of the Usability of School-Based Behavior Assessments. Behavioral Disorders, 39(4): 201–210.

Miller, F.G., Cohen, D., Chafouleas, S.M., Riley-Tillman, T.C., Welsh, M.E., and Fabiano, G.A. (2015). A Comparison of Measures to Screen for Social, Emotional, and Behavioral Risk. School Psychology Quarterly, 30(2): 184–196. doi:10.1037/spq0000085

Miller, F.G., Patwa, S.S., and Chafouleas, S.M. (2014). Using Direct Behavior Rating–Single Item Scales to Assess Student Behavior Within Multi-Tiered Systems of Support. Journal of Special Education Leadership, 27(2): 76–85.

Miller, F.G., Riley-Tillman, T.C., Chafouleas, S.M., and Schardt, A.A. (2016). Direct Behavior Rating Instrumentation: Evaluating the Impact of Scale Formats. Assessment for Effective Intervention. doi:10.1177/1534508416658007

Riley-Tillman, T.C., Sims, W., and Cohen, D. (in press). Daily Report Cards and Direct Behavior Ratings as an Intervention/Assessment Package. Assessment for Effective Intervention.