Project Activities
An iterative development process was used to design, refine, and validate the EXPERT Program. Through observations and focus groups, researchers investigated current practices in data-based decision-making among special education and general education teachers to inform the development of the program components. The researchers developed the EXPERT Monitoring Tool, a web-based tool that automatically generates a progress graph from entered curriculum-based measurement (CBM) data and supports self-monitoring of instructional decision-making. The EXPERT Monitoring Tool was pilot tested with reading intervention teachers in two states, with the researchers facilitating ongoing data chats to support use of the tool and collect feedback. Next, a study was conducted with pre-service special education teachers to investigate the instructional decisions made in response to sequentially presented CBM progress graphs and the influence of various graph-related features. These findings informed further refinement of the EXPERT Monitoring Tool and development of the initial professional development training delivered as part of the EXPERT Program. The implementation and promise of the fully developed EXPERT Program were tested through a mixed-methods pilot study. Participants completed an initial 3-hour professional development session, followed by weekly meetings with their individual EXPERT coach. Researchers also examined the costs of implementing this version of the EXPERT Program.
Structured Abstract
Setting
Project activities were conducted in elementary schools in Texas and virtually with in-service teachers across the nation.
Sample
Participants in the pilot study included 16 teachers who provided reading intervention to students with reading difficulties (RD) in grades 3-5 (i.e., special educators, interventionists, dyslexia specialists). Target students (n = 36) were nominated by teachers; all had an identified learning disability in reading per their Individualized Education Program (IEP) and/or were receiving specialized services for dyslexia under a Section 504 plan. The program was delivered by EXPERT coaches (n = 4), all of whom were completing doctoral degrees in special education, had prior experience teaching students with disabilities, and received approximately 20 hours of training focused on the EXPERT Coaching Manual.
Intervention
The aim of the EXPERT Program is to support teachers’ expertise in data-based individualization (DBI), namely their ability to evaluate student assessment data and to understand when ineffective interventions should be changed and how interventions should be individualized using evidence-based practices. The EXPERT Program includes three core components: (1) ongoing data collection; (2) self-monitoring procedures employed through the uniquely designed EXPERT Monitoring Tool; and (3) coaching to provide individualized knowledge and skill development for teachers. Coaching was guided by a manualized set of protocols adapted from Conjoint Behavioral Consultation (CBC), a collaborative model supported by decades of empirical evidence. The EXPERT Program uses a problem-solving process facilitated by the coach, in which teachers are engaged collaboratively in all aspects of the process. The EXPERT Program also includes supplementary components to support coaches’ delivery of individualized support for teachers’ professional learning: a comprehensive stand-alone professional development training and a video repository of mini training modules.
Research design and methods
In the first year, researchers investigated current data use practices among special education and general education teachers through focus groups, interviews, and direct observation. These findings informed the development of the EXPERT Program and of the training modules used to individualize coaching supports. Across the next 2 years, the EXPERT Monitoring Tool was developed and piloted with teachers who delivered reading intervention. Following an initial training, teachers integrated the web-based self-monitoring tool into their own practice and provided feedback during structured, virtual data chats with the researchers. A simulation study was conducted with pre-service special education teachers to gain deeper insight into graph interpretation processes. These findings informed iterative development work the following year, including development of the EXPERT Coaching Manual and a fidelity of implementation measure that captured adherence to the coaching model and coach-teacher discourse. In the final year of the project, researchers conducted a mixed-methods pilot study to explore implementation of the EXPERT Program and its promise for improving teachers’ data literacy skills and subsequent changes in students’ reading achievement.
Control condition
The pilot study design did not include a control condition.
Key measures
Teacher measures completed at pretest and posttest included the following: Teacher Data Use Survey (teachers’ use of data, attitudes toward data, and school-based supports), Data-Driven Decision-Making Efficacy and Anxiety Inventory (self-efficacy for data identification, analysis and interpretation, application of data to instruction, and anxiety related to data use), Teacher Understanding of Evidence-Based Literacy Instructional Practices (knowledge of best practices in elementary reading instruction, intervention, and assessment), and the CBM Knowledge Assessment for Progress Monitoring (knowledge of CBM and graph interpretation). Teachers also completed implementation measures at posttest, including app usability, treatment acceptability, coaching alliance, and coaching evaluation. Students’ reading performance was assessed through the following measures: Woodcock Reading Mastery Test – Third Edition (word attack, word identification, and oral reading fluency), Test of Word Reading Efficiency – Second Edition (sight word efficiency and phonemic decoding efficiency), and CBM Oral Reading Fluency data collected by teachers. Coaches completed measures assessing their perceptions of app usability, coaching alliance, and coaching evaluation. Additional data sources were collected to understand implementation of the EXPERT Program. These included teacher usage data and student graphs from the EXPERT Monitoring Tool, detailed coaching logs, and transcripts of all coaching meetings. Data were collected from project budgets and coaching logs to understand the cost of implementing the EXPERT Program, including costs for personnel, facilities, materials and equipment, and other miscellaneous costs.
Data analytic strategy
During the development of the intervention, qualitative data from focus groups, interviews, and observations were analyzed using content analysis to identify themes. Quantitative data from the knowledge and skills surveys were analyzed descriptively. Multilevel models were used to determine whether there was a statistically significant difference between pretest and posttest scores on teacher outcomes of interest. Differential change from pretest to posttest based on teacher knowledge and self-efficacy was estimated by expanding the model to include interaction terms that account for the moderating effects of these variables. Implementation data were analyzed descriptively and through analysis of transcripts derived from audio recordings of coaching meetings.
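The pre/post multilevel model described above can be sketched as follows. This is a minimal illustration with simulated data, not the project's actual analysis or variables: repeated measures (pretest and posttest) are nested within teachers, a random intercept accounts for teacher-level clustering, the `time` coefficient estimates the average pre-to-post change, and a `time:efficacy` interaction (using a hypothetical self-efficacy moderator) estimates differential change.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_teachers = 16  # matches the pilot sample size; data are simulated

# Long-format data: two waves (pretest = 0, posttest = 1) per teacher
teacher = np.repeat(np.arange(n_teachers), 2)
time = np.tile([0, 1], n_teachers)
efficacy = np.repeat(rng.normal(0, 1, n_teachers), 2)      # hypothetical moderator
u = np.repeat(rng.normal(0, 0.5, n_teachers), 2)           # teacher random intercepts
score = 50 + 5 * time + 2 * time * efficacy + u + rng.normal(0, 1, 2 * n_teachers)

df = pd.DataFrame({"teacher": teacher, "time": time,
                   "efficacy": efficacy, "score": score})

# Random intercept for teacher; `time` captures average pre-to-post gain,
# `time:efficacy` captures moderation of that gain by baseline self-efficacy.
model = smf.mixedlm("score ~ time + time:efficacy", df, groups=df["teacher"])
result = model.fit()
print(result.summary())
```

With a single-group pre/post design like this pilot, the `time` effect describes change over the program rather than a causal program effect, which is why the report frames results as "promise" rather than efficacy.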
Cost analysis strategy
Researchers conducted a cost-efficiency analysis by calculating ratios that reflect the cost of producing the observed effects. This analysis focused on assessing the efficiency of the EXPERT Program in generating measurable improvements.
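The ratio calculation behind a cost-efficiency analysis can be illustrated with a toy computation. All figures below are invented for illustration, not the project's actual costs or effects: the idea is simply total cost divided by the observed gain, so the ratio reads as "cost per unit of improvement."

```python
# Hypothetical inputs (NOT the project's actual numbers)
total_cost = 48_000.0   # e.g., personnel, facilities, materials (USD)
n_teachers = 16         # number of teachers served
mean_gain = 0.40        # e.g., average standardized pre-to-post gain

cost_per_teacher = total_cost / n_teachers          # 3000.0
cost_per_effect_unit = cost_per_teacher / mean_gain # 7500.0

print(f"Cost per teacher: ${cost_per_teacher:,.2f}")
print(f"Cost per unit of gain: ${cost_per_effect_unit:,.2f}")
```

Ratios like these allow a program's cost to be compared against alternatives that produce similar improvements.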
Key outcomes
The main findings of this project, as reported by the principal investigator, are as follows:
- There were increases in teachers’ reported data use, efficacy related to data collection and interpretation, and knowledge of CBM throughout the EXPERT Program.
- Coaches were able to implement the EXPERT Program with fidelity, resulting in high levels of satisfaction reported by both teachers and coaches.
- Analysis of coaching transcripts supports validity of the coaching model developed for the EXPERT Program. Coaches demonstrated higher levels of fidelity for meetings that followed more structured protocols than for those that customized the experience to meet the teacher’s individualized learning goals.
- There were substantial recruitment challenges experienced throughout the project, which would need to be considered in planning for a large-scale efficacy trial.
- Overall, the EXPERT Program shows promise as an approach to providing professional support focused on DBI.
Products and publications
Select Publications:
Toste, J. R., Fluhler, S. K., Farris, E. A., & Chandler, B. W. (2025). What’s in a word? Analyzing students’ oral reading fluency to inform instructional decision-making. Intervention in School and Clinic.
Fry, E., Toste, J. R., Feuer, B., & Espin, C. A. (2024). A systematic review of CBM content in practitioner-focused journals: Do we talk about instructional decision-making? Journal of Learning Disabilities, 57(5), 275-290.
Toste, J. R., Filderman, M. J., Clemens, N. H., & Fry, E. (2024). Graph out loud: Pre-service teachers’ data decisions and interpretations of CBM progress graphs. Journal of Learning Disabilities, 58(1), 33-45.
Toste, J. R., Filderman, M. J., & Espin, C. (2023). Data teams in teacher preparation programs: Improving data-based instruction in reading. Intervention in School and Clinic, 59(1), 40-47.