Although high school graduation rates are rising--the national rate was 82 percent during the 2013/14 school year (U.S. Department of Education, 2015)--dropping out remains a persistent problem in the Midwest and nationally. Many schools now use early warning systems to identify students who are at risk of not graduating, with the goal of intervening early to help students get back on track for on-time graduation. Although research has guided decisions about the types of data and indicators used to flag students as being at risk, little is known about the impact of early warning systems on students and schools--and in particular, whether these systems do help get students back on track.
This study, designed in collaboration with the REL Midwest Dropout Prevention Research Alliance, examined the impact and implementation of one early warning system--the Early Warning Intervention and Monitoring System (EWIMS)--on student and school outcomes. To assess the impact of EWIMS on student and school outcomes, 73 high schools in three Midwest Region states were randomly assigned to implement EWIMS during the 2014/15 school year (37 EWIMS schools) or to continue their usual practices for identifying and supporting students at risk of not graduating on time and to delay implementation of EWIMS until the following school year (36 control schools). The study included 37,671 students in their first or second year of high school, with 18,634 students in EWIMS schools and 19,037 students in control schools. EWIMS and control schools and students were similar on all background characteristics prior to random assignment. The study examined the impacts of EWIMS on indicators of student risk and on student progress in school after the first year of EWIMS adoption.
The study found that EWIMS reduced the percentage of students with risk indicators related to chronic absence and course failure but not those related to low GPAs or suspension: (1) the percentage of students who were chronically absent (missed 10 percent or more of instructional time) was lower in EWIMS schools (10 percent) than in control schools (14 percent), a statistically significant 4 percentage point difference; (2) the percentage of students who failed one or more courses was lower in EWIMS schools (21 percent) than in control schools (26 percent), a statistically significant 5 percentage point difference; (3) the percentage of students who had a low GPA (2.0 or lower) was 17 percent in EWIMS schools and 19 percent in control schools, a difference that was not statistically significant, although sensitivity analyses that used continuous GPA data instead of the binary risk indicator showed that average GPAs were significantly higher in EWIMS schools (2.98) than in control schools (2.87); and (4) the percentage of students who were suspended once or more was 9 percent in both EWIMS and control schools, with no statistically significant difference. EWIMS did not have an impact on student progress in school: there was no statistically significant difference between EWIMS and control schools in the percentage of students who earned insufficient credits to be on track to graduate within four years (14 percent in both). At the school level, EWIMS did not have a detectable impact on school data culture, that is, the ways in which schools use data to make decisions and identify students in need of additional support. In nearly all participating schools, overall implementation of the EWIMS seven-step process was low and proved challenging.
Nevertheless, EWIMS schools were more likely than control schools to report using an early warning system and having a dedicated team to identify and support at-risk students, although EWIMS schools did not differ from control schools in the frequency of data review or in the number and type of interventions offered. This report provides rigorous initial evidence that, even with limited implementation during the first year of adoption, using a comprehensive early warning system can reduce the percentage of students who are chronically absent or who fail one or more courses. These short-term results are promising because chronic absence and course failure in grades 9 and 10 are two key indicators that students are off track for on-time graduation. However, because the prior research linking these indicators to on-time graduation is correlational, it is not yet known whether improving the indicators leads to improved on-time graduation rates. Also, EWIMS did not have a detectable impact on other measured indicators related to students' likelihood of on-time graduation, including low GPAs, suspensions, and earning insufficient credits. Future research is needed to better understand the mechanisms through which EWIMS had an impact on chronic absence and course failure and why EWIMS did not affect other outcomes. In particular, studies could focus on identifying which staff actions and student experiences lead to improved student outcomes. Studies should also examine whether schools achieve improved overall implementation in subsequent years and whether (and how) the observed impacts fade, grow, or extend to other risk indicators (low GPAs and suspensions); to intermediate outcomes (including student persistence and progress in school); and to long-term outcomes (including dropout and on-time graduation rates).
The following are appended: (1) Planned implementation of the Early Warning Intervention and Monitoring System; (2) Recruitment, random assignment, and study sample; (3) Data collection and analytic methods; (4) Detailed findings and supplementary analyses; and (5) Disclosure of potential conflicts of interest.
ERIC Descriptors: Academic Persistence, Achievement Gains, At Risk Students, Attendance Patterns, Average Daily Attendance, Control Groups, Dropout Prevention, Experimental Groups, Failure, Grade Point Average, Graduation, Graduation Rate, High School Students, Intermode Differences, Intervention, Low Achievement, Online Surveys, Outcome Measures, Program Effectiveness, Program Implementation, Progress Monitoring, Randomized Controlled Trials, Regression (Statistics), School Holding Power, Scoring Rubrics, Statistical Significance, Suspension
Midwest | Publication Type: Impact Study | Publication Date: April 2017