Archived Information

REL Midwest Ask A REL Response

Data Use

February 2020


What research is available on attendance rates and high absenteeism?


Following an established Regional Educational Laboratory (REL) Midwest protocol, we conducted a search for research reports, descriptive studies, and policy overviews on attendance rates and high absenteeism, specifically in the Midwest. In addition, we focused on identifying resources related to improving student attendance. For details on the databases and sources, keywords, and selection criteria used to create this response, please see the Methods section at the end of this memo.

Below, we share a sampling of the publicly accessible resources on this topic. References are listed in alphabetical order, not necessarily in order of relevance. The search conducted is not comprehensive; other relevant references and resources may exist. For each reference, we provide an abstract, excerpt, or summary written by the study’s author or publisher. We have not evaluated the quality of these references but provide them for your information only.

Research References

Arellano, A., Grech, M., & Hubbard, L. (2019). Opportunity for all: 2019 state of Michigan education report. Ann Arbor, MI: Education Trust-Midwest. Retrieved from

From the ERIC abstract: “Since the launch of the Michigan Achieves! campaign, the Education Trust-Midwest (ETM) has tracked Michigan’s performance and progress towards the goal of becoming a top ten state for education opportunity and achievement, focusing on key data-driven indicators. It’s important for Michiganders to understand how the state’s students are performing compared to other states and the nation overall, especially in a globally connected and competitive world. Data from the National Assessment of Educational Progress (NAEP), which is the largest nationally representative assessment that provides for long-term comparisons of what America’s students know and can do in various subject areas, can help provide this national perspective. Results from leading education states on the NAEP assessment—also commonly known as the national assessment or the Nation’s Report Card—show that it’s possible to improve education outcomes dramatically. There are many other indicators, which ETM reports on annually, that also signal that Michigan’s trajectory needs to change. They are: (1) Student Attendance; (2) Access to Rigorous Coursework; (3) College Readiness; and (4) College Attainment. A recent poll of Michigan parents conducted by EPIC-MRA and The Education Trust-Midwest entailed phone interviews with 600 parents in Michigan with one or more children age 18 or younger. This major survey was conducted to better understand how parents—essential stakeholders in public education—think about issues impacting children.”

Balu, R., & Ehrlich, S. B. (2018). Making sense out of incentives: A framework for considering the design, use, and implementation of incentives to improve attendance. Journal of Education for Students Placed at Risk (JESPAR), 23(1–2), 93–106. Retrieved from

From the ERIC abstract: “Accumulating evidence indicates that student attendance is closely tied to a range of educational outcomes, and yet millions of students are chronically absent each year. Under the Every Student Succeeds Act (ESSA), schools are now held accountable for their students’ attendance at a scale this country has never before seen. As such, this is a crucial time to understand what research and evaluations suggest about what schools can do to move the needle on student attendance. As researchers work toward understanding the impact of different interventions and practices, and how results vary by grade level, on-the-ground experiences in schools highlight the pervasive use of incentives from pre-K to grade 12. Schools have employed a wide range of incentives to improve attendance, with varied levels of success. Unfortunately, there is little guidance on what policymakers and practitioners ought to consider when deciding if incentives are an appropriate intervention, and then how to design incentives in ways that align with the nature of specific attendance barriers and problems. This article presents a framework to fill that gap. We outline the design considerations when creating attendance incentives and offer guidance to practitioners deciding what to implement in their school.”

Note: REL Midwest was unable to locate a link to the full-text version of this resource. Although REL Midwest tries to provide publicly available resources whenever possible, we have included this resource because it may be of interest to you. It may be available through university or public library systems.

Faria, A.-M., Sorensen, N., Heppen, J., Bowdon, J., Taylor, S., Eisner, R., et al. (2017). Getting students on track for graduation: Impacts of the Early Warning Intervention and Monitoring System after one year (REL 2017-272). Washington, DC: U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance, Regional Educational Laboratory Midwest. Retrieved from

From the ERIC abstract: “Although high school graduation rates are rising—the national rate was 82 percent during the 2013/14 school year (U.S. Department of Education, 2015)—dropping out remains a persistent problem in the Midwest and nationally. Many schools now use early warning systems to identify students who are at risk of not graduating, with the goal of intervening early to help students get back on track for on-time graduation. Although research has guided decisions about the types of data and indicators used to flag students as being at risk, little is known about the impact of early warning systems on students and schools—and in particular, whether these systems do help get students back on track. This study, designed in collaboration with the REL Midwest Dropout Prevention Research Alliance, examined the impact and implementation of one early warning system—the Early Warning Intervention and Monitoring System (EWIMS)—on student and school outcomes. To assess the impact of EWIMS on student and school outcomes, 73 high schools in three Midwest Region states were randomly assigned to implement EWIMS during the 2014/15 school year (37 EWIMS schools) or to continue their usual practices for identifying and supporting students at risk of not graduating on time and to delay implementation of EWIMS until the following school year (36 control schools). The study included 37,671 students in their first or second year of high school, with 18,634 students in EWIMS schools and 19,037 students in control schools. EWIMS and control schools and students were similar on all background characteristics prior to random assignment. The study examined the impacts of EWIMS on indicators of student risk and on student progress in school after the first year of EWIMS adoption. 
The study found that EWIMS reduced the percentage of students with risk indicators related to chronic absence and course failure but not related to low GPAs or suspension: (1) The percentage of students who were chronically absent (missed 10 percent or more of instructional time) was lower in EWIMS schools (10 percent) than in control schools (14 percent); this 4 percentage point difference was statistically significant; and (2) The percentage of students who failed one or more courses was lower in EWIMS schools (21 percent) than in control schools (26 percent); this 5 percentage point difference was statistically significant; (3) The percentage of students who had a low GPA (2.0 or lower) was 17 percent in EWIMS schools and 19 percent in control schools; this difference was not statistically significant. However, sensitivity analyses that used continuous GPA data instead of the binary risk indicator showed that, on average, GPAs were higher in EWIMS schools (2.98) than in control schools (2.87); this difference was statistically significant; and (4) The percentage of students who were suspended once or more was 9 percent in both EWIMS and control schools; there was no statistically significant difference. EWIMS did not have an impact on student progress in school. That is, there was not a statistically significant difference between EWIMS and control schools in the percentage of students who earned insufficient credits to be on track to graduate within four years (14 percent in both). At the school level, EWIMS did not have a detectable impact on school data culture, that is, the ways in which schools use data to make decisions and identify students in need of additional support. In nearly all participating schools, overall implementation of the EWIMS seven-step process was low, and implementation was challenging. 
Nevertheless, EWIMS schools were more likely than control schools to report using an early warning system and having a dedicated team to identify and support at-risk students, but EWIMS schools did not differ from control schools in the frequency of data review or the number and type of interventions offered. This report provides rigorous initial evidence that even with limited implementation during the first year of adoption, using a comprehensive early warning system can reduce the percentage of students who are chronically absent or who fail one or more courses. These short-term results are promising because chronic absence and course failure in grades 9 and 10 are two key indicators that students are off track for on-time graduation. However, because the past research linking indicators to on-time graduation is correlational, it is not yet known if improving these indicators leads to improving on-time graduation rates. Also, EWIMS did not have a detectable impact on other measured indicators that are related to students’ likelihood of on-time graduation, including low GPAs, suspensions, and earning insufficient credits. Future research is needed to better understand the mechanisms through which EWIMS had an impact on chronic absence and course failure and why EWIMS did not affect other outcomes. In particular, studies could focus on identifying which staff actions and student experiences lead to improved student outcomes. Studies should also examine whether schools achieve improved overall implementation in subsequent years and whether (and how) the observed impacts fade, grow larger, or extend to other risk indicators (low GPAs and suspensions); to intermediate outcomes (including student persistence and progress in school); and to long-term outcomes (including dropout and on-time graduation rates). 
The following are appended: (1) Planned implementation of the Early Warning Intervention and Monitoring System; (2) Recruitment, random assignment, and study sample; (3) Data collection and analytic methods; (4) Detailed findings and supplementary analyses; and (5) Disclosure of potential conflicts of interest.”

Gee, K. A. (2018). Minding the gaps in absenteeism: Disparities in absenteeism by race/ethnicity, poverty and disability. Journal of Education for Students Placed at Risk (JESPAR), 23(1–2), 204–208. Retrieved from

From the ERIC abstract: “Children from certain racial and ethnic minority backgrounds, in poverty, and/or with a disability, often face distinct challenges in attending school, leading them to miss more school relative to their non-minority, more socio-economically advantaged and non-disabled peers. This brief describes these disparities in absenteeism in the US, discusses the challenge of explaining these disparities, and considers the implications that disparities have for addressing absenteeism. Finally, it closes with advice on how schools can make headway in reducing disparities in absenteeism. As this brief argues, while schools can readily document absenteeism gaps, diagnosing the root causes of these gaps remains much more elusive. Further, schools seeking to reduce disparities in absenteeism will not only need to intentionally establish explicit targets to reduce such gaps, but they will need to develop individualized strategies to remove barriers to attendance thereby getting children—especially those facing disproportionate challenges—back into the classroom.”

Note: REL Midwest was unable to locate a link to the full-text version of this resource. Although REL Midwest tries to provide publicly available resources whenever possible, we have included this resource because it may be of interest to you. It may be available through university or public library systems.

Ginsburg, A., Jordan, P., & Chang, H. (2014). Absences add up: How school attendance influences student success. San Francisco, CA: Attendance Works. Retrieved from

From the ERIC abstract: “This state-by-state analysis of national testing data demonstrates that students who miss more school than their peers consistently score lower on standardized tests, a result that holds true at every age, in every demographic group, and in every state and city tested. The analysis is based on the results of the 2013 National Assessment of Educational Progress (NAEP). It compares attendance rates and NAEP scores for every state and for 21 large urban areas.”

Jordan, P. W., & Miller, R. (2017). Who’s in: Chronic absenteeism under the Every Student Succeeds Act. Washington, DC: FutureEd at Georgetown University. Retrieved from

From the introduction: “To help policymakers navigate the challenges of including chronic absenteeism in their school measurement systems, FutureEd has done a comprehensive review of the research on student absenteeism, analyzed the absenteeism provisions in all 51 state ESSA plans submitted to or drafted for the U.S. Department of Education as of the department’s Sept. 18 deadline, and conducted a fresh analysis of federal chronic absenteeism data. Drawing on this research, this report provides a roadmap for leveraging ESSA to keep more students in school and on a path to academic success.”

Perry, M., Gottfried, M., Young, K., Colchico, C., Lee, K., & Chang, H. (2019). Approaches to reducing chronic absenteeism. Stanford, CA: Policy Analysis for California Education. Retrieved from

From the ERIC abstract: “Acknowledging the importance of students simply being in school, California has made student attendance part of its accountability system. This brief covers a session in which it was pointed out that using chronic absenteeism as an accountability measure is new and its underlying causes are not well understood. Even as many schools face the expectation that they take action to address high rates of absenteeism, myths about school attendance persist. The brief includes examples of local efforts to improve student attendance and discusses steps needed to build the capacity of schools and communities to get kids to school and keep them there.”

Robinson, C. D., Lee, M. G., Dearing, E., & Rogers, T. (2018). Reducing student absenteeism in the early grades by targeting parental beliefs. American Educational Research Journal, 55(6), 1163–1192. Retrieved from

From the ERIC abstract: “Attendance in kindergarten and elementary school robustly predicts student outcomes. Despite this well-documented association, there is little experimental research on how to reduce absenteeism in the early grades. This paper presents results from a randomized field experiment in 10 school districts evaluating the impact of a low-cost, parent-focused intervention on student attendance in grades K-5. The intervention targeted commonly held parental misbeliefs undervaluing the importance of regular K-5 attendance as well as the number of school days their child had missed. The intervention decreased chronic absenteeism by 15%. This study presents the first experimental evidence on how to improve student attendance in grades K-5 at scale and has implications for increasing parental involvement in education.”

Stuit, D., O’Cummings, M., Norbury, H., Heppen, J., Dhillon, S., Lindsay, J., et al. (2016). Identifying early warning indicators in three Ohio school districts (REL 2016-118). Washington, DC: U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance, Regional Educational Laboratory Midwest. Retrieved from

From the ERIC abstract: “In partnership with the Midwest Dropout Prevention Research Alliance the study team used student-level data and a five-step process to identify the most accurate indicators of students’ failure to graduate from high school on time. Student-level data came from attendance records, transcripts, and discipline records of grade 8 and 9 students in three Ohio school districts. The study found that the most accurate early warning indicators of students being off track for graduating on time vary by school district and grade level. Overall, the most accurate indicators in both grades were based on coursework (grade point average and number of credits earned). On average, indicators were more accurate in grade 9 than in grade 8. Other districts may be able to use the methods described in this report to identify early warning indicators for their grade 8 and 9 students.”

U.S. Department of Education. (2019). Chronic absenteeism in the nation’s schools: A hidden educational crisis. Washington, DC: Author. Retrieved from

From the introduction: “Students who are chronically absent—meaning they miss at least 15 days of school in a year—are at serious risk of falling behind in school. Yet, for too long, this crisis in our nation’s public elementary and secondary schools has not been fully understood. Now, under the Every Student Succeeds Act, many states are reporting chronic absenteeism data annually. This data story, updated with the 2015–16 Civil Rights Data Collection (CRDC), bolsters efforts to reduce and ultimately eliminate chronic absenteeism so that all students have a better chance of reaching their full potential. The data from the CRDC is drawn from nearly every public school in the country and helps us understand who is chronically absent, at what grade levels chronic absenteeism tends to occur, and how chronic absenteeism compares community-by-community and state-by-state.”

U.S. Department of Education, National Center for Education Statistics. (2011). Average daily attendance (ADA) as a percentage of total enrollment, school day length, and school year length in public schools, by school level and state: 2003-04 and 2007-08. Washington, DC: Author. Retrieved from


Keywords and Search Strings

The following keywords and search strings were used to search the reference databases and other sources:

  • “Average daily attendance” Wisconsin

  • “Attendance patterns”

  • “Attendance patterns” Midwest

Databases and Search Engines

We searched ERIC for relevant resources. ERIC is a free online library of more than 1.6 million citations of education research sponsored by the Institute of Education Sciences (IES). Additionally, we searched IES and Google Scholar.

Reference Search and Selection Criteria

When searching for and reviewing resources, we considered the following criteria:

  • Date of publication: References and resources published in the last 15 years (2005 to present) were included in the search and review.

  • Search priorities of reference sources: Search priority was given to study reports, briefs, and other documents published or reviewed by IES and other federal or federally funded organizations.

  • Methodology: We used the following methodological priorities and considerations in the review and selection of references: (a) study types—randomized controlled trials, quasi-experiments, surveys, descriptive data analyses, literature reviews, policy briefs, and so forth, generally in this order; (b) target population and samples (e.g., representativeness of the target population, sample size, volunteered or randomly selected), study duration, and so forth; and (c) limitations, generalizability of the findings and conclusions, and so forth.

This memorandum is one in a series of quick-turnaround responses to specific questions posed by educational stakeholders in the Midwest Region (Illinois, Indiana, Iowa, Michigan, Minnesota, Ohio, Wisconsin), which is served by the Regional Educational Laboratory (REL Midwest) at American Institutes for Research. This memorandum was prepared by REL Midwest under a contract with the U.S. Department of Education’s Institute of Education Sciences (IES), Contract ED-IES-17-C-0007, administered by American Institutes for Research. Its content does not necessarily reflect the views or policies of IES or the U.S. Department of Education, nor does mention of trade names, commercial products, or organizations imply endorsement by the U.S. Government.