Demystifying ESSA tiers of evidence and the selection of evidence-based practices

By David English | Apr. 23, 2019

Author David English, a REL Southwest senior technical assistance consultant, explains how a recent REL Southwest workshop made great strides toward demystifying new provisions of the Every Student Succeeds Act (ESSA) regarding evidence-based practices (EBPs), making them approachable and actionable for local education service providers. English also showcases a new rubric from the Oklahoma State Department of Education (OSDE) that prioritizes the use of strong, moderate, or promising evidence in school improvement plans, a priority at the heart of ESSA's new provisions on using EBPs.

Districts and schools nationwide have been developing and implementing school improvement plans for comprehensive (CSI), targeted (TSI), and additional targeted (ATSI) support and improvement schools under the Every Student Succeeds Act (ESSA). ESSA mandates a number of new actions for low-performing schools, such as comprehensive needs assessments (for CSI schools) and the requirement to identify and address “resource inequities” (for CSI and ATSI schools). In particular, the requirement that all CSI, TSI, and ATSI schools include in their improvement plans at least one evidence-based practice (EBP) demonstrating strong, moderate, or promising evidence (ESSA Evidence Tiers 1, 2, and 3, respectively) has prompted an increased need for research-informed technical support. While many staff at the state and district levels have some experience using evidence-based practices in schools, ESSA's specific call to distinguish among levels of evidence rigor based on technical criteria has led many to seek practical guidance.

Oklahoma is no exception. The Oklahoma State Department of Education (OSDE) has gone so far as to implement a rubric, one that prioritizes the use of Tier 1 (strong) and Tier 2 (moderate) evidence, for districts and schools to use when selecting external school improvement providers. By providing incentives for the use of progressively more rigorous evidence, OSDE has amplified the importance of clearly understanding the distinctions between the tiers. Enter REL Southwest, which is partnering with OSDE to help align OSDE’s 9 Essential Elements of Effective Schools with ESSA’s evidence tier requirements. OSDE’s first priority has been to build the capacity of external providers, who work “on the ground” with schools and support their decisions around the use of EBPs. Building understanding of the ESSA tiers is seen as the first step in this process.

While this is a daunting challenge, it does not require that all states and districts become experts on the technical aspects of the tiers. The RELs are here to help. The Department of Education’s ESSA guidance documents make explicit the role of RELs in helping state and local education agencies (SEAs and LEAs) understand and apply the tiers of evidence. So how to boil down a largely technical topic for a nontechnical audience? The REL Southwest team confronted that challenge head-on by kicking off its support effort with a 4-hour workshop for Oklahoma-based external support providers this past November. The team’s first step was to develop a clear checklist of the criteria for meeting each evidence tier, which anchored the conversation throughout the workshop. Second, the evidence requirements were framed in the context of OSDE’s school improvement process to help providers understand that evidence level is just one of several factors in selecting EBPs. The presenters then built understanding of the tiers by scaffolding learning at three levels:

  • Providing a baseline level of technical information on key issues such as the meaning of statistical significance (i.e., “at least a 95% likelihood that the relationship between a practice and outcome is not random”) and the purpose of treatment and control groups in a study;
  • Identifying clues, flags, and visual cues that help evaluate whether an individual research study meets certain technical evidence criteria, such as looking for the asterisks that often denote p-values within results tables to determine statistical significance (see the brief sketch after this list); and finally
  • Demonstrating shortcuts in online research evaluation clearinghouses (e.g., What Works Clearinghouse) that directly indicate whether particular criteria for evidence tiers have been met.
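
To make the first two scaffolds concrete: the check behind those asterisks boils down to comparing a study's reported p-values against the conventional .05 threshold. The short Python sketch below is purely illustrative, not part of the workshop or OSDE materials, and uses made-up outcome names and values; it simply flags which reported results would be labeled statistically significant.

    # Hypothetical example: flag which reported results meet the conventional
    # .05 threshold for statistical significance -- the same check that the
    # asterisks in a study's results table usually encode.
    reported_p_values = {
        "reading achievement": 0.03,   # often printed as 0.03* in a results table
        "math achievement": 0.21,      # no asterisk: not statistically significant
        "attendance": 0.002,           # often printed as 0.002** (p < .01)
    }
    ALPHA = 0.05  # conventional significance threshold
    for outcome, p in reported_p_values.items():
        verdict = "statistically significant" if p < ALPHA else "not statistically significant"
        print(f"{outcome}: p = {p} -> {verdict}")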

The results of the workshop were promising. In a concluding discussion, the external providers indicated the workshop had helped demystify a number of otherwise intimidating technical issues and, at a minimum, had provided the tools to begin using online resources to locate evidence-based practices at each of the three tier levels. Overall, they felt better prepared to support schools in the process.

For the REL Southwest team, it was affirming that we could convey a relatively technical subject to a nontechnical audience with some success—and without a revolt! By providing scaffolding at multiple entry points (underlying principles, applied clues, and good old-fashioned shortcuts), we gave learners of different levels something to latch onto.

REL Southwest has already started to build on the evidence tier work, adapting the training on evidence tiers for delivery to districts with Oklahoma CSI and TSI schools in summer 2019. Going forward, the challenge for OSDE’s providers will be to use EBPs that not only meet evidence requirements but are also the right fit for schools in other ways, such as having ample infrastructure and staff support and adequate funding. Balancing evidence with these other decision drivers will be a key factor in ensuring that EBPs are implemented in a meaningful way and not just “copied and pasted” from evidence clearinghouses. REL Southwest continues to partner with OSDE to ensure this is the case!