This blog post describes how REL Appalachia staff developed a data analysis plan and two Excel tools to help Kentucky Department of Education staff streamline their document review process. The tools were designed to support analysis of Multi-tiered System of Supports (MTSS) implementation across the state and to inform technical assistance for districts. The first tool supported the selection of a random sample of documents, and the second provided a template for recording and analyzing information from the sampled documents.
Too much content, too little time
State education agency leaders rely on documentation from districts to understand program implementation, identify and address issues, and support district leaders to improve local implementation. But the more comprehensive the documentation, the more burdensome it is to review. Making sense of a large volume of technical content is challenging for state education agency staff with limited time for in-depth document review and analysis.
The Kentucky Department of Education (KDE) faced this challenge upon receiving over 170 district-specific documents to review. Kentucky legislation requires KDE to collect evidence of implementation of a multi-tiered system of supports (MTSS) from each district annually. In 2025, districts submitted either an MTSS handbook, results of a self-assessment of MTSS implementation, or another type of evidence (for example, MTSS flowcharts, intervention lists, or meeting agendas).
KDE staff needed the submitted information to understand MTSS implementation across the state and to inform technical assistance for districts. But they did not have time to analyze that many documents.
Using random sampling and a template with built-in calculations and filters for summarizing results can simplify complex document reviews.
What we did
REL Appalachia (REL AP) staff met with KDE leaders to learn about the goals of the document review. The REL AP team then developed an Ask an Expert response that included a data analysis plan and two Excel tools to help streamline the document review: (1) a tool for selecting a random sample of documents, and (2) a template for recording and analyzing information in the sampled documents.
Identified a representative sample of documents
When reviewing all documents is infeasible, a carefully selected sample can still inform decisions. KDE needed the reviewed documents to reflect variation in MTSS implementation, regional service provider, district size, and document type. Random sampling is one way to ensure that, on average, the reviewed documents are representative of all documents received. But any single random sample could still overrepresent certain types of documents. To make sure the reviewed documents reflected variation across the state, REL AP sorted the documents by these characteristics, selected a random starting point in the sorted list, and stepped through the list in fixed increments, selecting each district the step landed on. This approach, called implicit stratified sampling, helps ensure the sample mirrors the mix of those attributes in the full set of documents.
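Although the REL AP tool was built in Excel, the sampling logic can be sketched in a few lines of Python. Everything below is illustrative: the district attributes, sample size, and random seed are made up, not taken from the KDE data.

```python
import random

# Hypothetical document list: 170 district documents with illustrative
# stratifying attributes (region and document type codes).
documents = [
    {"district": f"District {i}", "region": i % 4, "doc_type": i % 3}
    for i in range(170)
]

def implicit_stratified_sample(docs, sample_size, seed=None):
    """Sort by stratifying attributes, then step through the sorted
    list at a fixed interval from a random starting point."""
    rng = random.Random(seed)
    # Sorting groups similar documents together (the "implicit" strata).
    ordered = sorted(docs, key=lambda d: (d["region"], d["doc_type"]))
    interval = len(ordered) / sample_size
    start = rng.uniform(0, interval)  # random starting point
    return [ordered[int(start + i * interval)] for i in range(sample_size)]

sample = implicit_stratified_sample(documents, sample_size=30, seed=42)
print(len(sample))  # 30 documents, spread across regions and doc types
```

Because the list is sorted before stepping through it, every region and document type contributes to the sample roughly in proportion to its share of all documents.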
KDE staff were uncertain how many documents they would be able to review, so REL AP’s approach also incorporated flexibility in determining the sample size. The REL AP team provided a randomly sorted list of all districts, marking natural stopping points after every nine or ten documents on the list, points at which all types of documents had an equal chance of being selected. KDE staff could then review the districts’ documents in the order listed and, when they reached capacity, could stop reviewing at the next stopping point. Larger samples provide greater confidence in the findings, and the tool reported the margin of error achieved at each stopping point to help KDE decide how many documents to review.
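The margin of error at each stopping point can be approximated with the standard formula for an estimated proportion from a simple random sample, with a finite population correction; the exact formula in the REL AP tool may differ, and the counts below are illustrative.

```python
import math

def margin_of_error(n, N, p=0.5, z=1.96):
    """Margin of error for an estimated proportion from a simple random
    sample of n documents out of N total, at 95% confidence (z = 1.96).
    Uses p = 0.5 (the most conservative assumption) and a finite
    population correction, since the population of documents is small."""
    fpc = math.sqrt((N - n) / (N - 1))
    return z * math.sqrt(p * (1 - p) / n) * fpc

# Hypothetical stopping points after every ten documents, out of 170 total.
for n in range(10, 61, 10):
    print(f"n={n}: ±{margin_of_error(n, 170):.1%}")
```

The loop makes the tradeoff concrete: each additional batch of ten documents shrinks the margin of error, so staff can see what precision they give up by stopping early.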
Used a structured template to organize and analyze documents
KDE staff wanted support organizing the information in the sampled documents to make the findings actionable. To ensure alignment with the state’s MTSS framework, REL AP and KDE agreed to organize the review around six essential MTSS elements defined in the Kentucky Multi-Tiered System of Supports Implementation Guide: (1) collaborative problem-solving teams; (2) data-based decisionmaking; (3) tiered delivery system with a continuum of supports; (4) evidence-based instruction, intervention, and support; (5) equitable access and opportunity; and (6) family, school, and community partnerships. Each element is broken down into features. For example, features of the data-based decisionmaking element include using evidence-based diagnostic tools and progress-monitoring tools. REL AP designed a template organized around these features, in which reviewers record whether each document shows evidence of each feature. The template automatically calculates a summary score for each element and an overall MTSS implementation score.
Figure: Excerpt from the review template. Columns represent features of MTSS, color-coded by element for ease of recording and interpretation. The bold line under District H marks a stopping point, as described above. A rating of 1 means there is evidence of the feature; 0 means there is no evidence of the feature.
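The template’s built-in calculations boil down to averaging binary ratings. Here is a minimal sketch, assuming each element score is the share of its features with evidence present and the overall score is the mean of the element scores; the feature names and ratings are made up for illustration, and the actual Excel formulas may weight things differently.

```python
# Hypothetical ratings for one district's document: 1 = evidence of the
# feature is present, 0 = no evidence. The elements follow the Kentucky
# MTSS guide; the specific features shown are illustrative.
ratings = {
    "Data-based decisionmaking": {
        "evidence-based diagnostic tools": 1,
        "progress-monitoring tools": 0,
    },
    "Tiered delivery system": {
        "continuum of supports described": 1,
        "tier entry/exit criteria": 1,
    },
}

def element_scores(doc_ratings):
    """Share of features with evidence present, per element."""
    return {
        element: sum(feats.values()) / len(feats)
        for element, feats in doc_ratings.items()
    }

def overall_score(doc_ratings):
    """Overall MTSS implementation score: mean of the element scores."""
    scores = element_scores(doc_ratings)
    return sum(scores.values()) / len(scores)

print(element_scores(ratings))  # {'Data-based decisionmaking': 0.5, 'Tiered delivery system': 1.0}
print(overall_score(ratings))   # 0.75
```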
The template also enables KDE staff to examine different features of MTSS implementation across the state. It includes built-in calculations and filtering so staff can easily examine how results vary by document type, region, or grade band, for example. This functionality allows KDE to identify strengths and gaps in MTSS implementation and tailor technical assistance and other supports for districts.
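The filtering described above amounts to grouping review results by an attribute and summarizing the scores. A sketch with made-up results, grouping by document type:

```python
from collections import defaultdict

# Hypothetical review results: one record per sampled document, with the
# overall MTSS score and an attribute a reviewer might filter by.
results = [
    {"district": "A", "doc_type": "handbook",        "score": 0.83},
    {"district": "B", "doc_type": "self-assessment", "score": 0.67},
    {"district": "C", "doc_type": "handbook",        "score": 0.92},
    {"district": "D", "doc_type": "other",           "score": 0.50},
    {"district": "E", "doc_type": "self-assessment", "score": 0.75},
    {"district": "F", "doc_type": "handbook",        "score": 0.58},
]

# Group scores by document type and average them -- the kind of summary
# the template's filters and built-in calculations produce.
scores_by_type = defaultdict(list)
for row in results:
    scores_by_type[row["doc_type"]].append(row["score"])

averages = {t: round(sum(s) / len(s), 2) for t, s in scores_by_type.items()}
print(averages)  # {'handbook': 0.78, 'self-assessment': 0.71, 'other': 0.5}
```

The same grouping could be run on region or grade band instead of document type, which is how a summary like this points toward where technical assistance is most needed.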
Worked as a team to ensure confidence in the findings
The template offers a streamlined approach to summarizing evidence across documents, but document reviews can be subjective: two individuals can review the same document and reach different conclusions. And before the template was used, it was unclear whether the MTSS features were defined clearly enough or how well the rating scale would work.
To address these concerns, the REL AP team recommended that two KDE staff members independently code the same subsample of documents and then identify areas of agreement and disagreement. Double-coding increases confidence in the credibility of the findings by addressing potential individual biases and testing the clarity of the template. Where coders’ ratings differ, the coders should discuss and come to an agreement about the rating. This process could lead to modifying the template (e.g., clarifying wording) and documenting decision rules for reviewing the whole sample (e.g., if-then statements).
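Checking coder agreement can start with simple percent agreement. The ratings below are made up for illustration, and a fuller analysis might also compute a chance-corrected statistic such as Cohen’s kappa.

```python
# Hypothetical ratings from two KDE coders on the same five documents,
# for one MTSS feature (1 = evidence present, 0 = no evidence).
coder_a = [1, 0, 1, 1, 0]
coder_b = [1, 0, 0, 1, 0]

def percent_agreement(a, b):
    """Share of documents on which the two coders gave the same rating.
    A simple first check; percent agreement does not adjust for ratings
    that would match by chance alone."""
    matches = sum(1 for x, y in zip(a, b) if x == y)
    return matches / len(a)

print(percent_agreement(coder_a, coder_b))  # 0.8 -- one disagreement to discuss
```

Each disagreement flags a document (and a feature definition) for the coders to discuss, which is exactly the conversation that leads to clearer wording and documented decision rules.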
From overwhelmed to informed: What makes it possible
Carefully designed processes can help state agencies gain useful insights from large volumes of documents. Starting with clear questions and grounding the review in an established framework help ensure that findings are meaningful and actionable. Thoughtful sampling approaches, such as implicit stratified random sampling, help ensure that the documents reviewed reflect the variety of districts and document types across the state. Through partnerships like this one with KDE, REL AP helps state education agencies design practical, rigorous approaches that turn complex tasks into useful information.
Resources
Jones, S. (2025). Mastering the art of a document review: Turning data into insights. Eval Academy. https://www.evalacademy.com/articles/the-art-of-a-document-review