All SEAs are required to evaluate the progress of LEAs and SIG-awarded schools annually to determine whether SIG funding should continue. SEAs may, but are not required to, monitor LEAs more frequently than once a year. State officials reported that they intend to monitor weekly, monthly, quarterly, twice a year, or annually (see Exhibit 7).
SEAs reported a variety of strategies to monitor LEAs' progress in implementing SIG intervention models. Strategies include in-person site visits, staff designated to specific LEAs or schools, and online tools (see Exhibit 8).
Site visits. Thirty-nine states plan to conduct site visits to monitor progress toward SIG goals at some point during SIG implementation. For example, the Arizona SEA requires school improvement specialists from the SEA to conduct monthly site visits to SIG-funded LEAs. These specialists must use an implementation checklist based on LEA priorities identified in the online school turnaround implementation plans. In California, representatives from the state and regional consortia intend to conduct site visits to a "selected representative sample" of LEAs and schools to validate information submitted by LEAs and to gather additional information through interviews and observations.
Designated staff. Thirty states intend to designate specific staff from the SEA, regional offices, LEAs, or external providers as responsible for monitoring the progress of SIG schools. For example, in South Dakota, a state Title I staff member assigned to each LEA will be responsible for providing monthly monitoring reports to the SEA and conducting regular conference calls and site visits. In Indiana, the SEA plans to assign each school to an external provider who will be responsible for monitoring and reporting progress.
Check-in meetings. Sixteen states plan to "check in" with LEAs to obtain progress reports and identify challenges prior to the annual renewal of SIG funds. These check-in meetings are less formal than site visits, involve fewer staff, and may not include face-to-face meetings. For example, in Idaho, personnel from the Student Achievement and School Improvement Division of the SEA will conduct conference calls and in-person meetings with key LEA and school leaders to monitor progress. In Maine, the SEA requires that a Title I school improvement consultant provide a variety of supports, including quarterly check-in meetings to identify LEA and school needs and monitor implementation.
Online/electronic tools. Sixteen states plan to use online tools and data systems to monitor progress. For example, in Virginia, the SEA will monitor progress and provide feedback to LEAs and SIG schools through the Indistar online school improvement system developed jointly with the Center on Innovation and Improvement. In Oklahoma, the LEAs and SIG schools will use the WISE online planning and coaching tool to monitor progress on a quarterly basis.
ED's Guidance on School Improvement Grants delineates a combination of achievement and leading indicators for Tier I and Tier II SIG schools. These indicators are clustered in the following categories: 1) school data, 2) student outcomes and academic progress, 3) student connection and school climate, and 4) talent (see Exhibit 9).
In addition to these indicators, an SEA may identify additional measures to evaluate a SIG school's progress. All but seven states reported additional monitoring measures for evaluating progress and determining whether SIG funding should continue. States that added monitoring measures focused them on assessing implementation progress rather than academic outcomes. For example, the Ohio SEA developed an electronic implementation monitoring tool, and the Florida SEA established a "Performance Expectations for Intervention Model" flowchart that guides LEAs and schools in establishing annual performance goals. Some SEAs that added measures developed them based on specific goals in SIG schools' improvement or turnaround plans (15 states) or on model-specific implementation goals developed by the state or LEA (12 states and the District of Columbia). Two states (Florida and Louisiana) plan to examine the distribution of effective teachers in an LEA using value-added teacher evaluation models.