
New report shares tools and outcomes from an evaluation of a networked improvement community of alternative learning centers

Midwest | March 15, 2021

Networked improvement communities (NICs) are collaborative partnerships in which members use continuous improvement methods and quick research cycles to identify a common problem of practice and test and refine solutions in different real-world contexts (Bryk et al., 2015). When the process is well implemented, NICs can accelerate members’ capacity to learn from one another and use data to inform decisions, which in turn can lead to improvements in practice (Bryk et al., 2015; Russell et al., 2017).

Although the use of NICs in education is on the rise, research evidence and guidance on their implementation and outcomes in education settings remain limited. A new report from the Regional Educational Laboratory (REL) Midwest adds to the evidence base and provides tools educators can adapt to evaluate and improve the implementation of NICs in education settings. The report describes the Midwest Career Readiness Research Alliance’s formative evaluation of the Minnesota Alternative Learning Center NIC. REL Midwest and the Minnesota Department of Education, in partnership with five Minnesota alternative learning centers, formed this NIC to address low student graduation rates by improving the centers’ use of competency-based credit recovery practices. The report includes information on how educators can adapt the evaluation approach and tools to assess other NICs.

>> Read and download the full report and related tools.

Evaluating the implementation of a networked improvement community

To support the Minnesota Alternative Learning Center NIC, REL Midwest researchers developed a framework and tools to evaluate the NIC’s implementation. The team wanted to understand the extent to which NIC participants:

  • Were engaged in the NIC and how relevant and useful they perceived NIC activities to be.
  • Gained the essential knowledge and skills for engaging in continuous improvement through participation in the NIC.
  • Completed continuous improvement milestones, such as identifying a problem of practice, conducting a root-cause analysis, and developing a driver diagram to understand what factors could lead to improved outcomes.

In addition to describing this evaluation and the findings, the report illustrates how other educators can adapt the process and the suite of included tools to collect, analyze, and interpret data on a NIC’s implementation. The results can provide formative feedback on whether a NIC is operating as intended. To use the NIC evaluation tools, staff need basic skills in quantitative and qualitative analysis.
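
As an illustration of the kind of basic quantitative analysis involved, the short Python sketch below averages hypothetical postmeeting ratings of relevance and usefulness by meeting date. The survey items, rating scale, sites, and values are invented for this example and are not drawn from the report.

    from collections import defaultdict
    from statistics import mean

    # Hypothetical postmeeting survey responses on a 1-5 agreement scale.
    # Field names and values are invented for illustration only.
    responses = [
        {"meeting": "2019-10-01", "site": "Center A", "relevance": 4, "usefulness": 5},
        {"meeting": "2019-10-01", "site": "Center B", "relevance": 3, "usefulness": 4},
        {"meeting": "2019-12-03", "site": "Center A", "relevance": 5, "usefulness": 5},
        {"meeting": "2019-12-03", "site": "Center B", "relevance": 4, "usefulness": 4},
    ]

    # Group ratings by meeting date so changes over time are visible.
    by_meeting = defaultdict(lambda: {"relevance": [], "usefulness": []})
    for response in responses:
        for item in ("relevance", "usefulness"):
            by_meeting[response["meeting"]][item].append(response[item])

    for meeting, ratings in sorted(by_meeting.items()):
        print(
            f"{meeting}: mean relevance = {mean(ratings['relevance']):.1f}, "
            f"mean usefulness = {mean(ratings['usefulness']):.1f}"
        )

Grouping by site rather than by meeting date would support the same kind of comparison across participating centers.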

NIC evaluation tools

Explore the table on page 7 and appendices of the report to find the following tools:

  • Evaluation planning checklist: This checklist outlines the steps in planning a successful NIC evaluation. For most data sources, an additional step is to determine a protocol for the secure transfer of data to the evaluation team.
  • Attendance logs: Evaluators can use these logs to track participant attendance at NIC meetings (a minimal tallying sketch appears after this list).
  • Postmeeting survey: Using this survey, evaluators can analyze data over time to assess changes in the relevance and usefulness of NIC meetings and to examine variation among NIC groups or sites.
  • Post-Plan-Do-Study-Act cycle survey: NICs often conduct Plan-Do-Study-Act cycles to test and refine solutions to problems of practice. After each cycle, evaluators can administer this brief survey, on which respondents rate their participation in continuous improvement milestones.
  • Continuous improvement artifacts: Evaluators and NIC participants can use this template to create a detailed plan for implementing the NIC’s change idea.
  • Event summaries: Using event summaries, evaluators can provide additional context to support the information provided by other evaluation tools. Event summaries may include detailed notes on what was accomplished at NIC meetings.
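
Several of these tools yield simple tabular data. As one example, the Python sketch below tallies attendance rates from an attendance log like the one described above; the participant names, meeting dates, and log format are assumptions made for illustration, not taken from the report’s templates.

    from collections import Counter

    # Hypothetical attendance log: one (participant, meeting date) row per meeting attended.
    # Names and dates are invented for illustration only.
    attendance_log = [
        ("Participant 1", "2019-10-01"),
        ("Participant 1", "2019-12-03"),
        ("Participant 2", "2019-10-01"),
        ("Participant 3", "2019-12-03"),
    ]

    # Every meeting the NIC held, whether or not a given participant attended.
    meetings_held = {"2019-10-01", "2019-12-03"}

    # Tally meetings attended per participant and report an attendance rate.
    meetings_attended = Counter(name for name, _ in attendance_log)
    for name in sorted(meetings_attended):
        rate = meetings_attended[name] / len(meetings_held)
        print(f"{name}: {meetings_attended[name]} of {len(meetings_held)} meetings ({rate:.0%})")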

How did the evaluation tools help researchers learn about NIC implementation?

Using the NIC evaluation tools, REL Midwest researchers carried out the following strategies to assess whether implementation of the Minnesota Alternative Learning Center NIC was successful. Educators can adapt the tools and strategies to evaluate their own NICs.

  • Measured engagement by observing NIC participation. By tracking attendance at meetings and the completion of continuous improvement milestones, evaluators found that participant engagement facilitated the effective use of continuous improvement practices.
  • Collected and examined NIC participant feedback related to the relevance and usefulness of NIC activities. Evaluators used the postmeeting survey to understand how participants’ perceptions of the NIC changed over time.
  • Collected and examined feedback from NIC participants related to their knowledge and skills for engaging in continuous improvement. Evaluators also used the postmeeting survey to determine the extent to which participants agreed that their participation in the NIC increased their understanding of the process and activities involved.
  • Examined NIC artifacts as indicators of how continuous improvement processes were implemented. The Minnesota Alternative Learning Center NIC generated artifacts during the continuous improvement process that evaluators used to track the completion of continuous improvement milestones and the resulting progress toward improving student graduation rates at alternative learning centers.
  • Measured whether improvement efforts used a network approach. The evaluation team used the post-Plan-Do-Study-Act survey and event summaries from coaching sessions to assess the level of coordination in identifying and selecting change ideas for implementation.

The evaluation report includes several study limitations to consider when exploring the findings. Read the full report for more information about these limitations.

Related resources

To learn more about the work of the Minnesota Alternative Learning Center NIC, see REL Midwest’s video on how participants have used data to enact change and a blog post on how participants have used continuous improvement to strengthen credit recovery and graduation rates.

To learn more about the work of the Midwest Career Readiness Research Alliance, browse the materials used in alliance training and coaching sessions related to credit recovery, including an infographic that explores the state of credit recovery in Minnesota public schools.

Read this recent resource roundup on continuous improvement cycles to learn more about how educators can test and refine their classroom strategies and implement NICs. REL Midwest also facilitated a one-day training to build the capacity of the Minnesota Department of Education and Minnesota districts and schools in understanding NIC principles. View the event agenda and slide deck.

References

Bryk, A. S., Gomez, L. M., LeMahieu, P. G., & Grunow, A. (2015). Learning to improve: How America’s schools can get better at getting better. Harvard Education Press. https://eric.ed.gov/?id=ED568744

Engelbart, D. C. (1992, August). Toward high-performance organizations: A strategic role for groupware. Paper presented at the GroupWare 1992 Conference, San Jose, CA, United States. https://www.dougengelbart.org/content/view/116/

Russell, J. L., Bryk, A. S., Dolle, J. R., Gomez, L. M., LeMahieu, P. G., & Grunow, A. (2017). A framework for the initiation of networked improvement communities. Teachers College Record, 119(5), 1–36. https://eric.ed.gov/?id=EJ1144314

Author(s)

Maggi Ibis
