
REL Midwest Ask A REL Response

October 2017

Question:

What resources are available on how to implement cross-functional teams to support district improvement and to align SEA and district programs and strategies?



Response:

Following an established Regional Educational Laboratory (REL) Midwest protocol, we conducted a search for research reports and descriptive studies on how state education agency (SEA) cross-functional teams can support district improvement and align SEA and school district programs and strategies. For details on the databases and sources, keywords, and selection criteria used to create this response, please see the Methods section at the end of this memo.

Below, we share a sampling of the publicly accessible resources on this topic. The search conducted is not comprehensive; other relevant references and resources may exist. We have not evaluated the quality of the references and resources provided in this response; we offer this list for your information only.

Research References

French, D., Miles, K. H., & Nathan, L. (2014). The path forward: School autonomy and its implications for the future of Boston’s Public Schools. Boston, MA: Boston Foundation. Retrieved from https://eric.ed.gov/?id=ED560078

From the ERIC abstract: “This study explores the question of how Boston Public Schools (BPS) can strengthen and support autonomy and accountability across its portfolio to promote innovation and expand access to equity and high performance. Some of the specific questions guiding this work are: (1) Should all schools within BPS operate within autonomous structures? (2) Is autonomy a necessary, but not sufficient, condition for success? (3) How and under what conditions should autonomy be granted? (4) Should autonomy be withdrawn based on certain conditions? (5) In what areas should autonomy be granted (governance, curriculum/assessment, scheduling calendar, staffing, budget, professional development)? This is a study on the role and impact of autonomy for school leaders and their teams across the system. Many of the highest-performing schools are traditional schools. Many of the highest performing schools are autonomous schools. The goal is to outline a vision for if, how, and when school autonomy can be used as a tool to help eliminate achievement gaps and improve outcomes for all students. The obligation is to ensure that BPS determines and then provides the conditions for success in all schools so that every student in Boston achieves to his or her highest potential. Effective autonomy must also be paired with accountability. BPS then must establish clearly- defined roles and boundaries for schools and central offices alike. From this research BPS will develop recommendations to help create the conditions for success in all of the District’s schools to serve all students and families well in the decades to come. 
Recommendations offered in this report include: (1) Establish the district’s vision as a ‘system of schools’ with consistent high expectations, support and accountability for performance; (2) Extend maximum flexibility to all district schools, and encourage any school that is ready and has capacity to pursue adopting an autonomous schools model; (3) Decentralize non-core central services to the maximum extent feasible, and transition to a purchased services model for the remaining non-core central services; (4) Create a cabinet-level Office of Innovation, reporting to the Superintendent, to incubate and oversee development of new school designs and conversions to autonomous school models, and scale currently successful autonomous school designs based on community needs and demands; (5) Cultivate and support leaders and leadership teams to effectively use their flexibilities to make wise resource decisions that enable school and student improvement; (6) Further construct and implement a school accountability model for all district schools that emphasizes effective practice and student success, with clear supports and consequences based on school performance; and (7) Prioritize candidates for the Superintendent position who are committed to sustaining a system of high-performing schools that balances autonomy and accountability, and who bring a track record of uniting people in a culture that values collaboration, leadership and performance.
If Boston acts on the recommendations in this report, the belief is that: (1) Schools will be empowered to more strategically organize resources to drive student learning; (2) The system will embody a diversity of programs that reflect the diversity of Boston’s communities; (3) The system will be better able to develop, evaluate and scale innovative practices; (4) Teachers will feel more ownership over instruction, be empowered via shared decision-making and grow as leaders in their schools; and (5) Leadership capacity will increase through formal and informal professional development.”

Goertz, M. E., Barnes, C., Massell, D., Fink, R., & Francis, A. T. (2013). State education agencies’ acquisition and use of research knowledge in school improvement strategies. (CPRE Research Report #RR-77). Philadelphia, PA: Consortium for Policy Research in Education. Retrieved from https://eric.ed.gov/?id=ED547655

From the ERIC abstract: “Over the last 20 years, state education agencies (SEAs) have been given considerably more responsibilities for directing and guiding the improvement of low-performing schools. At the same time, federal policies strongly pressed SEAs to use research to design these supports. Very few studies have explored the SEA as an organization, or its role in accessing and using research. Likewise, few, if any, have studied the role of social networks in the organization and flow of information in SEAs. This exploratory study was designed to fill those gaps by examining where and how a purposive sample of three SEAs searched for, incorporated, and used research and other types of knowledge to design, implement, and refine state school improvement policies, programs and practices.”

Hale, S., Dunn, L., Filby, N., Rice, J., & Van Houten, L. (2017). Evidence-based improvement: A guide for states to strengthen their frameworks and supports aligned to the evidence requirements of ESSA. San Francisco, CA: WestEd. Retrieved from https://eric.ed.gov/?id=ED573213

From the ERIC abstract: “One of the broad intents of the Elementary and Secondary Education Act (ESEA) as amended by the Every Student Succeeds Act (ESSA) is to encourage evidence-based decision-making as a way of doing business. Nonregulatory guidance issued in September 2016 by the U.S. Department of Education (ED) clarifies and expands on both the nature of evidence-based improvement and the levels of evidence that are specified in the law. This guide builds on that ED guidance and provides an initial set of tools to help states and districts understand and plan for implementing evidence-based improvement strategies. This guide recognizes school and district improvement as a continuous, systemic, and cyclical process, and emphasizes the use of evidence in decision-making throughout continuous improvement. In other words, the guide is not aimed at isolated decisions; rather, it is meant to support evidence-based decision-making that is nested within a larger improvement process. The primary audience for this guide is state education agency (SEA) staff who are responsible for understanding and implementing the evidence-based provisions of ESSA. The purpose of the guide is to build capacity of SEAs and their intermediaries to support LEAs in understanding the evidence-related requirements of ESSA and, consequently, selecting and implementing interventions that are evidence-based and that have strong potential to improve student outcomes. 
Specifically, the guide is intended to: (1) increase readers’ understanding of the expectations and opportunities for evidence-based school and district improvement in the context of ESSA; (2) encourage a broad understanding of the elements of evidence-based decision-making, including how needs, context, implementation strategies, desired outcomes, and sustainability considerations inform choices of evidence-based interventions, and how formative and summative evaluation are integral to an evidence-based improvement cycle; and (3) offer guiding information and a starter set of six tools to support this work, with an emphasis on the process of selecting evidence-based interventions. The materials presented in the guide offer SEAs and their LEAs opportunities to conduct a review of their approach to school and district improvement, including selection of evidence-based interventions, and to develop action steps for strengthening the guidance and supports that SEAs offer to their LEAs and that LEAs offer to their schools. In addition to Section 1: Overview, the following four sections provide further background, tools, and additional resources: (1) Section 2 includes further discussion of the context and requirements of ESSA in relation to evidence-based decision-making, and describes a framework for a continuous improvement process grounded in evidence-based decision-making; (2) Section 3 gives suggestions on how to use the tools in the guide, including information about facilitation strategies and options for modifying the tools to fit state and local contexts. This section also emphasizes the importance of preparing for using the tools; (3) Section 4 provides six tools, each designed to encourage focused conversations and support cross-agency collaboration. The first two tools guide examination of state and district improvement and decision-making frameworks. 
The second two tools help SEAs and LEAs explore strategies for providing guidance on selecting evidence-based interventions. The last two tools support selection of evidence-based interventions; and (4) Section 5 offers a list of additional resources to further the conversation, and enhance the work, initiated by this guide. This section includes examples of publicly available tools for evidence-based improvement, and sources for research and information on evidence-based interventions. This guide was specifically designed to be a starting point for making evidence-based decisions, and is not intended to be comprehensive. It contains initial information and tools to guide conversations and foster deeper thinking around evidence-based decision-making, especially within an improvement process. The following six tools are provided: (1) SEA Inventory of Current Practice guides a state education agency (SEA) to take stock of its current continuous improvement practice, especially around evidence-based decision-making; (2) LEA Inventory of Current Practice is similar to the above tool, but designed for local education agencies (LEAs); (3) SEA Guidance for Evidence-Based Interventions helps a state to reflect on how it will provide guidance to LEAs on evidence-based interventions; (4) LEA Guidance for Evidence-Based Interventions is similar to the tool above, but designed for LEAs; (5) Intervention Evidence Review guides the review and comparison of interventions that target an identified need; and (6) Comparing Evidence-Based Interventions guides the determination about the degree to which a particular intervention aligns with a given context.”

Jacobson, D. (2016). Building state P-3 systems: Learning from leading states. (Policy Report). New Brunswick, NJ: Center on Enhancing Early Learning Outcomes. Retrieved from https://eric.ed.gov/?id=ED570434

From the ERIC abstract: “The first eight years of life, beginning before birth and continuing through third grade, are a critical developmental period that sets the stage for future success. Research over the past 15 years has demonstrated the importance of high-quality care and education throughout the prenatal-through-third-grade (P-3) continuum, including prenatal and infant and toddler care, preschool education, and early elementary education. The programs and services provided to young children and their families during these early years are typically highly fragmented in most communities in the United States, the result of a multiplicity of funding streams and the wide variety of early education settings, services, and professional roles that characterize the mixed-delivery system in the United States. Communities, states, and the federal government are all working to improve quality and coherence across the P-3 continuum. This report provides three case studies to address a central question: How can states support P-3 system building at both state and local levels? The three case-study states—Massachusetts, Oregon, and Pennsylvania—were chosen based on their experience implementing P-3 state policies and developing significant grant programs to fund regional and local P-3 partnerships. The case studies profiled in this report demonstrate the crucial roles State Education Agencies (SEAs) can play in supporting P-3 system building—both through state policy as well as by supporting local and regional early learning partnerships. Carrying out this work requires that SEAs align their work internally across divisions and units while building the capacity of communities to design and implement quality improvement and alignment activities.”

McGuinn, P. (2015). State education agencies and the implementation of new teacher evaluation systems. (CPRE Policy Brief. PB #15-2). Philadelphia, PA: Consortium for Policy Research in Education. Retrieved from https://eric.ed.gov/?id=ED564261

From the ERIC abstract: “It has been three years since Race to the Top grant-winning states piloted new teacher evaluation systems and many of them have made considerable progress, yet according to media coverage and a Government Accountability Office (GAO) report published in April 2015, struggles remain and most grantees have asked to extend the timetables for completing this work. Given the enormous importance and complexity of these reforms—and the fact that states vary widely in the timing, approach, and success of their implementation work—this is an excellent opportunity to assess the progress that has been made and identify where challenges persist. It is imperative that states learn from one another during this implementation stage, and this brief serves to facilitate the discussion by highlighting what is and is not working in the Race to the Top states.”

Proger, A. R., Bhatt, M. P., Cirks, V., & Gurke, D. (2017). Establishing and sustaining networked improvement communities: Lessons from Michigan and Minnesota (REL 2017-264). Washington, DC: U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance, Regional Educational Laboratory Midwest. Retrieved from https://eric.ed.gov/?id=ED573419

From the ERIC abstract: “There is growing interest in the ability of improvement science—the systematic study of improvement strategies to identify promising practices for addressing issues in complex systems (Improvement Science Research Network, 2016) — to spur innovation and address complex problems. In education this methodology is often implemented through collaborative research partnerships in which researchers and practitioners work together to systematically test and refine theories of change in real-world settings. A networked improvement community is a collaborative research partnership that uses the principles of improvement science within networks of organizations to learn from varied implementation of new ideas across contexts. While the central work of a networked improvement community is to identify a specific and actionable problem and collectively address it through an iterative process of designing, implementing, testing, and redesigning promising new practices, the learning from these iterative cycles can be brought back and applied to the local contexts of the networked improvement community participants (such as classrooms, districts, and states), potentially affecting education practices more widely. Although there is practical guidance for how networked improvement communities should structure this work, few published accounts describe the process of forming a networked improvement community. This report describes the process of forming networked improvement communities in Michigan and Minnesota after state education agency leaders requested assistance from Regional Educational Laboratory (REL) Midwest to support state-led efforts to use improvement science to raise student achievement and narrow achievement gaps in schools with the widest achievement gaps (focus schools). 
The resulting collaborations led to the establishment of two networked improvement communities during the 2015/16 school year, one in Michigan and one in Minnesota, focused on improvement in schools identified as needing support under their accountability systems. The REL Midwest project team used guidance from the literature and other improvement science efforts (for example, Bryk, Gomez, Grunow, & LeMahieu, 2015) to direct its activities. Each networked improvement community has a slightly different history and emphasis. The Michigan Focus Networked Improvement Community works across five focus schools—schools with the largest achievement gaps—in two districts to address disparities in student achievement within schools. The two districts are each part of an intermediate school district, a regional education service agency that provides consolidated support services to districts in an assigned service area and thereby plays an important role in providing professional development and supporting pilot programs in districts. Participants in the Michigan Focus Networked Improvement Community include state education agency representatives, intermediate school district administrators, district representatives, and focus school principals. The Minnesota Statewide System of Support Networked Improvement Community seeks to improve state supports to six Regional Centers of Excellence that serve focus schools. In Minnesota, the Cross-agency Implementation Team oversees the implementation of the statewide system of support. Its members include leadership and content specialists from both the Minnesota Department of Education and the Regional Centers of Excellence; they also serve as participants in the networked improvement community. 
The goal of establishing both networked improvement communities was twofold: to expose the state education agencies to a process that could be used to scale initiatives and to engage agencies at a level that would leave them able to use the process with other initiatives. Networked improvement community participants are now focusing on sustainability, using what they learned in the first year as the foundation for maintaining key processes and functions. This report aims to guide other researchers, state education agency leaders, and district leaders as they establish networked improvement communities in different contexts.”

Redding, S., & Nafziger, D. (2013). Functional coherence in the state education agency: A structure for performance management. Solutions: Building State Capacity and Productivity Center at Edvance Research, 4. Retrieved from https://eric.ed.gov/?id=ED559703

From the ERIC abstract: “The purpose of the state education agency (SEA) is to focus the entire education system on helping students become capable in college and career in an increasingly complex world. One of the most vexing problems facing SEAs today is how to meet increasing demands for performance while adjusting to significant resource reductions. Meeting that demand is complicated, because SEAs sit at the center of a sprawling array of institutions and organizations that each have a role in educating students. Meeting the demand requires that an SEA not only become more effective and productive in its own work, but that it will stimulate the same in local education agencies and other organizations that provide education services. This paper addresses one aspect of that challenge—creating an organizational structure that fosters a coherent and powerful system that continuously improves outcomes. The discussion of organizational function and structure also provides a framework for establishing a performance management system for SEAs.”

Shah, R. (2011). From compliance to service: Evolving the state role to support district data efforts to improve student achievement. Washington, DC: Data Quality Campaign. Retrieved from https://eric.ed.gov/?id=ED535225

From the ERIC abstract: “As a result of state, national and federal leadership and political will, states have dramatically increased their capacity to collect robust longitudinal education data. However, without an equally ambitious effort to ensure access and build stakeholders’ capacity to use data to increase student achievement, these infrastructure investments cannot be fully realized. Because districts are the agents that directly affect teaching and learning, states cannot succeed in this evolution in policy and practice unless they actively engage their districts. This engagement requires state education agencies to evolve from their traditional role of primarily ensuring compliance with state and federal laws to a new role as service providers that meet the diverse needs of all districts in the state.”

Strickling, L. R., & Doneker, K. L. (2014). Choreographing partnerships within an organizational structure of accountability: Maryland State Department of Education’s shift from compliance monitor to breakthrough partner. Metropolitan Universities, 25(2), 27–41. Retrieved from https://eric.ed.gov/?id=EJ1092777

From the ERIC abstract: “Drawing upon data from twenty-five interviews, this paper examines how the Maryland State Department of Education’s Cross-functional Team navigates its changing role from compliance monitor to breakthrough partner in terms of discourse, time, and flexibility, as it carries out the work of the Breakthrough Center. It also examines how the role of accountability has shaped the emerging partnership between the cross-functional team and the researchers at CAIRE (Center for Application and Innovation Research in Education).”

U.S. Department of Education, Office of Safe and Drug-Free Schools, Character Education and Civic Engagement Technical Assistance Center. (2008). Partnerships in character education state pilot projects, 1995-2001: Lessons learned. Washington, DC: Author. Retrieved from https://eric.ed.gov/?id=ED502099

From the ERIC abstract: “Character includes the emotional, intellectual and moral qualities of a person or group as well as the demonstration of these qualities in prosocial behavior. Character education is an inclusive term encompassing all aspects of how schools, related social institutions and parents can support the positive character development of children and adults. Character education teaches the habits of thought and deed that help people live and work together as families, friends, neighbors, communities and nations. The results presented in this report are the lessons learned as educators, parents and communities implemented character education in schools across the states. Knowing what the states did during the Pilot Project to support implementation efforts provides important background as additional state education agencies (SEAs) and local education agencies (LEAs) go forward with the Department’s current support for character education. What emerged from this process was evidence of a high degree of agreement among the projects—not only on what they tried to do, but how they tried to accomplish it, and what impact it had. For example, every project used professional development of staff as an essential means to achieving the goals of the project. This report provides: (1) background information regarding the importance of character education in schools; (2) key findings and trends as reported by the state pilot projects, including goals, successful practices and challenges; and (3) recommendations based on the reports from the states. The State Roll Call section provides a state-by-state summary of each pilot project, which often provides details about specific challenges or effective program components.
Finally, the appendix displays an analysis of the data reported in a series of illustrations that include project goals, type of project strategy, program approaches, type of project focus, successful implementation factors, type of data collection, materials or resources developed, and sustainability factors.”

Zavadsky, H. (2014). State education agency communications process. Benchmark and Best Practices Project. Building State Capacity and Productivity Center at Edvance Research, 1. Retrieved from https://eric.ed.gov/?id=ED559685

From the ERIC abstract: “The role of state education agencies (SEAs) has shifted significantly from low-profile, compliance activities like managing federal grants to engaging in more complex and politically charged tasks like setting curriculum standards, developing accountability systems, and creating new teacher evaluation systems. The move from compliance-monitoring to leading large-scale education change is not only more complex, it also has shined a brighter spotlight on SEAs and their ability to improve student outcomes. Communication is a critical component to help SEAs create system-wide clarity and coherence, and to proactively manage messages to gain stakeholder support. This report explains the major processes associated with strategic communication and details the communication approaches employed by five SEAs to support the adoption of new state standards, a reform that has recently become politically charged by the suggestion that the Common Core state standards represent government overreach. Because the debate has little to do with the actual content or purpose of the standards—to set a common bar that will better prepare students for college and careers—and more about political perspectives, SEAs find they must take extreme care with their communications and approach to developing, adopting, and implementing new standards.”

Methods

Keywords and Search Strings

The following keywords and search strings were used to search the reference databases and other sources:

  • Cross-functional teams

  • Cross-functional descriptor: “partnerships in education”

  • descriptor: “state departments of education” descriptor: “partnerships in education”

  • “cross-functional teams” “State departments of education”

  • “cross-agency implementation team” “State departments of education”

Databases and Search Engines

We searched ERIC for relevant resources. ERIC is a free online library of more than 1.6 million citations of education research sponsored by the Institute of Education Sciences (IES). Additionally, we searched the IES website and Google Scholar.

Reference Search and Selection Criteria

When searching for and reviewing resources, we considered the following criteria:

  • Date of publication: References and resources published over the last 15 years, from 2002 to present, were included in the search and review.

  • Search priorities of reference sources: Search priority was given to study reports, briefs, and other documents published or reviewed by IES and other federal or federally funded organizations.

  • Methodology: We used the following methodological priorities/considerations in the review and selection of the references: (a) study types—randomized controlled trials, quasi-experiments, surveys, descriptive data analyses, literature reviews, policy briefs, and so forth, generally in this order; (b) target population and samples (e.g., representativeness of the target population, sample size, volunteered or randomly selected), study duration, and so forth; and (c) limitations, generalizability of the findings and conclusions, and so forth.

This memorandum is one in a series of quick-turnaround responses to specific questions posed by educational stakeholders in the Midwest Region (Illinois, Indiana, Iowa, Michigan, Minnesota, Ohio, Wisconsin), which is served by Regional Educational Laboratory (REL) Midwest at American Institutes for Research. This memorandum was prepared by REL Midwest under a contract with the U.S. Department of Education’s Institute of Education Sciences (IES), Contract ED-IES-17-C-0007, administered by American Institutes for Research. Its content does not necessarily reflect the views or policies of IES or the U.S. Department of Education, nor does mention of trade names, commercial products, or organizations imply endorsement by the U.S. Government.