
REL Central Ask A REL Response

Research Tools

June 2019

Question

How effective is “implementation science” in moving evidence-based practices into schools and classrooms?

Response

Following an established REL Central research protocol, we conducted a search for research reports as well as descriptive study articles to help answer the question. The resources included ERIC and other federally funded databases and organizations, research institutions, academic databases, and general Internet search engines. (For details, please see the methods section at the end of this memo.)

References are listed in alphabetical order, not necessarily in order of relevance. We have not evaluated the quality of the references provided in this response, and we offer them only for your information. Also, we compiled the references from the most commonly used research resources, but the list is not comprehensive, and other relevant sources may exist.

Research References

Forman, S. G., Olin, S. S., Hoagwood, K. E., Crowe, M., & Saka, N. (2009). Evidence-based interventions in schools: Developers’ views of implementation barriers and facilitators. School Mental Health, 1, 26–36. Retrieved from https://www.researchgate.net/publication/225593765_Evidence-Based_Interventions_in_Schools_Developers'_Views_of_Implementation_Barriers_and_Facilitators

From the abstract:

“This study examined the factors that are important to successful implementation and sustainability of evidence-based interventions in school settings. Developers of interventions that have been designated as ‘evidence-based’ in multiple vetted lists and registries available to schools participated in a structured interview. The interview focused on potential facilitators and barriers to implementation and sustainability of their intervention. The interviews were transcribed and coded to identify similarities and differences among the responses as well as themes that cut across participants. Results indicated that those concerned with effective implementation and sustainability need to address several areas: (a) development of principal and other administrator support; (b) development of teacher support; (c) development of financial resources to sustain practice; (d) provision of high-quality training and consultation to ensure fidelity; (e) alignment of the intervention with school philosophy, goals, policies, and programs; (f) ensuring that program outcomes and impact are visible to key stakeholders; and (g) development of methods for addressing turnover in school staff and administrators.”


Locke, J., Beidas, R. S., Marcus, S., Stahmer, A., Aarons, G. A., Lyon, A. R., . . . Mandell, D. S. (2016). A mixed methods study of individual and organizational factors that affect implementation of interventions for children with autism in public schools. Implementation Science, 11(135), 1–9. Retrieved from https://implementationscience.biomedcentral.com/articles/10.1186/s13012-016-0501-8

From the abstract:

“Background: The significant lifelong impairments associated with autism spectrum disorder (ASD), combined with the growing number of children diagnosed with ASD, have created urgency in improving school-based quality of care. Although many interventions have shown efficacy in university-based research, few have been effectively implemented and sustained in schools, the primary setting in which children with ASD receive services. Individual- and organizational-level factors have been shown to predict the implementation of evidence-based interventions (EBIs) for the prevention and treatment of other mental disorders in schools, and may be potential targets for implementation strategies in the successful use of autism EBIs in schools. The purpose of this study is to examine the individual- and organizational-level factors associated with the implementation of EBIs for children with ASD in public schools.
Methods: We will apply the Domitrovich and colleagues (2008) framework that examines the influence of contextual factors (i.e., individual- and organizational-level factors) on intervention implementation in schools. We utilize mixed methods to quantitatively test whether the factors identified in the Domitrovich and colleagues (2008) framework are associated with the implementation of autism EBIs, and use qualitative methods to provide a more comprehensive understanding of the factors associated with successful implementation and sustainment of these interventions with the goal of tailoring implementation strategies.
Discussion: The results of this study will provide an in-depth understanding of individual- and organizational-level factors that influence the successful implementation of EBIs for children with ASD in public schools. These data will inform potential implementation targets and tailoring of strategies that will help schools overcome barriers to implementation and ultimately improve the services and outcomes for children with ASD.”


Locke, J., Lawson, G. M., Beidas, R. S., Aarons, G. A., Xie, M., Lyon, A. R., . . . Mandell, D. S. (2019). Individual and organizational factors that affect implementation of evidence-based practices for children with autism in public schools: A cross-sectional observational study. Implementation Science, 14(29), 1–9. Retrieved from https://implementationscience.biomedcentral.com/articles/10.1186/s13012-019-0877-3

From the abstract:

“Background: Children with autism receive most of their intervention services in public schools, but implementation of evidence-based practices (EBPs) for autism varies. Studies suggest that individual (attitudes) and organizational characteristics (implementation leadership and climate) may influence providers’ use of EBPs, but research is relatively limited in this area. This study examined individual and organizational factors associated with implementation of three EBPs–discrete trial training, pivotal response training, and visual schedules–for children with autism in special education classrooms in public elementary schools.
Methods: Participants included 67 autism support teachers and 85 other classroom staff from 52 public elementary schools in the northeastern United States. Participants reported their attitudes toward EBPs (e.g., intuitive appeal, willingness if required, openness, and divergence), implementation leadership and climate of their school, and the frequency with which they deliver each of three EBPs. Linear regression was used to estimate the association of attitudes about EBPs with organizational characteristics and intensity of EBP use. Demographic covariates with a bivariate association with EBP use significant at p < .20 were entered into the adjusted models.
Results: There were significant findings for only one EBP, discrete trial training. Teachers who reported higher perceived divergence (perceived difference of usual practice with academically developed or research-based practices) between EBPs and current practices used less discrete trial training (f² = .18), and teachers who reported higher appeal (willingness to adopt EBPs given their intuitive appeal) of EBPs used more discrete trial training (f² = .22). No organizational factors were significantly associated with implementation with any of the three EBPs.
Conclusions: Attitudes toward EBPs may affect teachers’ decisions to use EBPs; however, implementation leadership and climate did not predict EBP use. Future implementation efforts ought to consider the type of EBP and its fit within the context in terms of the EBP’s similarities to and differences from existing practices and programs in the setting. Implementation strategies that target individual attitudes about EBPs may be warranted in public schools.”
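
A note for interpreting the effect sizes reported above: Cohen’s f² for an individual predictor in a linear regression is conventionally computed as f² = (R²_full − R²_reduced) / (1 − R²_full), where R²_full comes from the model that includes the predictor and R²_reduced from the model that omits it. Under Cohen’s customary benchmarks (0.02 small, 0.15 medium, 0.35 large), the reported values of .18 and .22 correspond to medium-sized effects. The formula and benchmarks are standard conventions offered here only as an aid to interpretation; the authors’ exact computation may differ.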


Lyon, A. R., Whitaker, K., Locke, J., Cook, C. R., King, K. M., Duong, M., . . . Aarons, G. A. (2018). The impact of inter-organizational alignment (IOA) on implementation outcomes: Evaluating unique and shared organizational influences in education sector mental health. Implementation Science, 13(24), 1–11. Retrieved from https://implementationscience.biomedcentral.com/articles/10.1186/s13012-018-0721-1

From the abstract:

“Background: Integrated healthcare delivered by work groups in nontraditional service settings is increasingly common, yet contemporary implementation frameworks typically assume a single organization–or organizational unit–within which system-level processes influence service quality and implementation success. Recent implementation frameworks predict that inter-organizational alignment (i.e., similarity in values, characteristics, activities related to implementation across organizations) may facilitate the implementation of evidence-based practices (EBP), but few studies have evaluated this premise. This study’s aims examine the impact of overlapping organizational contexts by evaluating the implementation contexts of externally employed mental health clinicians working in schools–the most common integrated service delivery setting for children and adolescents. Aim 1 is to estimate the effects of unique intra-organizational implementation contexts and combined inter-organizational alignment on implementation outcomes. Aim 2 is to examine the underlying mechanisms through which inter-organizational alignment facilitates or hinders EBP implementation.
Methods/design: This study will conduct sequential, exploratory mixed-methods research to evaluate the intra- and inter-organizational implementation contexts of schools and the external community-based organizations that most often employ school-based mental health clinicians, as they relate to mental health EBP implementation. Aim 1 will involve quantitative surveys with school-based, externally-employed mental health clinicians, their supervisors, and proximal school-employed staff (total n = 120 participants) to estimate the effects of each organization’s general and implementation-specific organizational factors (e.g., climate, leadership) on implementation outcomes (fidelity, acceptability, appropriateness) and assess the moderating role of the degree of clinician embeddedness in the school setting. Aim 2 will explore the mechanisms through which inter-organizational alignment influences implementation outcomes by presenting the results of Aim 1 surveys to school-based clinicians (n = 30) and conducting semi-structured qualitative interviews. Qualitative data will be evaluated using an integrative inductive and deductive approach.
Discussion: The study aims are expected to identify intra- and inter-organizational constructs that are most instrumental to EBP implementation success in school-based integrated care settings and illuminate mechanisms that may account for the influence of inter-organizational alignment. In addition to improving school-based mental health, these findings will spur future implementation science that considers the relationships across organizations and optimize the capacity of implementation science to guide practice in increasingly complex systems of care.”


Nelson, S. R., Leffler, J. C., & Hansen, B. A. (2009). Toward a research agenda for understanding and improving the use of research evidence. Portland, OR: Northwest Regional Educational Laboratory. Retrieved from https://eric.ed.gov/?id=ED506962
Full text available at https://educationnorthwest.org/sites/default/files/toward-a-research-agenda.pdf

From the ERIC abstract:

“Many researchers and research funders want their work to be influential in educational policy and practice, but there is little systematic understanding of how policymakers and practitioners use research evidence, much less how they acquire or interpret it. By understanding what does shape policymakers' and practitioners' decision making and the role of research evidence in those decisions, the research community may be able to improve the likelihood that their work will be used to directly inform policy and practice. This study sought to contribute to that goal by helping to identify when, how, and under what conditions research evidence is used by policymakers and practitioners; what other sources of information these individuals rely on; and what factors serve as barriers or facilitators to using research evidence in making policy and practice decisions. In shedding light on those topics, the authors hoped to uncover promising areas for future investigation by researchers. The study was conducted in fall 2008 through spring 2009 by the Northwest Regional Educational Laboratory, in collaboration with the Center for Knowledge Use in Education and with the support of the William T. Grant Foundation. The research team used a combination of structured focus groups and individual interviews to elicit comments from a limited, self-selected sample of 65 influential leaders in the areas of policy and practice. Participants represented six groups of federal, state, and local educational interests. The authors’ findings suggest that barriers to the use of research evidence are linked to an underlying belief that much research is not to be trusted or is, at least, severely limited in its potential applicability. Even with studies that meet ‘gold standard’ criteria, participants were aware that a narrowly designed study could report a false success or a false failure. It was a common perception of the study participants that research could be shaped to say anything, that one piece of research often conflicts with another, and that much research is not timely for users’ needs.”


Reinke, W. M., Stormont, M., Herman, K. C., & Newcomer, L. (2014). Using coaching to support teacher implementation of classroom-based interventions. Journal of Behavioral Education, 23, 150–167. Retrieved from https://eric.ed.gov/?id=EJ1038138
Full text available at https://www.researchgate.net/publication/260527641_Using_Coaching_to_Support_Teacher_Implementation_of_Classroom-based_Interventions

From the abstract:

“Despite the growing evidence base for the efficacy of preventive interventions, the level of implementation of these interventions in schools is often less than optimal. One promising approach to supporting teachers in implementation of interventions is the use of coaching. In this study, teachers were trained in a universal classroom management intervention and provided ongoing coaching. The association between the type and amount of coaching activities and teacher implementation of proactive classroom management over time were investigated. Results indicated that teachers who received more performance feedback had higher levels of implementation over time in comparison with teachers who received less feedback. In addition, a significant interaction between the amount of coaching a teacher received and his or her implementation of proactive classroom management was found. Increased implementation over time was observed for teachers with lower initial levels of implementation who received more coaching, whereas implementation decreased over time for teachers who received less coaching. The importance of coaching as a support system for enhancing implementation quality of classroom-based preventive interventions is discussed.”
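
The coaching interaction reported above is the kind of effect commonly tested with a longitudinal model in which implementation is regressed on time, amount of coaching, and their product term, for example Implementation_it = b0 + b1(Time_t) + b2(Coaching_i) + b3(Time_t × Coaching_i) + e_it, where a significant b3 indicates that implementation trajectories differ by the amount of coaching received. This generic specification is offered only to illustrate how such an interaction is typically estimated; it is not necessarily the model the authors used.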


Swindle, T., Johnson, S. L., Whiteside-Mansell, L., & Curran, G. M. (2017). A mixed methods protocol for developing and testing implementation strategies for evidence-based obesity prevention in childcare: A cluster randomized Hybrid Type III trial. Implementation Science, 12(90), 1–10. Retrieved from https://implementationscience.biomedcentral.com/articles/10.1186/s13012-017-0624-6

From the abstract:

“Background: Despite the potential to reach at-risk children in childcare, there is a significant gap between current practices and evidence-based obesity prevention in this setting. There are few investigations of the impact of implementation strategies on the uptake of evidence-based practices (EBPs) for obesity prevention and nutrition promotion. This study protocol describes a three-phase approach to developing and testing implementation strategies to support uptake of EBPs for obesity prevention practices in childcare (i.e., key components of the WISE intervention).
Methods: Informed by the i-PARIHS framework, we will use a stakeholder-driven evidence-based quality improvement (EBQI) process to apply information gathered in qualitative interviews on barriers and facilitators to practice to inform the design of implementation strategies. Then, a Hybrid Type III cluster randomized trial will compare a basic implementation strategy (i.e., intervention as usual) with an enhanced implementation strategy informed by stakeholders. All Head Start centers (N = 12) within one agency in an urban area in a southern state in the USA will be randomized to receive the basic or enhanced implementation with approximately 20 classrooms per group (40 educators, 400 children per group). The educators involved in the study, the data collectors, and the biostatistician will be blinded to the study condition. The basic and enhanced implementation strategies will be compared on outcomes specified by the RE-AIM model (e.g., Reach to families, Effectiveness of impact on child diet and health indicators, Adoption commitment of agency, Implementation fidelity and acceptability, and Maintenance after 6 months). Principles of formative evaluation will be used throughout the hybrid trial.
Discussion: This study will test a stakeholder-driven approach to improve implementation, fidelity, and maintenance of EBPs for obesity prevention in childcare. Further, this study provides an example of a systematic process to develop and test a tailored, enhanced implementation strategy.”
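
As a minimal illustration of the cluster randomization described above, the Python sketch below performs a simple, unstratified 1:1 allocation of 12 hypothetical centers to the basic and enhanced arms. It is illustrative only; the trial’s actual allocation procedure, including any stratification or constraints, is defined in the study protocol and may differ.

  # Minimal sketch: unstratified 1:1 cluster randomization of 12 centers
  # into basic vs. enhanced implementation arms. Illustrative only.
  import random

  def randomize_centers(center_ids, seed=2017):
      rng = random.Random(seed)  # fixed seed so the allocation is reproducible
      shuffled = list(center_ids)
      rng.shuffle(shuffled)
      half = len(shuffled) // 2
      return {"basic": sorted(shuffled[:half]), "enhanced": sorted(shuffled[half:])}

  if __name__ == "__main__":
      centers = [f"center_{i:02d}" for i in range(1, 13)]  # 12 hypothetical centers
      print(randomize_centers(centers))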




Methods

Search Strings

The following keywords and search strings were used to search the reference databases and other sources (a sketch showing how these strings can be assembled for an automated search appears after the list):

  • “Evidence-based practices” AND education
  • “Implementation research” AND education
  • “Implementation science” AND education
  • “Research-to-practice” AND education

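The Boolean strings above can also be generated and URL-encoded programmatically when searches need to be repeated or automated. The Python sketch below is illustrative only: the endpoint is a placeholder, and real databases such as ERIC should be queried through their own interfaces and documentation.

  # Illustrative sketch: assembling the memo's Boolean search strings and
  # URL-encoding them. The SEARCH_URL value is a placeholder, not a
  # documented endpoint for any specific database.
  from urllib.parse import urlencode

  KEYWORD_PHRASES = [
      '"Evidence-based practices"',
      '"Implementation research"',
      '"Implementation science"',
      '"Research-to-practice"',
  ]

  SEARCH_URL = "https://example.org/search"  # hypothetical endpoint

  def build_queries(phrases, joined_term="education"):
      # Pair each quoted phrase with the AND term used in the memo.
      return [f"{phrase} AND {joined_term}" for phrase in phrases]

  if __name__ == "__main__":
      for query in build_queries(KEYWORD_PHRASES):
          print(f"{SEARCH_URL}?{urlencode({'q': query})}")
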
Databases and Resources

We searched ERIC for relevant resources. ERIC is a free online library of over 1.6 million citations of education research sponsored by the Institute of Education Sciences. Additionally, we searched Google Scholar and Google.

Reference Search and Selection Criteria

When searching and reviewing resources, we considered the following criteria:

  • Date of Publication: References and resources published between 2009 and 2019 were included in the search and review.
  • Search Priorities of Reference Sources: Search priority was given to ERIC, followed by Google Scholar and Google.
  • Methodology: The following methodological priorities/considerations were used in the review and selection of the references: (a) study types–randomized controlled trials, quasi-experiments, surveys, descriptive analyses, and literature reviews; and (b) target population and sample.

This memorandum is one in a series of quick-turnaround responses to specific questions posed by educational stakeholders in the Central Region (Colorado, Kansas, Missouri, Nebraska, North Dakota, South Dakota, Wyoming), which is served by the Regional Educational Laboratory Central at Marzano Research. This memorandum was prepared by REL Central under a contract with the U.S. Department of Education’s Institute of Education Sciences (IES), Contract ED-IES-17-C-0005, administered by Marzano Research. Its content does not necessarily reflect the views or policies of IES or the U.S. Department of Education nor does mention of trade names, commercial products, or organizations imply endorsement by the U.S. Government.