IES Blog

Institute of Education Sciences

The Role of RELs in Making WWC Practice Guides Actionable for Educators

Earlier this year, I wrote a short blog about how I envisioned the Regional Educational Laboratories (REL) Program, the What Works Clearinghouse™ (WWC), and the Comprehensive Center Program could work together to take discovery to scale. In it, I promised I would follow up with more thoughts on a specific—and critically important—example: making WWC Practice Guides actionable for educators. I do so below. At the end of this blog, I pose a few questions on which I welcome comments.

The challenge. The most important resources the WWC produces are its Practice Guides. Practice Guides evaluate the research on a given topic—say, teaching fractions in elementary and middle school—and boil study findings down to a handful of evidence-based practices for educators. Each practice is given a rating to indicate the WWC’s confidence in the underlying evidence, along with tips for how practices can be implemented in the classroom. In many ways, Practice Guides are IES’s most specific and definitive statements about what works to improve education practice and promote student achievement.

Despite their importance, the amount of effort IES has intentionally dedicated to producing high-quality resources that support educators in implementing Practice Guide recommendations has been uneven. (By most measures, it has been on the decline.) Why? Although we have confidence that the materials we have already produced are high-quality, we cannot prove it. Rigor is part of our DNA, and the absence of systematic efficacy tests demonstrating tools’ contribution to improved teacher practice has made us hesitant to dramatically expand IES-branded resources.

To their credit, several organizations have stepped in to address the “last mile problem” between Practice Guides and classroom practice. Some, like RELs, are IES partners. As a result, we have seen a small number of Practice Guides turned into professional learning community guides, massive open online courses, and other teacher-facing resources. Despite these efforts, similar resources have not been developed for the overwhelming majority of Practice Guides. This means many of our Guides and the dozens of recommendations for evidence-based practice they contain are languishing underused on IES’s virtual bookshelf.

An idea. IES should “back” the systematic transformation of Practice Guide recommendations from words on a page into high-quality materials that support teachers’ use of evidence-based practices in their classrooms. And because we should demonstrate that our own practices work, those materials should be tested for efficacy.

From my perspective, RELs are well suited to this task. This work unambiguously aligns with RELs’ purpose, which is to improve student achievement using scientifically valid research. It also leverages RELs’ unique value proposition among federal technical assistance providers: the capacity to conduct rigorous research and development activities in partnership with state and local educators. If RELs took on a greater role in supporting Practice Guides in the next REL cycle—which runs from 2022 until 2027—what might it look like in practice?

One model involves RELs collaborating with state and/or district partners to design, pilot, and test a coherent set of resources (a “toolkit”) that helps educators bring Practice Guide recommendations to life in the classroom. Potential products might include rubrics to audit current policy or practice, videos of high-quality instructional practice, sample classroom materials, or professional learning community facilitation guides, each linked to one or more Practice Guide recommendations.

Long-time followers of the WWC may recognize the design aspect of this work as similar to the defunct Doing What Works Program. The difference? New resources would not only be developed in collaboration with educators; they would also be piloted and tested with them. It’s simple, really: if we expect educators to use evidence-based practices in the classroom, we need evidence-based tools to help teachers succeed when implementing them.

Once vetted, materials must get into the hands of educators who need them. It’s here that the value of the REL-Comprehensive Center partnership becomes clear. With a mission of supporting each state education agency in its school improvement efforts, Regional Comprehensive Centers are in the ideal position to bring state and local education leaders resources and implementation supports that meet their unique needs. Tools that are developed, piloted, and refined by a REL and educators in a single state can then be disseminated by the national network of Comprehensive Centers to meet other states’ needs.

Extensions. It isn’t hard to imagine other activities that the WWC, RELs, and Comprehensive Centers might take on to maximize this model’s potential effectiveness. Most hinge on building effective feedback loops.

Promoting continuous improvement of Practice Guide resources is an obvious example. RELs could and should be in the business of following Comprehensive Centers as they work with states and districts to implement REL-developed Practice Guide supports, looking for ways to maximize their effectiveness. Similarly, Comprehensive Centers and RELs should be regularly communicating with one another about needs-sensing, identifying areas where support for evidence-based practice is lacking and determining which partners to involve in the solution. When there is a growing body of evidence to support educator best practice, the WWC is in the best position to take the lead and develop a new Practice Guide. When that body of evidence does not exist yet—or when even the practices themselves are underdeveloped—the RELs and other parts of IES, such as the National Centers for Education and Special Education Research, should step in.  

Questions. When the WWC releases a new Practice Guide, its work may be done—at least temporarily. The work of its partners to support take-up of a Guide’s recommendations will, however, have just begun. I’d appreciate your thoughts on how to best accomplish that transition, and offer up the following additional questions for your consideration:

  1. Are we thinking about the problem correctly, and in a helpful way? Are there elements of the problem that should be redefined, and would that lead us to different solutions?

  2. What parts of the problem does this proposed solution address well, and where are its shortcomings? Are there other solutions—even solutions that don’t seem to fit squarely within today’s model of the REL Program—that might be more effective?

  3. If we proceed under a model like the one described above:

     a. What sort of REL partnership models would be most effective in supporting the conceptualization, design, piloting, and testing of teacher-facing “toolkits” aligned to WWC Practice Guides?

     b. What research and evaluation activities—and which outcome measures—should be incorporated into this activity to give IES confidence that the resulting “toolkits” are likely to be associated with changed teacher practice and improved student outcomes?

     c. How does the 5-year limit on REL contracts affect the feasibility of this idea, including its scope and cost? What could be accomplished in 5 years, and what might take longer to see to completion?

     d. How could RELs leverage existing ED-sponsored content, such as that created by Doing What Works, in service of this new effort?

If you have thoughts on these questions or other feedback you would like to share, please e-mail me. I can be reached directly at matthew.soldner@ed.gov. Thanks in advance for your consideration!

by Matthew Soldner, NCEE Commissioner 

Leading experts provide evidence-based recommendations on using technology to support postsecondary student learning

By Michael Frye and Sarah Costelloe. Both are part of the Abt Associates team working on the What Works Clearinghouse.

Technology is part of almost every aspect of college life. Colleges use technology to improve student retention, offer active and engaging learning, and help students become more successful learners. The What Works Clearinghouse’s latest practice guide, Using Technology to Support Postsecondary Student Learning, offers several evidence-based recommendations to help higher education instructors, instructional designers, and administrators use technology to improve student learning outcomes.

IES practice guides incorporate research, practitioner experience, and expert opinions from a panel of nationally recognized experts. The panel that developed Using Technology to Support Postsecondary Student Learning included five experts with many years of experience leading the adoption, use, and research of technology in postsecondary classrooms. Together, guided by Abt Associates’ review of the rigorous research on the topic, Using Technology to Support Postsecondary Student Learning offers five evidence-based recommendations:

Practice Recommendations:

  1. Use communication and collaboration tools to increase interaction among students and between students and instructors (minimal evidence).
  2. Use varied, personalized, and readily available digital resources to design and deliver instructional content (moderate evidence).
  3. Incorporate technology that models and fosters self-regulated learning strategies (moderate evidence).
  4. Use technology to provide timely and targeted feedback on student performance (moderate evidence).
  5. Use simulation technologies that help students engage in complex problem-solving (minimal evidence).

 

Each recommendation is assigned an evidence level of minimal, moderate, or strong. The level of evidence reflects how well the research demonstrates the effectiveness of the recommended practices. For an explanation of how levels of evidence are determined, see the Practice Guide Level of Evidence Video. The evidence-based recommendations also include research-based strategies and examples for implementation in postsecondary settings. Together, the recommendations highlight five interconnected themes that the practice guide’s authors suggest readers consider:

  • Focus on how technology is used, not on the technology itself.

“The basic act of teaching has actually changed very little by the introduction of technology into the classroom,” said panelist MJ Bishop, “and that’s because simply introducing a new technology changes nothing unless we first understand the need it is intended to fill and how to capitalize on its unique capabilities to address that need.” Because technology evolves rapidly, understanding specific technologies is less important than understanding how technology can be used effectively in college settings. “By understanding how a learning outcome can be enhanced and supported by technologies,” said panelist Jennifer Sparrow, “the focus stays on the learner and their learning.”

  • Technology should be aligned to specific learning goals.

Every recommendation in this guide is based on one idea: finding ways to use technology to engage students and enhance their learning experiences. Technology can engage students more deeply in learning content, activate their learning processes, and provide the social connections that are key to succeeding in college and beyond. To do this effectively, any use of technology suggested in this guide must be aligned with learning goals or objectives. “Technology is not just a tool,” said Panel Chair Nada Dabbagh. “Rather, technology has specific affordances that must be recognized to use it effectively for designing learning interactions. Aligning technology affordances with learning outcomes and instructional goals is paramount to successful learning designs.”

  • Pay attention to potential issues of accessibility.

The Internet is ubiquitous, but many households—particularly low-income households and those of recent immigrants and in rural communities—may not be able to afford or otherwise access digital communications. Course materials that rely heavily on Internet access may put these students at a disadvantage. “Colleges and universities making greater use of online education need to know who their students are and what access they have to technology,” said panelist Anthony Picciano. “This practice guide makes abundantly clear that colleges and universities should be careful not to be creating digital divides.”

Instructional designers must also ensure that learning materials on course websites and course/learning management systems can accommodate students who are visually and/or hearing impaired. “Technology can greatly enhance access to education both in terms of reaching a wide student population and overcoming location barriers and in terms of accommodating students with special needs,” said Dabbagh. “Any learning design should take into consideration the capabilities and limitations of technology in supporting a diverse and inclusive audience.”

  • Technology deployments may require significant investment and coordination.

Implementing any new intervention takes training and support from administrators and teaching and learning centers. That is especially true in an environment where resources are scarce. “In reviewing the studies for this practice guide,” said Picciano, “it became abundantly clear that the deployment of technology in our colleges and universities has evolved into a major administrative undertaking. Careful planning that is comprehensive, collaborative, and continuous is needed.”

“Hardware and software infrastructure, professional development, academic and student support services, and ongoing financial investment are testing the wherewithal of even the most seasoned administrators,” said Picciano. “Yet the dynamic and changing nature of technology demands that new strategies be constantly evaluated and modifications made as needed.”

These decisions are never easy. “Decisions need to be made,” said Sparrow, “about investment cost versus opportunity cost. Additionally, when a large investment in a technology has been made, it should not be without investment in faculty development, training, and support resources to ensure that faculty, staff, and students can take full advantage of it.”

  • Rigorous research is limited and more is needed.

Despite technology’s ubiquity in college settings, rigorous research on the effects of technological interventions on student outcomes is rather limited. “It’s problematic,” said Bishop, “that research in the instructional design/educational technology field has been so focused on things, such as technologies, theories, and processes, rather than on the problems we’re trying to solve with those things, such as developing critical thinking, enhancing knowledge transfer, and addressing individual differences. It turns out to be very difficult to cross-reference the instructional design/educational technology literature with the questions the broader field of educational research is trying to answer.”

More rigorous research is needed on new technologies and how best to support instructors and administrators in using them. “For experienced researchers as well as newcomers,” said Picciano, “technology in postsecondary teaching and learning is a fertile ground for further inquiry and investigation.”

Readers of this practice guide are encouraged to adapt the advice provided to the varied contexts in which they work. The five themes discussed above serve as a lens to help readers approach the guide and decide whether and how to implement some or all of the recommendations.

Download Using Technology to Support Postsecondary Student Learning from the What Works Clearinghouse website at https://ies.ed.gov/ncee/wwc/PracticeGuide/25.

 

Sharing strategies to increase research-based educational practices

By Cora Goldston, REL Midwest

Highlighted Resources

How can states, districts, and schools identify effective practices to address challenges and achieve their goals? Education research can point the way, but sometimes finding and accessing relevant research can be a frustrating and time-consuming process. And even when practitioners can find research, it can be difficult to determine a study’s rigor and the strength of research evidence supporting interventions.

Equipping practitioners to use research evidence

Through the Midwest Alliance to Improve Knowledge Utilization (MAIKU), the Regional Educational Laboratory (REL) Midwest is partnering with practitioners to help states, districts, and schools use research to inform practice. The goal is to make it easier for educators to find research relevant to their priorities, assess the level of evidence that supports potential practices, and implement those practices that are based on strong evidence.

REL Midwest and MAIKU are supporting the use of research in education practice in several ways. For example, REL Midwest provided coaching sessions for the Ohio Department of Education (ODE) on understanding the Every Student Succeeds Act (ESSA) tiers of evidence. In addition, REL Midwest created a crosswalk that shows how the ESSA evidence tiers align with ratings from research clearinghouses, such as the What Works Clearinghouse. In turn, ODE is using this information to help Ohio districts that are applying for Striving Readers grants. To receive the grants, districts must demonstrate that they plan to use research-based practices to improve student literacy. As a result of REL Midwest’s support, ODE has strengthened its capacity to help districts determine the level of evidence supporting certain practices and, thus, to submit stronger grant applications.

REL Midwest is providing similar support across the region. In Michigan, we are conducting coaching sessions for the state Department of Education to help agency leadership choose priorities from the state’s Top 10 in 10 plan, identify research-based practices that support those priorities, and collaborate to implement new state-level practices. In Wisconsin, REL Midwest hosted a training series for the Department of Public Instruction to increase the agency’s capacity to collect, analyze, and use data to adjust state-level policies and practices. And in Illinois, REL Midwest is holding a training series for the State Board of Education on research methods, data collection, and data analysis and how to use the findings to inform agency practices.

June webinar on increasing evidence use

MAIKU is also working with researchers to support evidence use in education practice. On June 19, 2018, REL Midwest and MAIKU hosted a webinar to discuss how researchers can share evidence with practitioners in useful and accessible ways.

The webinar featured a presentation by Alan J. Daly, Ph.D., of the University of California at San Diego, and Kara Finnigan, Ph.D., of the University of Rochester. Dr. Daly and Dr. Finnigan discussed how information-sharing networks are structured among school and district staff and the challenges for practitioners in accessing and using research-based practices.   

Building on this context, Dr. Daly and Dr. Finnigan shared insights about the most effective ways to maximize the reach of research. One of their key findings is that the pattern of people’s social ties makes a difference for sharing and using research-based practices. Finnigan and Daly noted that the set of relationships we have can increase access to research evidence if the right ties are present but can constrain access to resources when those ties are not present. The quality of relationships also matters; high levels of trust are essential for more in-depth exchanges of information. The takeaway: fostering both the quantity and quality of social relationships is important for sharing research evidence.  

During the webinar, Jaime Singer, senior technical assistance consultant at REL Midwest, also shared actionable strategies that researchers can use to support evidence use in practice, including training and coaching sessions, checklists, blog posts, and clearinghouses of effective practices.

The webinar included a panel discussion about REL Midwest’s ESSA evidence tiers coaching sessions and crosswalk for ODE. REL Midwest researcher Lyzz Davis, Ph.D., provided a researcher perspective on developing resources to meet ODE’s needs. Heather Boughton, Ph.D., and Melissa Weber-Mayrer, Ph.D., at ODE provided practitioner perspectives on how REL Midwest’s work has strengthened the agency’s capacity to help districts find and use evidence-based interventions.

Looking for evidence outside of the scope of the WWC?

by Chris Weiss and Erin Pollard, What Works Clearinghouse

The What Works Clearinghouse (WWC) strives to be a central and trusted source of research evidence for what works in education. But did you know that the WWC is one of several repositories of evidence produced by the federal government? Our mission at the WWC is to review the existing research on different programs, products, practices, and policies in education to provide educators with the information they need to make evidence-based decisions. However, there are several other government repositories that review evidence on interventions that impact children and schools, reviews that may be of use and interest to WWC users.

 

Different Clearinghouses for Different Needs.

The missions of the different clearinghouses, and the reasons their reviews differ, stem from the unique mission of each agency and the specific focus of each clearinghouse. The Department of Education focuses primarily on prekindergarten through postsecondary education; however, many public health and crime prevention programs are implemented through schools. So, for example, you would find information about a school-based bullying prevention program on the National Institute of Justice’s CrimeSolutions website. The WWC would not review the evidence of this program’s effectiveness because its aim is to reduce bullying and victimization, rather than to improve education-focused outcomes.

 

Some interventions are reviewed by multiple clearinghouses.

Users are often surprised that an intervention might be reviewed by multiple clearinghouses. For example, the WWC reviewed the evidence and created an intervention report on Career Academies, a school-within-a-school program in which students take both career-related and academic courses and acquire work experience. But reviews of the program are included in other clearinghouses as well. The Department of Labor’s CLEAR reviewed the study because the intervention increased students’ earnings. Similarly, the National Institute of Justice’s CrimeSolutions reviewed the intervention because it showed an effect on increasing the earnings of young men – an economic factor linked to lowered risk of criminal activity. Each clearinghouse looked at different outcomes from the same study to highlight the domains most relevant to achieving its goal.

 

Each repository is different. The WWC may be your best bet – or others may fit your needs better.

We encourage users to look at the other clearinghouses to find information on outcomes that are outside of our scope. These sites have a lot of great information to offer. Here is a list of the other repositories for finding evidence:

  • Clearinghouse for Labor Evaluation and Research (CLEAR) – Department of Labor. CLEAR's mission is to make research on labor topics more accessible to practitioners, policymakers, researchers, and the public more broadly so that it can inform their decisions about labor policies and programs. CLEAR identifies and summarizes many types of research, including descriptive statistical studies, outcome analyses, implementation studies, and causal impact studies.
  • Compendium of Evidence-Based Interventions and Best Practices for HIV Prevention - Centers for Disease Control and Prevention. Evidence-Based Interventions and Best Practices in the Compendium are identified by CDC’s Prevention Research Synthesis (PRS) Project through a series of ongoing systematic reviews. Each eligible intervention is evaluated against explicit a priori criteria and has shown sufficient evidence that the intervention works. Interventions may fall into one or more chapters, including: Risk Reduction, which covers PrEP-related outcomes and outcomes such as injection drug use, condom use, and HIV/STD/hepatitis infection; Linkage to, Retention in, and Re-engagement in HIV Care, which covers outcomes such as entering and staying in HIV care; Medication Adherence, which covers outcomes such as adhering to HIV medication and HIV viral load; and the most recently added Structural Interventions, which covers outcomes such as HIV testing, social determinants of health, and stigma. Information sheets are available for all identified evidence-based interventions and best practices on the PRS Compendium website.
  • CrimeSolutions - National Institute of Justice, Department of Justice. The clearinghouse, accessible via the CrimeSolutions.gov website, presents programs and practices that have undergone rigorous evaluations and meta-analyses. The site assesses the strength of the evidence about whether these programs achieve criminal justice, juvenile justice, and crime victim services outcomes in order to inform practitioners and policymakers about what works, what doesn't, and what's promising.
  • Evidence Exchange - Corporation for National and Community Service. A digital repository of sponsored research, evaluation reports, and data. These resources focus on national service, volunteering, and civic engagement.
  • Home Visiting Evidence of Effectiveness (HomVEE) – Administration for Children and Families, Department of Health and Human Services. HomVEE provides an assessment of the evidence of effectiveness for home visiting models that target families with pregnant women and children from birth to kindergarten entry (that is, up through age 5).
  • Teen Pregnancy Prevention (TPP) Evidence Review – Department of Health and Human Services. A transparent systematic review of the teen pregnancy prevention literature to identify programs with evidence of effectiveness in reducing teen pregnancy, sexually transmitted infections, and associated sexual risk behaviors.
  • The Community Guide - Community Preventive Services Task Force (CPSTF). A collection of evidence-based findings to help you select interventions to improve health and prevent disease in your state, community, community organization, business, healthcare organization, or school. The CPSTF issues findings based on systematic reviews of effectiveness and economic evidence that are conducted with a methodology developed by the CPSTF.
  • youth.gov – Interagency. The youth.gov Program Directory features evidence-based programs whose purpose is to prevent and/or reduce delinquency or other problem behaviors in young people.

The WWC Evidence Standards: A Valuable and Accessible Resource for Teaching Validity Assessment of Causal Inferences to Identify What Works

by Herbert Turner, Ph.D., President and Principal Scientist, ANALYTICA, Inc.

 

The WWC Evidence Standards (hereafter, the Standards) provide a detailed description of the criteria used by the WWC to review studies. The Standards were first developed in 2002 by leading methodological researchers using initial concepts from the Study Design and Implementation Assessment Device (DIAD), an instrument for assessing the correspondence between the methodological characteristics and implementation of social science research and using this research to draw inferences about causal relationships (Boruch, 1997; Valentine and Cooper, 2008). During the past 16 years, the Standards have gone through four iterations of improvement to keep pace with advances in methodological practice, and have been through rigorous peer review. The most recent iteration is now codified in the WWC Standards Handbook 4.0 (hereafter, the Handbook).

 

Across the different versions of the Handbook, the methodological characteristics of an internally valid study, one designed to support causal inferences about the effect of an intervention on an outcome, have stood the test of time. These characteristics can be summarized as follows: a strong design starts with how the study groups are formed. It continues with the use of reliable and valid outcome measures; low attrition, if the study is a randomized controlled trial (RCT); baseline equivalence in the analysis sample, if it is a quasi-experimental design (QED); and no confounds.
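
To make two of these checks concrete, here is a minimal Python sketch, under stated assumptions, of the arithmetic behind them: the overall and differential attrition rates that the Handbook compares against its attrition boundary for RCTs, and the baseline difference in standard deviation units for a QED. The 0.05 and 0.25 thresholds for baseline equivalence come from Standards Handbook 4.0; everything else (function names, sample sizes, and the simple pooled-SD effect size used in place of the Handbook's small-sample-corrected Hedges' g) is illustrative rather than actual WWC tooling.

```python
import numpy as np

def attrition_rates(n_assigned_t, n_analyzed_t, n_assigned_c, n_analyzed_c):
    """Overall and differential attrition for an RCT. The WWC compares these
    two rates against the attrition boundary in the Standards Handbook
    (a table not reproduced here)."""
    loss_t = 1 - n_analyzed_t / n_assigned_t   # attrition in treatment group
    loss_c = 1 - n_analyzed_c / n_assigned_c   # attrition in comparison group
    overall = 1 - (n_analyzed_t + n_analyzed_c) / (n_assigned_t + n_assigned_c)
    differential = abs(loss_t - loss_c)
    return overall, differential

def baseline_smd(treat_pre, comp_pre):
    """Baseline difference in pooled-standard-deviation units. The Handbook
    specifies Hedges' g with a small-sample correction; that correction is
    omitted here for brevity."""
    t = np.asarray(treat_pre, dtype=float)
    c = np.asarray(comp_pre, dtype=float)
    pooled_var = ((len(t) - 1) * t.var(ddof=1) +
                  (len(c) - 1) * c.var(ddof=1)) / (len(t) + len(c) - 2)
    return (t.mean() - c.mean()) / np.sqrt(pooled_var)

# Hypothetical QED analysis sample: pretest scores for the two groups.
rng = np.random.default_rng(0)
treat_pre = rng.normal(0.10, 1.0, size=120)
comp_pre = rng.normal(0.00, 1.0, size=140)

g = baseline_smd(treat_pre, comp_pre)
if abs(g) <= 0.05:
    verdict = "satisfies baseline equivalence"
elif abs(g) <= 0.25:
    verdict = "requires statistical adjustment for the baseline measure"
else:
    verdict = "does not satisfy baseline equivalence"
print(f"baseline difference = {g:.3f} SD: {verdict}")

overall, differential = attrition_rates(150, 120, 160, 140)
print(f"overall attrition = {overall:.1%}, differential = {differential:.1%}")
```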

 

These elements are the critical components of any strong research design – and are the cornerstones of all versions of the WWC’s standards. That fact, along with the transparent description of their logical underpinning, is what motivated me to use Standards 4.0 (for Group Designs) as the organizing framework for understanding study validity in a graduate-level Program Evaluation II course I taught at Boston College’s Lynch School of Education.

 

In spring 2017, nine master's and four doctoral students participated in this semester-long course. The primary goal was to teach students how to organize their thinking and logically derive internal validity criteria using Standards 4.0—augmented with additional readings from the methodological literature. Students used the Standards (along with the supplemental readings) to design, implement, analyze, and report impact evaluations to determine what interventions work, harm, or have no discernible effect (Mosteller and Boruch, 2002). The Standards Handbook 4.0, along with online course modules, was an excellent resource for augmenting the lectures and providing Lynch School students with hands-on learning.

 

At the end of the course, students were offered the choice of completing the WWC Certification Exam for Group Designs or taking a final exam developed by the instructor. All thirteen students chose to complete the WWC Certification Exam, and approximately half became certified. Many emailed me personally to express their appreciation for (1) the opportunity to learn a systematic approach to organizing their thinking about assessing the validity of causal inferences drawn from RCTs and QEDs, and (2) the design skills they developed, which can be used in other graduate courses and beyond. The WWC Evidence Standards and related online resources are a valuable, accessible, and free resource that has been rigorously vetted for close to two decades. The Standards have few equals as a resource for helping students think systematically, logically, and clearly about designing (and evaluating) a valid research study to make causal inferences about what interventions work in education and related fields.

 

References

Boruch, R. F. (1997). Randomized experiments for planning and evaluation: A practical guide. Thousand Oaks, CA: Sage Publications.

Mosteller, F., & Boruch, R. F. (2002). Evidence matters: Randomized trials in education research. Washington, DC: Brookings Institution Press.

Valentine, J. C., & Cooper, H. (2008). A systematic and transparent approach for assessing the methodological quality of intervention effectiveness research: The Study Design and Implementation Assessment Device (Study DIAD). Psychological Methods, 13(2), 130–149.