IES Blog

Institute of Education Sciences

In Nebraska, a focus on evidence-based reading instruction

Researchers have learned a lot about reading over the last few decades, but these insights don’t always make their way into elementary school classrooms. In Nebraska, a fresh effort to provide teachers with practical resources on reading instruction could help bridge the research-practice divide.

At the start of the 2019-2020 school year, the Nebraska Department of Education (NDE) implemented NebraskaREADS to help “serve the needs of students, educators, and parents along the journey to successful reading.” As part of a broader effort guided by the Nebraska Reading Improvement Act, the initiative emphasizes the importance of high-quality reading instruction and targeted, individualized support for struggling readers.

As part of NebraskaREADS, NDE is developing an online resource inventory of tools and information to support high-quality literacy instruction for all Nebraska students. After NDE reached out to REL Central for support, experts from both agencies identified the What Works Clearinghouse (WWC) as a prime source of evidence-based information on instructional practices and policies, and the two agencies partnered to develop a set of instructional strategies summary documents based on recommendations in eight WWC practice guides.

Each WWC practice guide presents recommendations for educators based on reviews of research by a panel of nationally recognized experts on a particular topic or challenge. NDE’s instructional strategies summary documents align the evidence-based strategies in the practice guides more directly with NebraskaREADS.

REL Central and NDE partners used the WWC practice guides to develop the 35 instructional strategies summary documents, each of which condenses one recommendation from a practice guide. For each recommendation, the instructional strategies summary document provides the associated NebraskaREADS literacy focus, implementation instructions, appropriate grade levels, potential roadblocks and ways to address them, and the strength of supporting evidence.
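To picture what each summary contains, here is a minimal sketch of one document modeled as a data record. The field names are hypothetical, mirroring the elements listed above rather than NDE's actual schema, and the example values are illustrative only.

```python
# Hypothetical model of one instructional strategies summary document.
# Field names mirror the elements described in the post, not NDE's schema.
from dataclasses import dataclass, field

@dataclass
class StrategySummary:
    practice_guide: str            # source WWC practice guide
    recommendation: str            # the single recommendation it condenses
    literacy_focus: str            # associated NebraskaREADS literacy focus
    grade_levels: str              # appropriate grade levels
    evidence_level: str            # strength of supporting evidence
    implementation_steps: list[str] = field(default_factory=list)
    roadblocks_and_solutions: dict[str, str] = field(default_factory=dict)

# Illustrative entry (values are examples, not quoted from NDE's inventory):
example = StrategySummary(
    practice_guide="Foundational Skills to Support Reading for Understanding",
    recommendation="Teach students academic language skills",
    literacy_focus="Vocabulary and oral language",
    grade_levels="K-3",
    evidence_level="minimal",
    implementation_steps=["Engage students in conversations about texts"],
    roadblocks_and_solutions={
        "Limited instructional time": "Embed academic talk in daily read-alouds",
    },
)
```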

NDE launched the instructional strategies summary documents in spring 2019, and already educators have found them to be helpful resources. Dee Hoge, executive director of Bennington Public Schools, described the guides as “checkpoints for strong instruction” and noted her district’s plan to use them to adapt materials and provide professional development. Marissa Payzant, English language arts education specialist at NDE and a lead partner in this collaboration, shared that the “districts are excited to incorporate the strategy summaries in a variety of professional learning and other initiatives. They make the information in the practice guides much more accessible, and all the better it’s from a trustworthy source.”

Building on the positive reception and benefits of the instructional strategies summary documents, NDE and REL Central plan to expand the resource inventory into other content areas. Currently, they are working together to develop instructional strategy summaries using the WWC practice guides for math instruction.

by Douglas Van Dine, Regional Educational Laboratory Central

Taking Discovery to Scale

Along with my NCEE colleagues, I was excited to read the recent Notice Inviting Applications for the next cycle of Comprehensive Centers, administered by the Department’s Office of Elementary and Secondary Education.

As you can see in the notice, Regional Comprehensive Centers will “provide high-quality intensive capacity-building services to State clients and recipients to identify, implement, and sustain effective evidence-based programs, practices, and interventions that support improved educator and student outcomes,” with a special emphasis on benefitting disadvantaged students, students from low-income families, and rural populations.

With this focus on supporting implementation, Regional Comprehensive Centers (RCCs) can amplify the work of NCEE’s Regional Educational Laboratories (RELs) and What Works Clearinghouse (WWC). Learning from states, districts, and schools to understand their unique needs, and then being able to support high-quality implementation of evidence-based practices that align with those needs, has the potential to dramatically accelerate the process of improving outcomes for students.

RELs and the WWC already collaborate with today’s Comprehensive Centers, of course. But it’s easy to see how stronger and more intentional relationships between them could increase each program’s impact.

True to its name, the REL program has worked with educators to design and evaluate innovative practices – or identify, implement, and refine existing ones – to meet regional and local needs for more than 50 years. And since its inception in 2002, the WWC has systematically identified and synthesized high-quality evidence about the effectiveness of education programs, policies, and practices so that educators and other instructional leaders can put that information to use improving outcomes for students. But with more than 3.6 million teachers spread across more than 132,000 public and private schools nationwide, making sure discoveries from education science are implemented at scale and with fidelity is no small feat. RCCs are welcome partners in that work.

Figure: How RELs, the WWC, and Regional Comprehensive Centers could most effectively collaborate across a continuum from discovery to scale.

RELs, the WWC, and Comprehensive Centers can play critical, complementary roles in taking discovery to scale (see Figure). With their analysis, design, and evaluation expertise, RELs – in partnership with states and districts, postsecondary institutions, and other stakeholders – can begin the process by designing and rigorously evaluating best practices that meet local or regional needs. (Or, as I will discuss in future messages, by developing and rigorously testing materials that support adoption of evidence-based practices.) The WWC follows, vetting causal impact studies and synthesizing their findings to better understand the strength of evidence that supports a practice and its likely impact. Partners in the Comprehensive Centers can then “pick up” those WWC-vetted practices, aligning them to the needs of State and other clients and supporting and sustaining implementation at scale. Finally, lessons learned from RCCs’ implementation efforts about what worked – and what didn’t – can be fed back to RELs, refining the practice and fueling the next cycle of discovery.

Those who follow the REL-WWC-RCC process know that what I’ve just described isn’t quite how these programs operate today. Sometimes, out of necessity, roles are more “fluid” and efforts somewhat less well aligned. The approach of “taking discovery to scale” depicted above provides one way of thinking about how each program can play a unique, but interdependent, role with the other two.

I have every confidence this is possible. After all, the north star of each program is the same: improving outcomes for students. And that means we have a unique opportunity. One we’d be remiss not to seize.

Matthew Soldner
Commissioner, National Center for Education Evaluation and Regional Assistance
Institute of Education Sciences
U.S. Department of Education


As always, your feedback is welcome. You can email the Commissioner at matthew.soldner@ed.gov.


Leading experts provide evidence-based recommendations on using technology to support postsecondary student learning

By Michael Frye and Sarah Costelloe, both part of the Abt Associates team working on the What Works Clearinghouse.

Technology is part of almost every aspect of college life. Colleges use technology to improve student retention, offer active and engaging learning, and help students become more successful learners. The What Works Clearinghouse’s latest practice guide, Using Technology to Support Postsecondary Student Learning, offers several evidence-based recommendations to help higher education instructors, instructional designers, and administrators use technology to improve student learning outcomes.

IES practice guides incorporate research, practitioner experience, and expert opinions from a panel of nationally recognized experts. The panel that developed Using Technology to Support Postsecondary Student Learning included five experts with many years of experience leading the adoption, use, and research of technology in postsecondary classrooms. Guided by Abt Associates’ review of the rigorous research on the topic, the panel developed five evidence-based recommendations:

1. Use communication and collaboration tools to increase interaction among students and between students and instructors. (Minimal evidence)

2. Use varied, personalized, and readily available digital resources to design and deliver instructional content. (Moderate evidence)

3. Incorporate technology that models and fosters self-regulated learning strategies. (Moderate evidence)

4. Use technology to provide timely and targeted feedback on student performance. (Moderate evidence)

5. Use simulation technologies that help students engage in complex problem-solving. (Minimal evidence)

Each recommendation is assigned an evidence level of minimal, moderate, or strong. The level of evidence reflects how well the research demonstrates the effectiveness of the recommended practices. For an explanation of how levels of evidence are determined, see the Practice Guide Level of Evidence Video.

The evidence-based recommendations also include research-based strategies and examples for implementation in postsecondary settings. Together, the recommendations highlight five interconnected themes that the practice guide’s authors suggest readers consider:

  • Focus on how technology is used, not on the technology itself.

“The basic act of teaching has actually changed very little by the introduction of technology into the classroom,” said panelist MJ Bishop, “and that’s because simply introducing a new technology changes nothing unless we first understand the need it is intended to fill and how to capitalize on its unique capabilities to address that need.” Because technology evolves rapidly, understanding specific technologies is less important than understanding how technology can be used effectively in college settings. “By understanding how a learning outcome can be enhanced and supported by technologies,” said panelist Jennifer Sparrow, “the focus stays on the learner and their learning.”

  • Technology should be aligned to specific learning goals.

Every recommendation in this guide is based on one idea: finding ways to use technology to engage students and enhance their learning experiences. Technology can engage students more deeply in learning content, activate their learning processes, and provide the social connections that are key to succeeding in college and beyond. To do this effectively, any use of technology suggested in this guide must be aligned with learning goals or objectives. “Technology is not just a tool,” said Panel Chair Nada Dabbagh. “Rather, technology has specific affordances that must be recognized to use it effectively for designing learning interactions. Aligning technology affordances with learning outcomes and instructional goals is paramount to successful learning designs.”

  • Pay attention to potential issues of accessibility.

The Internet is ubiquitous, but many households—particularly low-income households, households of recent immigrants, and those in rural communities—may not be able to afford or otherwise access digital communications. Course materials that rely heavily on Internet access may put these students at a disadvantage. “Colleges and universities making greater use of online education need to know who their students are and what access they have to technology,” said panelist Anthony Picciano. “This practice guide makes abundantly clear that colleges and universities should be careful not to be creating digital divides.”

Instructional designers must also ensure that learning materials on course websites and course/learning management systems can accommodate students who are visually and/or hearing impaired. “Technology can greatly enhance access to education both in terms of reaching a wide student population and overcoming location barriers and in terms of accommodating students with special needs,” said Dabbagh. “Any learning design should take into consideration the capabilities and limitations of technology in supporting a diverse and inclusive audience.”

  • Technology deployments may require significant investment and coordination.

Implementing any new intervention takes training and support from administrators and teaching and learning centers. That is especially true in an environment where resources are scarce. “In reviewing the studies for this practice guide,” said Picciano, “it became abundantly clear that the deployment of technology in our colleges and universities has evolved into a major administrative undertaking. Careful planning that is comprehensive, collaborative, and continuous is needed.”

“Hardware and software infrastructure, professional development, academic and student support services, and ongoing financial investment are testing the wherewithal of even the most seasoned administrators,” said Picciano. “Yet the dynamic and changing nature of technology demands that new strategies be constantly evaluated and modifications made as needed.”

These decisions are never easy. “Decisions need to be made,” said Sparrow, “about investment cost versus opportunity cost. Additionally, when a large investment in a technology has been made, it should not be without investment in faculty development, training, and support resources to ensure that faculty, staff, and students can take full advantage of it.”

  • Rigorous research is limited and more is needed.

Despite technology’s ubiquity in college settings, rigorous research on the effects of technological interventions on student outcomes is rather limited. “It’s problematic,” said Bishop, “that research in the instructional design/educational technology field has been so focused on things, such as technologies, theories, and processes, rather than on the problems we’re trying to solve with those things, such as developing critical thinking, enhancing knowledge transfer, and addressing individual differences. It turns out to be very difficult to cross-reference the instructional design/educational technology literature with the questions the broader field of educational research is trying to answer.”

More rigorous research is needed on new technologies and how best to support instructors and administrators in using them. “For experienced researchers as well as newcomers,” said Picciano, “technology in postsecondary teaching and learning is a fertile ground for further inquiry and investigation.”

Readers of this practice guide are encouraged to adapt the advice provided to the varied contexts in which they work. The five themes discussed above serve as a lens to help readers approach the guide and decide whether and how to implement some or all of the recommendations.

Download Using Technology to Support Postsecondary Student Learning from the What Works Clearinghouse website at https://ies.ed.gov/ncee/wwc/PracticeGuide/25.


What Works in STEM Education: Resources for National STEM Day, 2018

Are you celebrating National STEM Day this November 8th by learning more about how to improve student achievement in Science, Technology, Engineering, and Mathematics (STEM)? If so, the Institute of Education Sciences’ (IES’s) What Works Clearinghouse has great resources for educators who want information about the latest evidence-based practices in supporting learners of all ages.

  • Focused on math? If so, check out Improving Mathematical Problem Solving in Grades 4 Through 8. Based on 38 rigorous studies conducted over 20 years, this practice guide includes five recommendations that teachers, math coaches, and curriculum developers can use to improve students’ mathematical problem-solving skills. There’s strong evidence that assisting students in monitoring and reflecting on the problem-solving process and teaching students how to use visual representations (e.g., tables, graphs, and number lines) can improve achievement. Other practice guides focus on Teaching Math to Young Children and Teaching Strategies for Improving Algebra Knowledge in Middle and High School Students.

  • Don’t worry, we won’t leave science out! Encouraging Girls in Math and Science includes five evidence-based recommendations that both classroom teachers and other school personnel can use to encourage girls to choose career paths in math- and science-related fields. A handy 20-point checklist provides suggestions for how those recommendations can be incorporated into daily practice, such as “[teaching] students that working hard to learn new knowledge leads to improved performance” and “[connecting] mathematics and science activities to careers in ways that do not reinforce existing gender stereotypes of those careers.”

  • Looking for specific curricula or programs for encouraging success in STEM? If so, check out the What Works Clearinghouse’s Intervention Reports in Math and Science. Intervention reports are summaries of findings from high-quality research on a given educational program, practice, or policy. There are currently more than 200 intervention reports that include at least one math or science related outcome. (And nearly 600 in total!)

  • Maybe you just want to see the research we’ve reviewed? You can! The What Works Clearinghouse’s Reviews of Individual Studies Database includes nearly 11,000 citations across a wide range of topics, including STEM. Type in your preferred search term and you’re off—from algebra to zoology, we’ve got you covered!

We hope you’ll visit us on November 8th and learn more about evidence-based practices in STEM education. And with practice guides, intervention reports, and individual studies spanning topics from Early Childhood to Postsecondary education and everything in between, we hope you’ll come back whenever you are looking for high-quality research to answer the question “What works in education?”

The WWC Evidence Standards: A Valuable and Accessible Resource for Teaching Validity Assessment of Causal Inferences to Identify What Works

by Herbert Turner, Ph.D., President and Principal Scientist, ANALYTICA, Inc.


The WWC Evidence Standards (hereafter, the Standards) provide a detailed description of the criteria the WWC uses to review studies. The Standards were first developed in 2002 by leading methodological researchers using initial concepts from the Study Design and Implementation Assessment Device (DIAD), an instrument for assessing whether the methodological characteristics and implementation of social science research support drawing inferences about causal relationships (Boruch, 1997; Valentine & Cooper, 2008). During the past 16 years, the Standards have gone through four iterations of improvement, each rigorously peer reviewed, to keep pace with advances in methodological practice. The most recent iteration is codified in the WWC Standards Handbook 4.0 (hereafter, the Handbook).

Across the different versions of the Handbook, the methodological characteristics of an internally valid study, one designed to support causal inferences about the effect of an intervention on an outcome, have stood the test of time. These characteristics can be summarized as follows: a strong design starts with how the study groups are formed; it uses reliable and valid measures of outcomes, has low attrition if it is a randomized controlled trial (RCT), shows baseline equivalence in the analysis sample if it is a quasi-experimental design (QED), and has no confounds.
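To see how these elements interact, here is a minimal sketch, in code, of how they might combine into a study rating. It is a simplified paraphrase of the Handbook's group-design logic, not the WWC's actual review tool; the real attrition and baseline-equivalence requirements are far more detailed, and all names below are illustrative.

```python
# Simplified, illustrative sketch of WWC-style group-design review logic.
# The actual Standards Handbook 4.0 rules are far more detailed (e.g., the
# attrition boundary depends on both overall and differential attrition).
from dataclasses import dataclass

@dataclass
class GroupDesignStudy:
    is_rct: bool                   # randomized controlled trial vs. QED
    low_attrition: bool            # within the attrition boundary (RCTs)
    baseline_equivalence: bool     # demonstrated in the analysis sample
    valid_reliable_outcomes: bool  # outcome measures are reliable and valid
    no_confounds: bool             # no factor aligned with a single group

def rate_study(study: GroupDesignStudy) -> str:
    """Return a simplified WWC-style rating for a group-design study."""
    if not (study.valid_reliable_outcomes and study.no_confounds):
        return "Does Not Meet WWC Standards"
    if study.is_rct and study.low_attrition:
        return "Meets WWC Standards Without Reservations"
    # High-attrition RCTs and QEDs can still qualify, with reservations,
    # if they demonstrate baseline equivalence in the analysis sample.
    if study.baseline_equivalence:
        return "Meets WWC Standards With Reservations"
    return "Does Not Meet WWC Standards"

print(rate_study(GroupDesignStudy(True, True, True, True, True)))
# -> Meets WWC Standards Without Reservations
```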


These elements are the critical components of any strong research design – and are the cornerstones of all versions of the WWC’s standards. That fact, along with the transparent description of their logical underpinning, is what motivated me to use Standards 4.0 (for Group Designs) as the organizing framework for understanding study validity in a graduate-level Program Evaluation II course I taught at Boston College’s Lynch School of Education.


In spring 2017, nine master’s and four doctoral students participated in this semester-long course. The primary goal was to teach students how to organize their thinking and logically derive internal validity criteria using Standards 4.0—augmented with additional readings from the methodological literature. Students used the Standards (along with the supplemental readings) to design, implement, analyze, and report impact evaluations to determine what interventions work, harm, or have no discernible effect (Mosteller & Boruch, 2002). The Standards Handbook 4.0, along with online course modules, was an excellent resource for augmenting the lectures and providing Lynch School students with hands-on learning.

At the end of the course, students could choose to complete the WWC Certification Exam for Group Design or take a final exam developed by the instructor. All thirteen students chose the WWC Certification Exam, and approximately half became certified. Many emailed me personally to express their appreciation for (1) the opportunity to learn a systematic approach to organizing their thinking about assessing the validity of causal inferences drawn from data generated by RCTs and QEDs, and (2) the chance to develop design skills that can be used in other graduate courses and beyond. The WWC Evidence Standards and related online resources are a valuable, accessible, and free resource that has been rigorously vetted for close to two decades. The Standards have few equals as a resource for helping students think systematically, logically, and clearly about designing (and evaluating) a valid research study to make causal inferences about what interventions work in education and related fields.

References

Boruch, R. F. (1997). Randomized experiments for planning and evaluation: A practical guide. Thousand Oaks, CA: Sage Publications.

Mosteller, F., & Boruch, R. F. (2002). Evidence matters: Randomized trials in education research. Washington, DC: Brookings Institution Press.

Valentine, J. C., & Cooper, H. (2008). A systematic and transparent approach for assessing the methodological quality of intervention effectiveness research: The Study Design and Implementation Assessment Device (Study DIAD). Psychological Methods, 13(2), 130-149.