NCEE Blog

National Center for Education Evaluation and Regional Assistance

Leading experts provide evidence-based recommendations on using technology to support postsecondary student learning

By Michael Frye and Sarah Costelloe. Both are part of the Abt Associates team working on the What Works Clearinghouse.

Technology is part of almost every aspect of college life. Colleges use technology to improve student retention, offer active and engaging learning, and help students become more successful learners. The What Works Clearinghouse’s latest practice guide, Using Technology to Support Postsecondary Student Learning, offers several evidence-based recommendations to help higher education instructors, instructional designers, and administrators use technology to improve student learning outcomes.

IES practice guides incorporate research, practitioner experience, and expert opinions from a panel of nationally recognized experts. The panel that developed Using Technology to Support Postsecondary Student Learning included five experts with many years of experience leading the adoption, use, and research of technology in postsecondary classrooms. Together, guided by Abt Associates’ review of the rigorous research on the topic, the guide offers five evidence-based recommendations:

Practice Recommendations:

1. Use communication and collaboration tools to increase interaction among students and between students and instructors (minimal evidence).
2. Use varied, personalized, and readily available digital resources to design and deliver instructional content (moderate evidence).
3. Incorporate technology that models and fosters self-regulated learning strategies (moderate evidence).
4. Use technology to provide timely and targeted feedback on student performance (moderate evidence).
5. Use simulation technologies that help students engage in complex problem-solving (minimal evidence).

 

Each recommendation is assigned an evidence level of minimal, moderate, or strong. The level of evidence reflects how well the research demonstrates the effectiveness of the recommended practices. For an explanation of how levels of evidence are determined, see the Practice Guide Level of Evidence Video.   The evidence-based recommendations also include research-based strategies and examples for implementation in postsecondary settings. Together, the recommendations highlight five interconnected themes that the practice guide’s authors suggest readers consider:

  • Focus on how technology is used, not on the technology itself.

“The basic act of teaching has actually changed very little by the introduction of technology into the classroom,” said panelist MJ Bishop, “and that’s because simply introducing a new technology changes nothing unless we first understand the need it is intended to fill and how to capitalize on its unique capabilities to address that need.” Because technology evolves rapidly, understanding specific technologies is less important than understanding how technology can be used effectively in college settings. “By understanding how a learning outcome can be enhanced and supported by technologies,” said panelist Jennifer Sparrow, “the focus stays on the learner and their learning.”

  • Technology should be aligned to specific learning goals.

Every recommendation in this guide is based on one idea: finding ways to use technology to engage students and enhance their learning experiences. Technology can engage students more deeply in learning content, activate their learning processes, and provide the social connections that are key to succeeding in college and beyond. To do this effectively, any use of technology suggested in this guide must be aligned with learning goals or objectives. “Technology is not just a tool,” said Panel Chair Nada Dabbagh. “Rather, technology has specific affordances that must be recognized to use it effectively for designing learning interactions. Aligning technology affordances with learning outcomes and instructional goals is paramount to successful learning designs.”

  • Pay attention to potential issues of accessibility.

The Internet is ubiquitous, but many households—particularly low-income households, those of recent immigrants, and those in rural communities—may not be able to afford or otherwise access digital communications. Course materials that rely heavily on Internet access may put these students at a disadvantage. “Colleges and universities making greater use of online education need to know who their students are and what access they have to technology,” said panelist Anthony Picciano. “This practice guide makes abundantly clear that colleges and universities should be careful not to be creating digital divides.”

Instructional designers must also ensure that learning materials on course websites and course/learning management systems can accommodate students who are visually and/or hearing impaired. “Technology can greatly enhance access to education both in terms of reaching a wide student population and overcoming location barriers and in terms of accommodating students with special needs,” said Dabbagh. “Any learning design should take into consideration the capabilities and limitations of technology in supporting a diverse and inclusive audience.”

  • Technology deployments may require significant investment and coordination.

Implementing any new intervention takes training and support from administrators and teaching and learning centers. That is especially true in an environment where resources are scarce. “In reviewing the studies for this practice guide,” said Picciano, “it became abundantly clear that the deployment of technology in our colleges and universities has evolved into a major administrative undertaking. Careful planning that is comprehensive, collaborative, and continuous is needed.”

“Hardware and software infrastructure, professional development, academic and student support services, and ongoing financial investment are testing the wherewithal of even the most seasoned administrators,” said Picciano. “Yet the dynamic and changing nature of technology demands that new strategies be constantly evaluated and modifications made as needed.”

These decisions are never easy. “Decisions need to be made,” said Sparrow, “about investment cost versus opportunity cost. Additionally, when a large investment in a technology has been made, it should not be without investment in faculty development, training, and support resources to ensure that faculty, staff, and students can take full advantage of it.”

  • Rigorous research is limited and more is needed.

Despite technology’s ubiquity in college settings, rigorous research on the effects of technological interventions on student outcomes is rather limited. “It’s problematic,” said Bishop, “that research in the instructional design/educational technology field has been so focused on things, such as technologies, theories, and processes, rather than on the problems we’re trying to solve with those things, such as developing critical thinking, enhancing knowledge transfer, and addressing individual differences. It turns out to be very difficult to cross-reference the instructional design/educational technology literature with the questions the broader field of educational research is trying to answer.”

More rigorous research is needed on new technologies and how best to support instructors and administrators in using them. “For experienced researchers as well as newcomers,” said Picciano, “technology in postsecondary teaching and learning is a fertile ground for further inquiry and investigation.”

Readers of this practice guide are encouraged to adapt the advice provided to the varied contexts in which they work. The five themes discussed above serve as a lens to help readers approach the guide and decide whether and how to implement some or all of the recommendations.

Download Using Technology to Support Postsecondary Student Learning from the What Works Clearinghouse website at https://ies.ed.gov/ncee/wwc/PracticeGuide/25.

 

Sharing strategies to increase research-based educational practices

By Cora Goldston, REL Midwest

Highlighted Resources

How can states, districts, and schools identify effective practices to address challenges and achieve their goals? Education research can point the way, but sometimes finding and accessing relevant research can be a frustrating and time-consuming process. And even when practitioners can find research, it can be difficult to determine a study’s rigor and the strength of research evidence supporting interventions.

Equipping practitioners to use research evidence

Through the Midwest Alliance to Improve Knowledge Utilization (MAIKU), the Regional Educational Laboratory (REL) Midwest is partnering with practitioners to help states, districts, and schools use research to inform practice. The goal is to make it easier for educators to find research relevant to their priorities, assess the level of evidence that supports potential practices, and implement those practices that are based on strong evidence.

REL Midwest and MAIKU are supporting the use of research in education practice in several ways. For example, REL Midwest provided coaching sessions for the Ohio Department of Education (ODE) on understanding the Every Student Succeeds Act (ESSA) tiers of evidence. In addition, REL Midwest created a crosswalk that shows how the ESSA evidence tiers align with ratings from research clearinghouses, such as the What Works Clearinghouse. In turn, ODE is using this information to help Ohio districts that are applying for Striving Readers grants. To receive the grants, districts must demonstrate that they plan to use research-based practices to improve student literacy. As a result of REL Midwest’s support, ODE has strengthened its capacity to help districts determine the level of evidence supporting certain practices and, thus, to submit stronger grant applications.
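
To make the alignment concrete, here is a simplified, unofficial Python sketch of the kind of mapping such a crosswalk captures. The rating and tier labels are paraphrased and the mapping is deliberately coarse; ODE's actual crosswalk document and the ESSA statute remain the authoritative sources.

```python
# An illustrative (unofficial) sketch of the alignment a crosswalk like this
# captures, mapping WWC study ratings to the highest ESSA tier a single study
# could support. Labels are paraphrased; the real crosswalk is authoritative.
ESSA_TIER_BY_WWC_RATING = {
    # Tier 1 ("strong") expects a well-designed, well-implemented
    # experimental study with positive findings.
    "Meets WWC standards without reservations": "Tier 1: strong evidence",
    # Tier 2 ("moderate") can rest on a quasi-experimental study.
    "Meets WWC standards with reservations": "Tier 2: moderate evidence",
    # Studies that do not meet WWC standards may still be weighed against
    # the lower ESSA tiers, which the WWC does not rate.
    "Does not meet WWC standards": "Consider Tier 3 (promising) criteria separately",
}

print(ESSA_TIER_BY_WWC_RATING["Meets WWC standards with reservations"])
```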

REL Midwest is providing similar support across the region. In Michigan, we are conducting coaching sessions for the state Department of Education to help agency leadership choose priorities from the state’s Top 10 in 10 plan, identify research-based practices that support those priorities, and collaborate to implement new state-level practices. In Wisconsin, REL Midwest hosted a training series for the Department of Public Instruction to increase the agency’s capacity to collect, analyze, and use data to adjust state-level policies and practices. And in Illinois, REL Midwest is holding a training series for the State Board of Education on research methods, data collection, and data analysis and how to use the findings to inform agency practices.

June webinar on increasing evidence use

MAIKU is also working with researchers to support evidence use in education practice. On June 19, 2018, REL Midwest and MAIKU hosted a webinar to discuss how researchers can share evidence with practitioners in useful and accessible ways.

The webinar featured a presentation by Alan J. Daly, Ph.D., of the University of California at San Diego, and Kara Finnigan, Ph.D., of the University of Rochester. Dr. Daly and Dr. Finnigan discussed how information-sharing networks are structured among school and district staff and the challenges for practitioners in accessing and using research-based practices.   

Building on this context, Dr. Daly and Dr. Finnigan shared insights about the most effective ways to maximize the reach of research. One of their key findings is that the pattern of people’s social ties makes a difference for sharing and using research-based practices. Finnigan and Daly noted that the set of relationships we have can increase access to research evidence if the right ties are present but can constrain access to resources when those ties are not present. The quality of relationships also matters; high levels of trust are essential for more in-depth exchanges of information. The takeaway: fostering both the quantity and quality of social relationships is important for sharing research evidence.  

During the webinar, Jaime Singer, senior technical assistance consultant at REL Midwest, also shared actionable strategies that researchers can use to support evidence use in practice, including training and coaching sessions, checklists, blog posts, and clearinghouses of effective practices.

The webinar included a panel discussion about REL Midwest’s ESSA evidence tiers coaching sessions and crosswalk for ODE. REL Midwest researcher Lyzz Davis, Ph.D., provided a researcher perspective on developing resources to meet ODE’s needs. Heather Boughton, Ph.D., and Melissa Weber-Mayrer, Ph.D., at ODE provided practitioner perspectives on how REL Midwest’s work has strengthened the agency’s capacity to help districts find and use evidence-based interventions.

Looking for evidence outside of the scope of the WWC?

by Chris Weiss and Erin Pollard, What Works Clearinghouse

The What Works Clearinghouse (WWC) strives to be a central and trusted source of research evidence for what works in education. But did you know that the WWC is one of several repositories of evidence produced by the federal government? Our mission at the WWC is to review the existing research on different programs, products, practices, and policies in education to provide educators with the information they need to make evidence-based decisions. However, there are several other government repositories that review evidence on interventions that impact children and schools, reviews that may be of use and interest to WWC users.

 

Different Clearinghouses for Different Needs.

The missions of the different clearinghouses, and the reasons for their different reviews, stem from the unique mission of each agency and the specific focus of each clearinghouse. The Department of Education focuses primarily on prekindergarten through postsecondary education; however, many public health and crime prevention programs are implemented through schools. So, for example, you would find information about a school-based bullying prevention program on the National Institute of Justice’s CrimeSolutions website. The WWC would not review the evidence of this program’s effectiveness because its aim is to reduce bullying and victimization rather than to improve education-focused outcomes.

 

Some interventions are reviewed by multiple clearinghouses.

Users are often surprised that an intervention might be reviewed by multiple clearinghouses. For example, the WWC reviewed the evidence and created an intervention report on Career Academies, a school-within-a-school program in which students take both career-related and academic courses and acquire work experience. But reviews of the program are included in other clearinghouses. The Department of Labor’s CLEAR reviewed the study because the intervention increased students’ earnings. Similarly, the National Institute of Justice’s CrimeSolutions reviewed the intervention because it increased the earnings of young men – an economic factor linked to a lowered risk of criminal activity. Each clearinghouse looked at different outcomes from the same study to highlight the domains most relevant to achieving its goal.

 

Each repository is different. The WWC may be your best bet – or others may fit your needs better.

We encourage users to look at the other clearinghouses to find information on outcomes that are outside of our scope. These sites have a lot of great information to offer. Here is a list of the other repositories for finding evidence:

  • Clearinghouse for Labor Evaluation and Research (CLEAR) – Department of Labor. CLEAR's mission is to make research on labor topics more accessible to practitioners, policymakers, researchers, and the public more broadly so that it can inform their decisions about labor policies and programs. CLEAR identifies and summarizes many types of research, including descriptive statistical studies, outcome analyses, implementation studies, and causal impact studies.
  • Compendium of Evidence-Based Interventions and Best Practices for HIV Prevention – Centers for Disease Control and Prevention. Evidence-based interventions and best practices in the Compendium are identified by CDC’s Prevention Research Synthesis (PRS) Project through a series of ongoing systematic reviews. Each eligible intervention is evaluated against explicit a priori criteria and has shown sufficient evidence that it works. Interventions may fall into one or more chapters: Risk Reduction, which covers PrEP-related outcomes and outcomes such as injection drug use, condom use, and HIV/STD/hepatitis infection; Linkage to, Retention in, and Re-engagement in HIV Care, which covers outcomes such as entering and staying in HIV care; Medication Adherence, which covers outcomes such as adherence to HIV medication and HIV viral load; and the most recently added Structural Interventions, which covers outcomes such as HIV testing, social determinants of health, and stigma. Information sheets are available for all identified evidence-based interventions and best practices on the PRS Compendium website.
  • CrimeSolutions – National Institute of Justice, Department of Justice. The clearinghouse, accessible via the CrimeSolutions.gov website, presents programs and practices that have undergone rigorous evaluations and meta-analyses. The site assesses the strength of the evidence about whether these programs achieve criminal justice, juvenile justice, and crime victim services outcomes in order to inform practitioners and policymakers about what works, what doesn't, and what's promising.
  • Evidence Exchange - Corporation for National and Community Service. A digital repository of sponsored research, evaluation reports, and data. These resources focus on national service, volunteering, and civic engagement.
  • Home Visiting Evidence of Effectiveness (HomVEE) – Administration for Children and Families, Department of Health and Human Services. HomVEE provides an assessment of the evidence of effectiveness for home visiting models that target families with pregnant women and children from birth to kindergarten entry (that is, up through age 5).
  • Teen Pregnancy Prevention (TPP) Evidence Review – Department of Health and Human Services. A transparent systematic review of the teen pregnancy prevention literature to identify programs with evidence of effectiveness in reducing teen pregnancy, sexually transmitted infections, and associated sexual risk behaviors.
  • The Community Guide - Community Preventive Services Task Force (CPSTF). A collection of evidence-based findings to help you select interventions to improve health and prevent disease in your state, community, community organization, business, healthcare organization, or school. The CPSTF issues findings based on systematic reviews of effectiveness and economic evidence that are conducted with a methodology developed by the CPSTF.
  • youth.gov – Interagency. The youth.gov Program Directory features evidence-based programs whose purpose is to prevent and/or reduce delinquency or other problem behaviors in young people.

The WWC Evidence Standards: A Valuable and Accessible Resource for Teaching Validity Assessment of Causal Inferences to Identify What Works

by Herbert Turner, Ph.D., President and Principal Scientist, ANALYTICA, Inc.

 

The WWC Evidence Standards (hereafter, the Standards) provide a detailed description of the criteria used by the WWC to review studies. The standards were first developed in 2002 by leading methodological researchers using initial concepts from the Study Design and Implementation Assessment Device (DIAD), an instrument for assessing the correspondence between the methodological characteristics and implementation of social science research and using this research to draw inferences about causal relationships (Boruch, 1997; Valentine and Cooper, 2008).  During the past 16 years, the Standards have gone through four iterations of improvement, to keep pace with advances in methodological practice, and have been through rigorous peer review. The most recent of these is now codified in the WWC Standards Handbook 4.0 (hereafter, the Handbook).

 

Across the different versions of the Handbook, the methodological characteristics of an internally valid study, designed to causally infer the effect of an intervention on an outcome, have stood the test of time. These characteristics can be summarized as follows: A strong design starts with how the study groups are formed. It continues with use of reliable and valid measures of outcomes, has low attrition if a randomized controlled trial (RCT), shows baseline equivalence (in the analysis sample) if a quasi-experimental design (QED), and has no confounds.
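
As a rough illustration of these checks, the sketch below computes attrition rates for an RCT and a standardized baseline difference for a QED analytic sample. The baseline-equivalence thresholds follow the Standards Handbook 4.0; the liberal/conservative boundary that combines overall and differential attrition is omitted, so treat this as a teaching sketch rather than the WWC's review tool.

```python
# A teaching sketch (not the WWC's review tool) of two checks described above:
# sample attrition for an RCT and baseline equivalence for a QED. The baseline
# thresholds follow the WWC Standards Handbook 4.0; the attrition boundary
# table that combines the two rates is omitted for brevity.

def attrition_rates(randomized_t, analyzed_t, randomized_c, analyzed_c):
    """Overall and differential attrition for a two-group RCT."""
    attr_t = 1 - analyzed_t / randomized_t
    attr_c = 1 - analyzed_c / randomized_c
    overall = 1 - (analyzed_t + analyzed_c) / (randomized_t + randomized_c)
    differential = abs(attr_t - attr_c)
    return round(overall, 3), round(differential, 3)

def baseline_equivalence(mean_t, mean_c, pooled_sd):
    """Standardized baseline difference between the analytic samples."""
    g = round((mean_t - mean_c) / pooled_sd, 3)
    if abs(g) <= 0.05:
        return g, "satisfied"
    if abs(g) <= 0.25:
        return g, "requires statistical adjustment"
    return g, "not satisfied"

print(attrition_rates(200, 170, 200, 180))   # (0.125, 0.05)
print(baseline_equivalence(50.2, 49.8, 10))  # (0.04, 'satisfied')
```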

 

These elements are the critical components of any strong research design – and are the cornerstones of all versions of the WWC’s standards. That fact, along with the transparent description of their logical underpinning, is what motivated me to use Standards 4.0 (for Group Designs) as the organizing framework for understanding study validity in a graduate-level Program Evaluation II course I taught at Boston College’s Lynch School of Education.

 

In spring 2017, nine master's and four doctoral students participated in this semester-long course. The primary goal was to teach students how to organize their thinking and logically derive internal validity criteria using Standards 4.0, augmented with additional readings from the methodological literature. Students used the Standards (along with the supplemental readings) to design, implement, analyze, and report impact evaluations to determine which interventions work, harm, or have no discernible effect (Mosteller and Boruch, 2002). The Standards Handbook 4.0, along with the online course modules, was an excellent resource to augment the lectures and provide Lynch School students with hands-on learning.

 

At the end of the course, students were offered the choice of completing the WWC Certification Exam for Group Designs or taking the instructor-developed final exam. All thirteen students chose to complete the WWC Certification Exam, and approximately half became certified. Many emailed me personally to express their appreciation for (1) the opportunity to learn a systematic approach to organizing their thinking about assessing the validity of causal inferences drawn from data generated by RCTs and QEDs, and (2) the chance to develop design skills that can be used in other graduate courses and beyond. The WWC Evidence Standards and related online resources are a valuable, accessible, and free resource that has been rigorously vetted for close to two decades. The Standards have few equals as a resource for helping students think systematically, logically, and clearly about designing (and evaluating) a valid research study to make causal inferences about what interventions work in education and related fields.

 

References

Boruch, R. F. (1997). Randomized experiments for planning and evaluation: A practical guide. Thousand Oaks, CA: Sage Publications.

Mosteller, F., & Boruch, R. F. (2002). Evidence matters: Randomized trials in education research. Washington, DC: Brookings Institution Press.

Valentine, J. C., & Cooper, H. (2008). A systematic and transparent approach for assessing the methodological quality of intervention effectiveness research: The Study Design and Implementation Assessment Device (Study DIAD). Psychological Methods, 13(2), 130–149.

Making the WWC Open to Everyone by Moving WWC Certification Online

In December 2016, the What Works Clearinghouse made a version of its online training publicly available through the WWC website. This enabled everyone to access the Version 3.0 Group Design Standards reviewer training and learn about the standards and methods that the WWC uses. While this was a great step toward increasing access to WWC resources, users still had to go through a 1.5-day, in-person training to become a WWC certified reviewer.

To continue our efforts to promote access and transparency and make our resources available to everyone, the WWC has now moved all of its group design training online. Now everyone has access to the same training and certification tests. This certification is available free of charge and is open to all users. It is our hope that this effort will increase the number of certified reviewers and help raise general awareness of the WWC.

Why did the WWC make these resources publicly available? As part of IES's effort to increase access to high-quality education research, we wanted to make it easier for researchers to use our standards. That meant opening up training opportunities, and offering training online was a way to achieve this goal while using limited taxpayer resources most efficiently.

The online training consists of 9 modules. These videos feature an experienced WWC instructor and use the same materials that we used in our in-person courses, but adapted to Version 4.0 of the Group Design Standards. After completing the modules, users will have the opportunity to download a certificate of completion, take the online certification test, or go through the full certification exam.

Becoming a fully certified reviewer requires users to take a multiple-choice online certification test and then use the new Online SRG application to conduct a full review using the same tools that the WWC team uses. The WWC team will then grade your exam to make sure you fully understand how to apply the Standards before certifying you to review for the Clearinghouse.

Not interested in becoming a certified reviewer? The online training still has several benefits. Educators can embed our videos in their course websites and use our training materials in their curricula. Researchers can use our Online SRG tool with their publications to determine a preliminary rating and understand what factors could cause their study to receive the highest rating. They could also use the tool when conducting a systematic evidence review.

Have ideas for new resources we could make available? Email your ideas and suggestions to Contact.WWC@ed.gov!

by Erin Pollard, WWC Project Officer

 

Improving the WWC Standards and Procedures

By Chris Weiss and Jon Jacobson

For the What Works Clearinghouse (WWC), standards and procedures are at the foundation of the WWC’s work to provide scientific evidence for what works in education. They guide how studies are selected for review, what elements of an effectiveness study are examined, and how systematic reviews are conducted. The WWC’s standards and procedures are designed to be rigorous and reflective of best practices in research and statistics, while also being aspirational to help point the field of education effectiveness research toward an ever-higher quality of study design and analysis.

To keep pace with new advances in methodological research and provide necessary clarifications for both education researchers and decision makers, the WWC regularly updates its procedures and standards and shares them with the field. We recently released Version 4.0 of the Procedures and Standards Handbooks, which describes the five steps of the WWC’s systematic review process.

For this newest version, we have divided information into two separate documents. The Procedures Handbook describes how the WWC decides which studies to review and how it reports on study findings. The Standards Handbook describes how the WWC rates the evidence from studies.

The new Standards Handbook includes several improvements, including updated and overhauled standards for cluster-level assignment of students; a new approach for reviewing studies that have some missing baseline or outcome data; and revised standards for regression discontinuity designs. The new Procedures Handbook includes a revised discussion of how the WWC defines a study.  All of the changes are summarized on the WWC website (PDF).

Making the Revisions

These updates were developed in a careful, collaborative manner that included experts in the field, external peer review, and input from the public.

Staff from the Institute of Education Sciences oversaw the process with the WWC’s Statistical, Technical, and Analysis Team (STAT), a panel of highly experienced researchers who revise and develop the WWC standards. In addition, the WWC sought and received input from experts on specific research topics, including regression discontinuity designs, cluster-level assignment, missing data, and complier average causal effects. Based on this information, drafts of the standards and procedures handbooks were developed.

External peer reviewers then provided input that led to additional revisions and, in the summer, the WWC posted drafts and gathered feedback from the public. The WWC’s response to some of the comments is available on its website (PDF).   

Version 4.0 of the Handbooks was released on October 26. This update focused on a few key areas of the standards and updated and clarified some procedures. However, the WWC strives for continuous improvement, and as the field of education research continues to evolve, we expect new techniques and new tools to be incorporated into future versions of the Handbooks.

Your thoughts, ideas, and suggestions are welcome and can be submitted through the WWC help desk.

The What Works Clearinghouse Goes to College

By Vanessa Anderson, Research Scientist, NCEE

The What Works Clearinghouse (WWC) was founded in 2002 and, in its first decade, focused mainly on reviewing studies of programs, policies, products and practices—or interventions—for improving student outcomes in pre-K, elementary and secondary schools. But in 2012, the WWC broadened its focus and has been using rigorous standards to review studies of interventions designed to increase the success of students in postsecondary education.

This week, the WWC launches a new topic—Supporting Postsecondary Success—and it is a good time to look at the work we’re doing, and will do, in the postsecondary area. 

The WWC postsecondary topic area includes reviews of studies on a wide range of interventions, including learning communities, summer bridge programs, multi-faceted support programs, academic mentoring, and interventions that aim to reduce performance anxiety. As of today, 294 postsecondary studies have been reviewed by the WWC. Those reviews are summarized in six Intervention Reports, 25 Single Study Reviews, and four Quick Reviews. And there’s much more in the works!  For instance, a WWC Educator’s Practice Guide that includes strategies for supporting students in developmental education is planned for publication later this year. (Learn more about Practice Guides)

Identifying Studies for Review

In the postsecondary topic area, there are currently three main ways that studies are identified by the WWC for review.

The first is studies that are reviewed for WWC Intervention Reports. All WWC Intervention Reports use a systematic review process to summarize evidence from all available studies on a given intervention. The WWC conducts a broad search for all publicly available studies of interventions that are related to the topic. This process often identifies hundreds of studies for review. The effectiveness studies are then reviewed against WWC standards. Only the highest quality studies are summarized in an Intervention Report.

We released two new intervention reports this week as part of our new Supporting Postsecondary Success topic. You can view the new Intervention Reports on Summer Bridge programs and first-year experience courses on the WWC website.

The second way that studies are reviewed by the WWC is through Quick Reviews, which are performed on studies that have received a great deal of media attention. In these reports, the WWC provides a brief description of the study, the author-reported results, and a study rating. We like to think of Quick Reviews as a way to help people decide whether to fully believe the results of a study, based on the research design and how the study was conducted. For example, we released a quick review earlier this month that focused on a study of computer usage and student outcomes for a class at the U.S. Military Academy at West Point.

Finally, the WWC reviews postsecondary studies submitted as supporting evidence for discretionary grant competitions funded by the U.S. Department of Education, such as the Strengthening Institutions Program, First in the World and TRIO Student Support Services. These grant competitions require applicants to submit studies as evidence of the effectiveness of the interventions they propose to implement. The WWC reviews these studies and includes the results of those reviews in our database.

If you want to see all the studies on postsecondary interventions that have been reviewed by WWC you can check out—and download—the Reviewed Studies Database. In the “Topic Areas” dropdown menu, just select “Postsecondary,” and then easily customize the search by rating, publication type, and/or reasons for the review (such as a grant competition).  
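
Because the full database can be downloaded, the same search can also be scripted. Below is a hypothetical pandas sketch; the file name and column labels are illustrative placeholders rather than the actual export schema.

```python
# A hypothetical sketch of filtering the downloaded Reviewed Studies Database
# with pandas. The file name and column labels below are illustrative
# placeholders, not the actual export schema.
import pandas as pd

df = pd.read_csv("wwc_reviewed_studies.csv")  # hypothetical file name

# Keep postsecondary studies that met WWC standards at either level.
postsecondary = df[df["Topic Area"] == "Postsecondary"]  # hypothetical column
meets = postsecondary[postsecondary["Study Rating"].isin([  # hypothetical column
    "Meets WWC standards without reservations",
    "Meets WWC standards with reservations",
])]

print(len(meets), "postsecondary studies meeting WWC standards")
```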

For more information, visit the WWC postsecondary topic area on the website. To stay up-to-date on WWC news, information, and products, follow us on Facebook, Twitter and sign up for the WWC newsflash!

Five Reasons to Visit the What Works Clearinghouse

By Diana McCallum, Education Research Analyst, What Works Clearinghouse

It’s been more than a decade since the first What Works Clearinghouse reports were released, and we have a wealth of information and resources that can help educators and leaders make evidence-based decisions about teaching and learning. Since 2005, the WWC has assessed more than 11,500 education studies using rigorous standards and has published hundreds of resources and guides across many content areas.

The WWC website has already received more than 1.7 million page views this year, but if you haven’t visited whatworks.ed.gov lately, here are five reasons you might want to click over:

1) We are always adding new and updated reviews. Multiple claims about programs that work can be overwhelming, and people often lack time to sift through piles of research. That’s where the WWC comes in. We provide an independent, objective assessment of education research. For example, we have intervention reports that summarize all of the existing research on a given program or practice, which educators can use to help inform their choices. In addition, when a new education study grabs headlines, the WWC develops a quick review that provides our take on the evidence presented to let you know whether the study is credible. In 2015, we added 43 publications to the WWC, and we’re adding more every month this year.

2) We’ve expanded our reach into the postsecondary area. In late 2012, the WWC expanded its focus to include reviews of studies in the postsecondary area to capture emerging research on a range of topics, from the transition to college to postsecondary success. To date, the WWC has reviewed over 200 studies on postsecondary programs and interventions, and this area continues to grow rapidly. In fact, several Office of Postsecondary Education grant competitions add competitive preference priority points for applicants that submit studies that meet WWC standards. (Keep an eye out for a blog post on the postsecondary topic coming soon!)

3) You can find what works using our online tool. Wondering how to get started with so many resources at your fingertips? Find What Works lets you do a quick comparison of interventions for different subjects, grades, and student populations. Want to know more about a specific intervention? We’ve produced more than 400 intervention reports to provide you with the evidence about a curriculum, program, software product, or other intervention for your classroom before you choose it. Recently, we’ve added a feature that allows users to search for interventions that have worked for different populations of students and in different geographic locations. As we mentioned in a recent blog post, the Find What Works tool is undergoing an even bigger transformation this September, so keep visiting!

4) We identify evidence-based practices to use in the classroom. The WWC has produced 19 practice guides that feature practical recommendations and instructional tips to help educators address common challenges. Practice guides (now available for download as ebooks) provide quick, actionable guidance, supported by evidence and expert knowledge, within key areas. Some of our guides now feature accompanying videos and brief summaries that demonstrate recommended practices and explain the meaning behind the levels of evidence. Practice guides are also actively disseminated during Regional Educational Laboratory (REL) Bridge events. For instance, REL Southwest held a webinar on Teaching Math to Young Children, which was based on a WWC practice guide. For more information, read a previously published blog post on practice guides.

5) We compile information by topic. Our “Special Features” pages focus on common themes in education, such as tips for college readiness, information for heading back to school, and guidance for what works in early childhood education. These Special Features provide a starting point to access a variety of WWC resources related to a topic.

In the coming months, we’ll post other blogs that explore different parts of the WWC and tell you about ongoing improvements. So keep visiting the What Works website or sign up to receive emails when we release new reports or resources. You can also follow us on Facebook and Twitter.

The What Works Clearinghouse is a part of the National Center for Education Evaluation and Regional Assistance in the Institute of Education Sciences (IES), the independent research, evaluation, and statistics arm of the U.S. Department of Education. You can learn more about IES’ other work on its website or follow IES on Twitter and Facebook.

 

Should ESSA Evidence Definitions and What Works Study Ratings be the Same? No, and Here's Why!

By Joy Lesnick, Acting Commissioner, NCEE

The Every Student Succeeds Act (ESSA), the new federal education law, requires education leaders to take research evidence into account when choosing interventions or approaches. ESSA  defines three “tiers” of evidence—strong, moderate, and promising—based on the type and quality of study that was done and its findings.  

Are the ESSA definitions the same as those of the Institute of Education Sciences’ What Works Clearinghouse (WWC)? Not exactly. ESSA definitions and WWC standards are more like cousins than twins.

Like ESSA, the WWC has three ratings for individual studies – meets standards without reservations, meets standards with reservations, and does not meet standards. The WWC uses a second set of terms to summarize the results of all studies conducted on a particular intervention. The distinction between one study and many studies is important, as I will explain below.

You may be wondering: now that ESSA is the law of the land, should the WWC revise its standards and ratings to reflect the tiers and terminology described in ESSA?  Wouldn’t the benefit of making things nice and tidy between the two sets of definitions outweigh any drawbacks?

The short answer is no.

The most basic reason is that the WWC’s standards come from a decision-making process that is based in science and vetted through scholarly peer review, all protected by the independent, non-partisan status of the Institute of Education Sciences (IES). This fact is central to the credibility of the WWC’s work.  We like to think of the WWC standards as an anchor representing the best knowledge in the field for determining whether a study has been designed and executed well, and how much confidence we should have in its findings.

WWC Standards Reflect the Most Current Scientific Knowledge – and are Always Evolving

WWC standards were developed by a national panel of research experts. After nearly two years of meetings, these experts came to a consensus about what a research study must demonstrate to give us confidence that an intervention caused the observed changes in student outcomes.

Since the first WWC standards were developed over a decade ago, there have been many methodological and conceptual advances in education research. The good news is that the WWC is designed to keep up with these changes in science. As science has evolved, the WWC standards have evolved, too.

One example is the WWC’s standards for reviewing regression discontinuity (RD) design studies.  The first version of RD standards was developed by a panel of experts in 2012.  Since then, the science about RD studies has made so much progress that the WWC recently convened another panel of experts to update the RD standards. The new RD standards are now on the WWC website to solicit scholarly comment.  

When it Comes to Evidence, More is Better

The evidence tiers in ESSA set a minimum bar, based on one study, to encourage states, districts, and schools to incorporate evidence in their decision making. This is a very important step in the right direction.  But a one-study minimum bar is not as comprehensive as the WWC’s approach.

In science, the collective body of knowledge on a topic is always better than the result of a single study or observation. This is why the primary function of the WWC is to conduct systematic reviews of all of the studies on a program, policy, practice, or approach, the results of which are published in Intervention Reports.

The results of individual studies are important clues toward learning what works. But multiple studies, in different contexts, with different groups of teachers and students, in different states, and with different real-world implementation challenges tell us much more about how well a program, policy, practice or approach works. And that, really, is what we’re trying to find out.
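
A small worked example illustrates the point. The sketch below pools effect sizes from three hypothetical studies with an inverse-variance weighted average, a generic fixed-effect meta-analysis rather than the WWC's exact synthesis procedure; the pooled estimate is steadier than any single study's.

```python
# A generic fixed-effect meta-analysis illustration (not the WWC's exact
# synthesis procedure): pooling effect sizes from three hypothetical studies
# with an inverse-variance weighted average, so more precise studies count more.
effects = [0.30, 0.12, 0.25]    # effect size (e.g., Hedges' g) per study
variances = [0.02, 0.01, 0.04]  # sampling variance per study

weights = [1 / v for v in variances]
pooled = sum(w * g for w, g in zip(weights, effects)) / sum(weights)
print(round(pooled, 3))  # 0.19 -- more stable than any single study's estimate
```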

An Improved WWC Search Tool and Ongoing Support for States and Districts

One area where the WWC will make changes is in how users find studies that have certain characteristics described in ESSA’s evidence tiers. For the past 16 months, the WWC team has been hard at work behind the scenes to develop, code, and user-test a dramatically improved Find What Works tool. We expect to release this tool, along with other changes to the WWC website, in fall 2016. (More on that in another post!)

These changes should further increase the utility of the WWC website, which already gets more than 300,000 hits each month and offers products that are downloaded hundreds of thousands of times each year.

We know that providing information on a website about evidence from rigorous research is just a first step.  States and districts may need additional, customized support to incorporate evidence into their decision-making processes in ways that are much deeper than a cursory check-box approach.

To meet that need, other IES programs are ready to help. For example, IES supports 10 Regional Educational Laboratories (RELs) that provide states and districts with technical support for using, interpreting, and applying research. At least two researchers at every REL are certified as WWC reviewers (meaning they have in-depth knowledge of the WWC standards and how the standards are applied), and every REL has existing relationships with states and districts across the nation and outlying regions. Because the RELs are charged with meeting the needs of their regions, every chief state school officer (or designee) sits on a REL Governing Board, which determines the annual priorities of the REL in that area.

As states prioritize their needs and identify ways to incorporate evidence in their decisions according to the new law, the WWC database of reviewed studies will provide the information they need, and the RELs will be ready to help them use that information in meaningful ways.


Practice Guides: How to Use What Works in the Classroom

By Diana McCallum, NCEE

With new education research released every day, it can be difficult to know which teaching methods and classroom practices have been tested and shown to improve student outcomes. You want to know what really works and how to use evidence-based practices in your school or classroom.

What Works Clearinghouse practice guides help bridge the gap between research and practice by examining the findings from existing research studies and combining them with expert advice about applying these findings in the classroom. For each guide, a team of nationally recognized practitioners and researchers works closely with the WWC to combine evidence from research with guidance from hands-on experience.

Practice guides offer specific recommendations that include a description of the supporting research, steps for carrying out the recommendation, and strategies you can use to overcome potential challenges. Many of the guides also feature supplementary materials, like videos and summaries, to help you quickly find what you need.

One example is our most recent practice guide, Teaching Strategies for Improving Algebra Knowledge in Middle and High School Students. Mastering algebra helps students move from arithmetic operations to understanding abstract concepts and is a key to success in future mathematics courses, including geometry and calculus. The guide presents three evidence-based recommendations educators can use to help students develop a deeper understanding of algebra, promote process-oriented thinking, and encourage precise communication. These recommendations help address common challenges in algebra instruction and focus on:

  • Utilizing the structure of algebraic representations to make connections among problems, solution strategies, and representations; 
  • Incorporating solved problems into classroom instruction and activities to help students develop their algebraic reasoning skills; and
  • Comparing and selecting from alternative algebraic strategies to give students flexibility when solving problems. 

You can read the Practice Guide Summary for a quick overview of these recommendations or spend a few minutes watching videos in which Jon Star, of Harvard University’s Graduate School of Education, explains the recommendations.

Teaching Strategies for Improving Algebra Knowledge in Middle and High School Students is just one of 19 practice guides available on the What Works Clearinghouse website. Some of the others are:

  • Teaching Math to Young Children: Preschool and kindergarten teachers can get details on how to improve math lessons with this guide, including strategies to create a math-rich environment. You’ll find examples of classroom activities and games that can supplement lesson plans and provide opportunities for children to learn math.

You can find information and links to all 19 practice guides on our website. We also cover a variety of other math and literacy topics, as well as guides focused on dropout prevention, using data to monitor student progress and make decisions, and preparing students for college.

Visit whatworks.ed.gov to find the practice guide that’s right for you or to suggest a topic you’d like us to explore.

Dr. McCallum is an education research analyst on the What Works Clearinghouse team.

About the What Works Clearinghouse (WWC)

For more than a decade, the goal of the WWC has been to provide educators with the information they need to make evidence-based decisions with the aim of improving student outcomes. Established by the U.S. Department of Education’s Institute of Education Sciences, the WWC strives to be a central and trusted source of scientific evidence on education programs, products, practices, and policies. Follow us on Twitter and Facebook.