NCEE Blog

National Center for Education Evaluation and Regional Assistance

How can we work together to promote achievement for all students?

Matthew Soldner, Commissioner of the National Center for Education Evaluation and Regional Assistance, delivered the remarks below at Regional Educational Laboratory (REL) Midwest’s February 27, 2019 Governing Board Meeting in Chicago, Illinois. The remarks have been edited for clarity and to remove references to specific attendees or projects.

Good evening, and thank you for inviting me to share a few thoughts about the Institute of Education Sciences’ vision for the REL program. I promise to keep them brief, because I know you want to hear from REL Midwest about the work they have planned for the upcoming year, and I know they want to hear from you about how that work can be designed to meet your needs and the needs of your stakeholders.

As you meet over the course of the next day, I’d ask that you keep one question in mind throughout: How can our work together, across the various partnerships represented in this room, promote achievement for all students? I want to spend a moment on a few of those words.

First, achievement. When I talk about achievement, I’m not referring only to test scores or grades. I’m talking about measures and indicators of development and success from early childhood through adulthood, including outcomes in early childhood education, the early and middle grades, high school, and college and university. This also must include indicators of success as learners move to and through the workforce.

Second, when I say all students – or, perhaps more precisely, all learners – I mean it in the most inclusive terms. We are deeply committed to ensuring each student, each learner, is well-served by our systems of education – from pre-Kindergarten to adult education, and all levels in-between.  

So what must we, as a REL program, do to work toward that goal? I think most of us would agree that nothing changes for students if adults don’t begin to do things differently and, hopefully, better.

That means our work must be focused on action. The kind of action you tell us is most needed in your states, your districts, and your communities.

Some of you are probably saying: “But I thought this work was about RESEARCH? Doesn’t the ‘L’ in ‘REL’ imply that we are out to experiment, test, and discover? Not ACTION?”

The answer is, of course, yes: Research is core to the distinctiveness of the REL program. Research, and a reliance on evidence in classroom practice and policymaking, are at the foundation of everything that we do. And yes, in all of our work, we hope to inspire among our partners a desire and capacity to better use data and evidence.

But it cannot end there. The research that we do together must be in service of the action – of the change – around which you have invited us into your work. It must be part of a larger, coherent effort to improve the achievement of all students. Research is a means to an end, but it is not the end this program is meant to achieve.

I would offer one word of caution. This is not just, or even mostly, about improving dissemination. It isn’t about a better tweet, a better infographic, or a better video. We cannot be in the business of just putting research in peoples’ hands and expecting change.

Instead, this is about being in active partnership with you. And putting that relationship to work so that what we know and what we are learning can support the policy, program, and practice goals you have set to support all students.

I do not believe this is a radical departure from how this community thinks about its work. But it may call us to do our work with a different kind of intentionality.

So my ask of you, my charge to you, is that as Governing Board members and stakeholders you consistently challenge us to leverage the research, evaluation, and technical assistance skills of the REL staff in this room in ways that make a real difference in the lives of the learners you serve. Thank you for being good partners with us on this journey. As always, please feel free to reach out to me directly if you have thoughts about how we might do our work better.

Increasing Access to Education Research

by Erin Pollard, ERIC Project Officer

IES funds approximately $237 million in research each year to provide practitioners and policymakers with the information they need to improve education. But what good is that research if it sits behind a paywall and the people who need it most can’t access it? That is the thought behind IES’s Public Access Policy, which requires all of our grantees and contractors to make the full text of any peer-reviewed work that we funded freely available through ERIC within a year of publication.

Before the policy was adopted in 2011, the majority of the work funded by the National Center for Education Research and the National Center for Special Education Research appeared in peer-reviewed journals. These journals are largely subscription based, meaning only those with access to a subscribing library could read the articles. Given that subscriptions frequently run over $1,000 per journal per year, and annual subscriptions for academic databases start at close to $10,000 per year, many smaller schools and districts simply cannot afford to purchase access to high-quality research. Their teachers and administrators must rely on freely available resources.

IES wanted to change the model. We believe that because IES-supported research is publicly funded, the results should be publicly available. We have worked with publishers, editors, and grantees to find a way to make our policy mutually beneficial.

We are not alone in this effort. IES is part of a larger federal initiative to make most research findings publicly available. Federal agencies are coordinating to adopt similar policies all across the government to increase the availability of good science and to improve evidence-based decision making.

Since we adopted the policy 5 years ago we have been able to make more than 600 publications freely available in ERIC and will be able to release 250 more in the next year. This is just the beginning. We expect more and more work to become available each year. Because ERIC powers other search engines and academic databases with its metadata, we are disseminating these full text articles widely, wherever users are looking for their research. By giving teachers, administrators and policymakers access to high quality research, we are able to get our work into the hands of the people who can use it to build a brighter future for our Nation’s students.

Sharing strategies to increase research-based educational practices

By Cora Goldston, REL Midwest

Highlighted Resources

How can states, districts, and schools identify effective practices to address challenges and achieve their goals? Education research can point the way, but sometimes finding and accessing relevant research can be a frustrating and time-consuming process. And even when practitioners can find research, it can be difficult to determine a study’s rigor and the strength of research evidence supporting interventions.

Equipping practitioners to use research evidence

Through the Midwest Alliance to Improve Knowledge Utilization (MAIKU), the Regional Educational Laboratory (REL) Midwest is partnering with practitioners to help states, districts, and schools use research to inform practice. The goal is to make it easier for educators to find research relevant to their priorities, assess the level of evidence that supports potential practices, and implement those practices that are based on strong evidence.

REL Midwest and MAIKU are supporting the use of research in education practice in several ways. For example, REL Midwest provided coaching sessions for the Ohio Department of Education (ODE) on understanding the Every Student Succeeds Act (ESSA) tiers of evidence. In addition, REL Midwest created a crosswalk that shows how the ESSA evidence tiers align with ratings from research clearinghouses, such as the What Works Clearinghouse. In turn, ODE is using this information to help Ohio districts that are applying for Striving Readers grants. To receive the grants, districts must demonstrate that they plan to use research-based practices to improve student literacy. As a result of REL Midwest’s support, ODE has strengthened its capacity to help districts determine the level of evidence supporting certain practices and, thus, to submit stronger grant applications.
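
As a rough illustration of what such a crosswalk captures, the four ESSA evidence tiers can be sketched in a few lines of code. This is a simplified sketch, not REL Midwest’s actual crosswalk: the tier descriptions follow ESSA’s four statutory evidence levels, but the tier_for_design function and its design-to-tier mapping are hypothetical.

```python
# Illustrative sketch only: the tier descriptions follow ESSA's four statutory
# evidence levels, but this mapping is a simplification, not REL Midwest's
# actual crosswalk or the What Works Clearinghouse's rating rules.

ESSA_TIERS = {
    1: "Strong evidence (well-designed, well-implemented experimental study)",
    2: "Moderate evidence (well-designed, well-implemented quasi-experimental study)",
    3: "Promising evidence (correlational study with statistical controls)",
    4: "Demonstrates a rationale (well-specified logic model informed by research)",
}

def tier_for_design(design: str) -> int:
    """Return the highest ESSA tier a study design could support (simplified)."""
    mapping = {
        "randomized controlled trial": 1,
        "quasi-experimental design": 2,
        "correlational study": 3,
        "logic model": 4,
    }
    return mapping[design.lower()]

print(tier_for_design("Randomized Controlled Trial"))  # → 1
```

A crosswalk like the one built for ODE adds a second mapping of this kind, from clearinghouse ratings (such as WWC ratings) to the same tiers, so a district can read across both.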

REL Midwest is providing similar support across the region. In Michigan, we are conducting coaching sessions for the state Department of Education to help agency leadership choose priorities from the state’s Top 10 in 10 plan, identify research-based practices that support those priorities, and collaborate to implement new state-level practices. In Wisconsin, REL Midwest hosted a training series for the Department of Public Instruction to increase the agency’s capacity to collect, analyze, and use data to adjust state-level policies and practices. And in Illinois, REL Midwest is holding a training series for the State Board of Education on research methods, data collection, and data analysis and how to use the findings to inform agency practices.

June webinar on increasing evidence use

MAIKU is also working with researchers to support evidence use in education practice. On June 19, 2018, REL Midwest and MAIKU hosted a webinar to discuss how researchers can share evidence with practitioners in useful and accessible ways.

The webinar featured a presentation by Alan J. Daly, Ph.D., of the University of California at San Diego, and Kara Finnigan, Ph.D., of the University of Rochester. Dr. Daly and Dr. Finnigan discussed how information-sharing networks are structured among school and district staff and the challenges for practitioners in accessing and using research-based practices.   

Building on this context, Dr. Daly and Dr. Finnigan shared insights about the most effective ways to maximize the reach of research. One of their key findings is that the pattern of people’s social ties makes a difference for sharing and using research-based practices. Finnigan and Daly noted that the set of relationships we have can increase access to research evidence if the right ties are present but can constrain access to resources when those ties are not present. The quality of relationships also matters; high levels of trust are essential for more in-depth exchanges of information. The takeaway: fostering both the quantity and quality of social relationships is important for sharing research evidence.  

During the webinar, Jaime Singer, senior technical assistance consultant at REL Midwest, also shared actionable strategies that researchers can use to support evidence use in practice, including training and coaching sessions, checklists, blog posts, and clearinghouses of effective practices.

The webinar included a panel discussion about REL Midwest’s ESSA evidence tiers coaching sessions and crosswalk for ODE. REL Midwest researcher Lyzz Davis, Ph.D., provided a researcher perspective on developing resources to meet ODE’s needs. Heather Boughton, Ph.D., and Melissa Weber-Mayrer, Ph.D., at ODE provided practitioner perspectives on how REL Midwest’s work has strengthened the agency’s capacity to help districts find and use evidence-based interventions.

Looking for evidence outside of the scope of the WWC?

by Chris Weiss and Erin Pollard, What Works Clearinghouse

The What Works Clearinghouse (WWC) strives to be a central and trusted source of research evidence for what works in education. But did you know that the WWC is one of several repositories of evidence produced by the federal government? Our mission at the WWC is to review the existing research on different programs, products, practices, and policies in education to provide educators with the information they need to make evidence-based decisions. However, there are several other government repositories that review evidence on interventions that impact children and schools, reviews that may be of use and interest to WWC users.


Different Clearinghouses for Different Needs.

The missions of the different clearinghouses, and the reasons for their different reviews, stem from the unique mission of each agency and the specific focus of each clearinghouse. The Department of Education focuses primarily on prekindergarten through postsecondary education; however, many public health and crime prevention programs are implemented through schools. So, for example, you would find information about a school-based bullying prevention program on the National Institute of Justice’s CrimeSolutions website. The WWC would not review the evidence of this program’s effectiveness because its aim is to reduce bullying and victimization, rather than to improve education-focused outcomes.


Some interventions are reviewed by multiple clearinghouses.

Users are often surprised that an intervention might be reviewed by multiple clearinghouses. For example, the WWC reviewed the evidence and created an intervention report on Career Academies, a school-within-a-school program in which students take both career-related and academic courses and acquire work experience. But reviews of the program are included in other clearinghouses. The Department of Labor’s CLEAR reviewed the study because the intervention increased students’ earnings. Similarly, the National Institute of Justice’s CrimeSolutions reviewed the intervention because it showed an effect on increasing the earnings of young men – an economic factor linked to lowered risk of criminal activity. Each clearinghouse looked at different outcomes from the same study to highlight the domains it finds most relevant to achieving its goal.


Each repository is different. The WWC may be your best bet – or others may fit your needs better.

We encourage users to look at the other clearinghouses to find information on outcomes that are outside of our scope. These sites have a lot of great information to offer. Here is a list of the other repositories for finding evidence:

  • Clearinghouse for Labor Evaluation and Research (CLEAR) – Department of Labor. CLEAR's mission is to make research on labor topics more accessible to practitioners, policymakers, researchers, and the public more broadly so that it can inform their decisions about labor policies and programs. CLEAR identifies and summarizes many types of research, including descriptive statistical studies and outcome analyses, implementation, and causal impact studies.
  • Compendium of Evidence-Based Interventions and Best Practices for HIV Prevention - Centers for Disease Control and Prevention. Evidence-Based Interventions and Best Practices in the Compendium are identified by CDC’s Prevention Research Synthesis Project through a series of ongoing systematic reviews. Each eligible intervention is evaluated against explicit a priori criteria and has shown sufficient evidence that the intervention works. Interventions may fall into one or more chapters including: Risk Reduction that includes PrEP-related outcomes and outcomes such as injection drug use, condom use, HIV/STD/Hepatitis infection; Linkage to, Retention in, and Re-engagement in HIV Care that includes outcomes such as entering and staying in HIV care; Medication Adherence that includes outcomes such as adhering to HIV medication and HIV viral load; and the most recently added Structural Interventions that includes outcomes such as HIV testing, social determinants of health, and stigma. Information sheets are available for all identified evidence-based interventions and best practices on the PRS Compendium Website.
  • CrimeSolutions - National Institute of Justice, Department of Justice. The clearinghouse, accessible via the CrimeSolutions.gov website, presents programs and practices that have undergone rigorous evaluations and meta-analyses. The site assesses the strength of the evidence about whether these programs achieve criminal justice, juvenile justice, and crime victim services outcomes in order to inform practitioners and policymakers about what works, what doesn’t, and what’s promising.
  • Evidence Exchange - Corporation for National and Community Service. A digital repository of sponsored research, evaluation reports, and data. These resources focus on national service, volunteering, and civic engagement.
  • Home Visiting Evidence of Effectiveness (HomVEE) – Administration for Children and Families, Department of Health and Human Services. HomVEE provides an assessment of the evidence of effectiveness for home visiting models that target families with pregnant women and children from birth to kindergarten entry (that is, up through age 5).
  • Teen Pregnancy Prevention (TPP) Evidence Review – Department of Health and Human Services. A transparent systematic review of the teen pregnancy prevention literature to identify programs with evidence of effectiveness in reducing teen pregnancy, sexually transmitted infections, and associated sexual risk behaviors.
  • The Community Guide - Community Preventive Services Task Force (CPSTF). A collection of evidence-based findings to help you select interventions to improve health and prevent disease in your state, community, community organization, business, healthcare organization, or school. The CPSTF issues findings based on systematic reviews of effectiveness and economic evidence that are conducted with a methodology developed by the CPSTF.
  • youth.gov – Interagency. The youth.gov Program Directory features evidence-based programs whose purpose is to prevent and/or reduce delinquency or other problem behaviors in young people.

What Works in STEM Education: Resources for National STEM Day, 2018

Are you celebrating National STEM Day this November 8th by learning more about how to improve student achievement in Science, Technology, Engineering, and Mathematics (STEM)? If so, the Institute of Education Sciences’ (IES’s) What Works Clearinghouse has great resources for educators who want information about the latest evidence-based practices in supporting learners of all ages.

  • Focused on math? If so, check out Improving Mathematical Problem Solving in Grades 4 Through 8. Based on 38 rigorous studies conducted over 20 years, this practice guide includes five recommendations that teachers, math coaches, and curriculum developers can use to improve students’ mathematical problem-solving skills. There’s strong evidence that assisting students in monitoring and reflecting on the problem-solving process and teaching students how to use visual representations (e.g., tables, graphs, and number lines) can improve achievement. Other practice guides focus on Teaching Math to Young Children and Teaching Strategies for Improving Algebra Knowledge in Middle and High School Students.

  • Don’t worry, we won’t leave science out! Encouraging Girls in Math and Science includes five evidence-based recommendations that both classroom teachers and other school personnel can use to encourage girls to choose career paths in math- and science-related fields. A handy 20-point checklist provides suggestions for how those recommendations can be incorporated into daily practice, such as “[teaching] students that working hard to learn new knowledge leads to improved performance” and “[connecting] mathematics and science activities to careers in ways that do not reinforce existing gender stereotypes of those careers.”

  • Looking for specific curricula or programs for encouraging success in STEM? If so, check out the What Works Clearinghouse’s Intervention Reports in Math and Science. Intervention reports are summaries of findings from high-quality research on a given educational program, practice, or policy. There are currently more than 200 intervention reports that include at least one math- or science-related outcome. (And nearly 600 in total!)

  • Maybe you just want to see the research we’ve reviewed? You can! The What Works Clearinghouse’s Reviews of Individual Studies Database includes nearly 11,000 citations across a wide range of topics, including STEM. Type in your preferred search term and you’re off—from algebra to zoology, we’ve got you covered!

We hope you’ll visit us on November 8th and learn more about evidence-based practices in STEM education. And with practice guides, intervention reports, and individual studies spanning topics from early childhood to postsecondary education and everything in between, we hope you’ll come back whenever you are looking for high-quality research to answer the question, “What works in education?”

The WWC Evidence Standards: A Valuable and Accessible Resource for Teaching Validity Assessment of Causal Inferences to Identify What Works

by Herbert Turner, Ph.D., President and Principal Scientist, ANALYTICA, Inc.


The WWC Evidence Standards (hereafter, the Standards) provide a detailed description of the criteria used by the WWC to review studies. The standards were first developed in 2002 by leading methodological researchers using initial concepts from the Study Design and Implementation Assessment Device (DIAD), an instrument for assessing the correspondence between the methodological characteristics and implementation of social science research and using this research to draw inferences about causal relationships (Boruch, 1997; Valentine and Cooper, 2008).  During the past 16 years, the Standards have gone through four iterations of improvement, to keep pace with advances in methodological practice, and have been through rigorous peer review. The most recent of these is now codified in the WWC Standards Handbook 4.0 (hereafter, the Handbook).


Across the different versions of the Handbook, the methodological characteristics of an internally valid study, designed to causally infer the effect of an intervention on an outcome, have stood the test of time. These characteristics can be summarized as follows: A strong design starts with how the study groups are formed. It continues with use of reliable and valid measures of outcomes, has low attrition if a randomized controlled trial (RCT), shows baseline equivalence (in the analysis sample) if a quasi-experimental design (QED), and has no confounds.
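
To make one of those elements concrete, the WWC’s baseline equivalence rule for QEDs (and for RCTs with high attrition) can be sketched as a short calculation: the baseline difference between groups, expressed as a standardized mean difference, satisfies the requirement if it is no more than 0.05 standard deviations, requires a statistical adjustment if it falls between 0.05 and 0.25, and fails above 0.25. The function below is an illustrative sketch, not WWC software; the thresholds and the small-sample (Hedges’ g) correction follow the Standards Handbook.

```python
import math

def baseline_equivalence(mean_t, sd_t, n_t, mean_c, sd_c, n_c):
    """Classify baseline equivalence using the WWC's effect-size thresholds.

    Expresses the treatment-comparison baseline difference as Hedges' g
    (a standardized mean difference with a small-sample correction) and
    applies the Standards Handbook's 0.05 / 0.25 decision rule.
    Illustrative sketch only, not the WWC's own software.
    """
    pooled_sd = math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2)
                          / (n_t + n_c - 2))
    g = (mean_t - mean_c) / pooled_sd
    g *= 1 - 3 / (4 * (n_t + n_c) - 9)  # Hedges' small-sample correction
    if abs(g) <= 0.05:
        return g, "satisfied"
    if abs(g) <= 0.25:
        return g, "requires statistical adjustment"
    return g, "not satisfied"

# A 0.2-point baseline gap on a measure with SD 10 is a 0.02 SD difference,
# well inside the 0.05 threshold.
g, status = baseline_equivalence(50.2, 10, 100, 50.0, 10, 100)
print(round(g, 3), status)  # → 0.02 satisfied
```

A 2-point gap on the same measure (about 0.2 SD) would instead require a statistical adjustment, and a 3-point gap (about 0.3 SD) would fail the requirement outright.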


These elements are the critical components of any strong research design – and are the cornerstones of all versions of the WWC’s standards. That fact, along with the transparent description of their logical underpinning, is what motivated me to use Standards 4.0 (for Group Designs) as the organizing framework for understanding study validity in a graduate-level Program Evaluation II course I taught at Boston College’s Lynch School of Education.


In spring 2017, nine master’s and four doctoral students participated in this semester-long course. The primary goal was to teach students how to organize their thinking and logically derive internal validity criteria using Standards 4.0—augmented with additional readings from the methodological literature. Students used the Standards (along with the supplemental readings) to design, implement, analyze, and report impact evaluations to determine what interventions work, harm, or have no discernible effect (Mosteller and Boruch, 2002). The Standards Handbook 4.0 and the online course modules were excellent resources for augmenting the lectures and giving Lynch School students hands-on learning.


At the end of the course, students were offered the choice to complete the WWC Certification Exam for Group Design or to take an instructor-developed final exam. All thirteen students chose the WWC Certification Exam. Approximately half of the students became certified. Many emailed me personally to express their appreciation for (1) the opportunity to learn a systematic approach to organizing their thinking about assessing the validity of causal inferences drawn from data generated by RCTs and QEDs, and (2) the opportunity to develop design skills they can use in other graduate courses and beyond. The WWC Evidence Standards and related online resources are a valuable, accessible, and free resource that has been rigorously vetted for close to two decades. The Standards have few equals as a resource for helping students think systematically, logically, and clearly about designing (and evaluating) a valid research study to make causal inferences about what interventions work in education and related fields.


References

Boruch, R. F. (1997). Randomized experiments for planning and evaluation: A practical guide. Thousand Oaks, CA: Sage Publications.

Valentine, J. C., & Cooper, H. (2008). A systematic and transparent approach for assessing the methodological quality of intervention effectiveness research: The Study Design and Implementation Assessment Device (Study DIAD). Psychological Methods, 13(2), 130–149.

Mosteller, F., & Boruch, R. F. (2002). Evidence matters: Randomized trials in education research. Washington, D.C.: Brookings Institution Press.

Making the WWC Open to Everyone by Moving WWC Certification Online

In December 2016, the What Works Clearinghouse made a version of its online training publicly available through the WWC website. This enabled everyone to access the Version 3.0 Group Design Standards reviewer training and learn about the standards and methods that the WWC uses. While this was a great step toward increasing access to WWC resources, users still had to complete a day-and-a-half, in-person training to become a WWC-certified reviewer.

To continue our efforts to promote access and transparency and make our resources available to everyone, the WWC has now moved all of its group design training online. Now everyone has access to the same training and certification tests. The certification is available free of charge and is open to all users. It is our hope that this effort will increase the number of certified reviewers and help raise general awareness of the WWC.

Why did the WWC make these resources publicly available? As part of IES’s effort to increase access to high-quality education research, we wanted to make it easier for researchers to use our standards. Offering the training online opens up that opportunity while using limited taxpayer resources most efficiently.

The online training consists of 9 modules. These videos feature an experienced WWC instructor and use the same materials that we used in our in-person courses, but adapted to Version 4.0 of the Group Design Standards. After completing the modules, users will have the opportunity to download a certificate of completion, take the online certification test, or go through the full certification exam.

Becoming a fully certified reviewer will require users to take a multiple choice online certification test and then use the new Online SRG application to conduct a full review using the same tools that the WWC team uses. The WWC team will then grade your exam to make sure you fully understand how to apply the Standards before certifying you to review for the Clearinghouse.

Not interested in becoming a certified reviewer? The online training still has several benefits. Educators can embed our videos in their course websites and use our training materials in their curricula. Researchers can use our Online SRG tool with their own publications to determine a preliminary rating and understand what factors affect whether a study receives the highest rating. They can also use the tool when conducting a systematic evidence review.

Have ideas for new resources we could make available? Email your ideas and suggestions to Contact.WWC@ed.gov!

by Erin Pollard, WWC Project Officer


Improving the WWC Standards and Procedures

By Chris Weiss and Jon Jacobson

For the What Works Clearinghouse (WWC), standards and procedures are at the foundation of the WWC’s work to provide scientific evidence for what works in education. They guide how studies are selected for review, what elements of an effectiveness study are examined, and how systematic reviews are conducted. The WWC’s standards and procedures are designed to be rigorous and reflective of best practices in research and statistics, while also being aspirational to help point the field of education effectiveness research toward an ever-higher quality of study design and analysis.

To keep pace with new advances in methodological research and provide necessary clarifications for both education researchers and decision makers, the WWC regularly updates its procedures and standards and shares them with the field. We recently released Version 4.0 of the Procedures and Standards Handbooks, which describes the five steps of the WWC’s systematic review process.

For this newest version, we have divided information into two separate documents (see graphic below).  The Procedures Handbook describes how the WWC decides which studies to review and how it reports on study findings. The Standards Handbook describes how the WWC rates the evidence from studies.

The new Standards Handbook includes several improvements, including updated and overhauled standards for cluster-level assignment of students; a new approach for reviewing studies that have some missing baseline or outcome data; and revised standards for regression discontinuity designs. The new Procedures Handbook includes a revised discussion of how the WWC defines a study.  All of the changes are summarized on the WWC website (PDF).

Making the Revisions

These updates were developed in a careful, collaborative manner that included experts in the field, external peer review, and input from the public.

Staff from the Institute of Education Sciences oversaw the process with the WWC’s Statistical, Technical, and Analysis Team (STAT), a panel of highly experienced researchers who revise and develop the WWC standards. In addition, the WWC sought and received input from experts on specific research topics, including regression discontinuity designs, cluster-level assignment, missing data, and complier average causal effects. Based on this information, drafts of the standards and procedures handbooks were developed.

External peer reviewers then provided input that led to additional revisions and, in the summer, the WWC posted drafts and gathered feedback from the public. The WWC’s response to some of the comments is available on its website (PDF).   

Version 4.0 of the Handbooks was released on October 26. This update focused on a few key areas of the standards, and updated and clarified some procedures. However, the WWC strives for continuous improvement, and as the field of education research continues to evolve, we expect that new techniques and new tools will be incorporated into future versions of the Handbooks.

Your thoughts, ideas, and suggestions are welcome and can be submitted through the WWC help desk.

Why Can’t You Just Use Google Instead of ERIC?

By Erin Pollard, ERIC Program Officer

The Education Resources Information Center (ERIC) provides the public with free, online access to a scholarly database of education research. We are frequently asked why the government sponsors such a tool when people can use Google or a subscription-based scholarly database.  

Commercial search engines and scholarly databases are important, but they would not function as efficiently without ERIC’s metadata to power their searches. Because of the costs associated with indexing, commercial and scholarly search engines would likely prioritize work from major publishers and might not index work from small publishers on a regular basis.

But ERIC has built national and global relationships with key publishers, research centers, government entities, universities, education associations, and other organizations to disseminate their materials. We currently have agreements with 1,020 different publishers, many of which are small and publish only a single journal or report series.

For more than 50 years, ERIC has been acquiring grey literature (e.g., reports from the Institute of Education Sciences (IES) and other government reports, white papers, and conference papers) and making it centrally available and free of charge to the public. Therefore, an ERIC user is just as likely to find a relevant conference paper from a smaller publisher as they are to find a journal article from a major publisher. (See the infographic (PDF) above to learn more about who uses ERIC.)

ERIC also ensures that all records meet a set of quality guidelines before indexing, and it provides tools, such as a peer-review flag, that can help users evaluate the quality of the material. Underlying all of ERIC’s records is a set of metadata that helps guide users to the resources they are seeking. The metadata also includes descriptors from ERIC’s Thesaurus, a widely recognized, controlled vocabulary of subject-specific tags in the education field. Descriptors are added to each record and used by search engines to pinpoint results.

Lastly, and most importantly, ERIC provides access to more than 380,000 full-text resources, including journal articles and grey literature, and makes these materials available in perpetuity. ERIC has been around for more than 50 years and has collected materials in hard copy, microfiche, and PDF. These materials remain publicly available even after organizations or journals cease operations or redesign their websites in ways that make materials no longer available. In any given month, over 25% of ERIC’s new records are peer reviewed and provide free full text. Additionally, about 4% of journals provide peer-reviewed full text after an embargo. This includes work from IES grantees that normally appears in journals behind a paywall but that ERIC can make available through the IES Public Access Policy.

ERIC’s comprehensive collection, metadata, and access to full-text articles make it an important resource for researchers, students, educators, policymakers, and the general public.

Want to learn more about ERIC? Watch this short video introduction or check out our multimedia page for access to other videos, infographics, and webinars.  

Updating Our Recommendations to Prevent Dropping Out

By Dana Tofig, Communications Director, IES

Almost a decade ago, the What Works Clearinghouse (WWC) released Dropout Prevention, one of its first practice guides, which offered six recommendations for keeping students in school, based on the findings from high-quality research and best practices in the field. Since its release in August 2008, the practice guide has been downloaded thousands of times by practitioners across the country. In fact, the guide was downloaded more than 1,500 times in 2016 alone.

However, over the past decade, the research and knowledge base on dropout prevention has grown, which is why the WWC decided to update this guide to reflect the latest evidence and information about keeping students in school and on track toward graduation.

This updated guide, Preventing Dropout in Secondary Schools, builds on the 2008 guide in two significant ways.

First, it reflects improvements in practices related to monitoring at-risk students, including advances in using early warning indicators to identify students at risk of dropping out. Second, it covers nine additional years of research that were not part of the previous guide. In fact, 15 of the 25 studies used to support the recommendations in this updated guide were published after the first guide was released. In addition, studies from the previous guide were reviewed again against current, more rigorous WWC evidence standards.

Preventing Dropout in Secondary Schools offers four evidence-based recommendations that can be used by schools and districts:

  • Monitor the progress of all students, and proactively intervene when students show early signs of attendance, behavior, or academic problems;
  • Provide intensive, individualized support to students who have fallen off track and face significant challenges to success;
  • Engage students by offering curricula and programs that connect schoolwork with college and career success and that improve students’ capacity to manage challenges in and out of school; and
  • For schools with many at-risk students, create small, personalized communities to facilitate monitoring and support.

Each of these recommendations includes specific strategies for implementation and examples of how this work is being done around the country (see one such example in the image to the right).

Like all of our practice guides, the recommendations were developed with a panel of educators, academics, and experts who brought a wealth of knowledge and experience to the process. Two of the panelists on this updated guide were also involved in the development of the first dropout prevention guide—Russell Rumberger, from the University of California, Santa Barbara, and Mark Dynarski, of Pemberton Research LLC. The other panelists for the new guide were Howard (Sandy) Addis, of the National Dropout Prevention Center and Network; Elaine Allensworth, from the University of Chicago; Robert Balfanz, from The Johns Hopkins University; and Debra Duardo, superintendent of the Los Angeles County Office of Education.

Please download this free guide today and let us know what you think. We would especially love to hear from people who are using the recommended strategies in schools. You can reach us through the WWC Help Desk or email us at Contact.IES@ed.gov.