NCEE Blog

National Center for Education Evaluation and Regional Assistance

Gearing up for the Fall Semester? See How ERIC Can Help!

By Erin Pollard, ERIC Project Officer, NCEE

The ERIC Help Desk frequently receives questions from academic users asking if we have resources to help students and faculty use the Institute of Education Sciences' free online library of education research. In response, ERIC has created several videos that can be embedded in LibGuides (a content management system used by libraries), linked to in syllabi, and posted on course websites. These videos can also be accessed from ERIC’s Multimedia Page.

Our newest videos include:

About ERIC: The About ERIC video is an introduction to our digital library and answers basic questions about who our website is designed for and what types of information you’ll find there. Users can view this video to determine if ERIC is the right resource to use for their research.

Using ERIC to Write a Research Paper: The ERIC Help Desk has received a lot of requests from students asking how to use ERIC to write a research paper. This video was specifically created to be a resource that librarians and faculty members could direct students to as a starting point for research projects. For more information on this video, stay tuned for next week’s blog post!

Searching eric.ed.gov: ERIC was redesigned in 2013 using intuitive search technology, as opposed to more traditional searching mechanisms. This video explains how our search engine works and offers tips to help users get the best results.

How ERIC Selects New Sources: Users frequently ask how sources are selected to be indexed in ERIC. This video describes the process laid out in the ERIC Selection Policy and explains how users can recommend new resources for our collection.

IES Public Access Policy: How Grantees and Contractors Meet Requirements by Submitting Work to ERIC: Are you conducting research funded by IES? If so, then you are likely subject to the IES Public Access Policy (learn more about the policy for publications and data).  This video explains how researchers can meet their compliance requirements by submitting their work to ERIC.

Tips for Using the ERIC Online Submission System:  ERIC encourages authors to submit their work through our Online Submission System. Learn what types of materials are eligible to be included and how to prepare publications for submission to ERIC.

Have suggestions for other videos ERIC should make? Email our Help Desk! To learn more about ERIC, sign up for our Newsflash and follow us on Facebook and Twitter.


The What Works Clearinghouse Goes to College

By Vanessa Anderson, Research Scientist, NCEE

The What Works Clearinghouse (WWC) was founded in 2002 and, in its first decade, focused mainly on reviewing studies of programs, policies, products, and practices—or interventions—for improving student outcomes in pre-K, elementary, and secondary schools. Since 2012, however, the WWC has broadened its focus, applying its rigorous standards to studies of interventions designed to increase the success of students in postsecondary education.

This week, the WWC launches a new topic—Supporting Postsecondary Success—and it is a good time to look at the work we’re doing, and will do, in the postsecondary area. 

The WWC postsecondary topic area includes reviews of studies on a wide range of interventions, including learning communities, summer bridge programs, multi-faceted support programs, academic mentoring, and interventions that aim to reduce performance anxiety. As of today, 294 postsecondary studies have been reviewed by the WWC. Those reviews are summarized in six Intervention Reports, 25 Single Study Reviews, and four Quick Reviews. And there’s much more in the works!  For instance, a WWC Educator’s Practice Guide that includes strategies for supporting students in developmental education is planned for publication later this year. (Learn more about Practice Guides)

Identifying Studies for Review

In the postsecondary topic area, the WWC currently identifies studies for review in three main ways.

The first is through WWC Intervention Reports. All WWC Intervention Reports use a systematic review process to summarize evidence from all available studies on a given intervention. The WWC conducts a broad search for all publicly available studies of interventions related to the topic, a process that often identifies hundreds of studies for review. The effectiveness studies are then reviewed against WWC standards, and only the highest quality studies are summarized in an Intervention Report.

We released two new Intervention Reports this week as part of our new Supporting Postsecondary Success topic. You can view the new Intervention Reports on summer bridge programs and first-year experience courses on the WWC website.

The second way that studies are reviewed by the WWC is through Quick Reviews, which are performed on studies that have received a great deal of media attention. In these reports, the WWC provides a brief description of the study, the author-reported results, and a study rating. We like to think of Quick Reviews as a way to help people decide whether to fully believe the results of a study, based on the research design and how the study was conducted. For example, we released a quick review earlier this month that focused on a study of computer usage and student outcomes for a class at the U.S. Military Academy at West Point.

Finally, the WWC reviews postsecondary studies submitted as supporting evidence for discretionary grant competitions funded by the U.S. Department of Education, such as the Strengthening Institutions Program, First in the World and TRIO Student Support Services. These grant competitions require applicants to submit studies as evidence of the effectiveness of the interventions they propose to implement. The WWC reviews these studies and includes the results of those reviews in our database.

If you want to see all the studies on postsecondary interventions that have been reviewed by the WWC, you can check out—and download—the Reviewed Studies Database. In the “Topic Areas” dropdown menu, just select “Postsecondary,” and then easily customize the search by rating, publication type, and/or reasons for the review (such as a grant competition).

For more information, visit the WWC postsecondary topic area on the website. To stay up-to-date on WWC news, information, and products, follow us on Facebook and Twitter, and sign up for the WWC newsflash!

Five Reasons to Visit the What Works Clearinghouse

By Diana McCallum, Education Research Analyst, What Works Clearinghouse

It’s been more than a decade since the first What Works Clearinghouse reports were released, and we have a wealth of information and resources that can help educators and leaders make evidence-based decisions about teaching and learning. Since 2005, the WWC has assessed more than 11,500 education studies using rigorous standards and has published hundreds of resources and guides across many content areas. (View the full version of the graphic to the right.)

The WWC website has already received more than 1.7 million page views this year, but if you haven’t visited whatworks.ed.gov lately, here are five reasons you might want to click over:

1) We are always adding new and updated reviews. Multiple claims about programs that work can be overwhelming, and people often lack time to sift through piles of research. That’s where the WWC comes in. We provide an independent, objective assessment of education research. For example, we have intervention reports that provide summaries of all of the existing research on a given program or practice that educators can use to help inform their choices. In addition, when a new education study grabs headlines, the WWC develops a quick review that provides our take on the evidence presented to let you know whether the study is credible. In 2015, we added 43 publications to the WWC, and we’re adding more every month this year.

2) We’ve expanded our reach into the postsecondary area. In late 2012, the WWC expanded its focus to include reviews of studies on a range of postsecondary topics, from the transition to college to supports for postsecondary success. To date, the WWC has reviewed over 200 studies on postsecondary programs and interventions, and this area continues to grow rapidly. In fact, several Office of Postsecondary Education grant competitions add competitive preference priority points for applicants that submit studies that meet WWC standards. (Keep an eye out for a blog post on the postsecondary topic coming soon!)

3) You can find what works using our online tool. Wondering how to get started with so many resources at your fingertips? Find What Works lets you do a quick comparison of interventions for different subjects, grades, and student populations. Want to know more about a specific intervention? We’ve produced more than 400 intervention reports to give you the evidence about a curriculum, program, software product, or other intervention before you choose it for your classroom. Recently, we’ve added a feature that allows users to search for interventions that have worked for different populations of students and in different geographic locations. As we mentioned in a recent blog post, the Find What Works tool is undergoing an even bigger transformation this September, so keep visiting!

4) We identify evidence-based practices to use in the classroom. The WWC has produced 19 practice guides that feature practical recommendations and instructional tips to help educators address common challenges. Practice guides (now available for download as ebooks) provide quick, actionable guidance that is supported by evidence and expert knowledge in key areas. Some of our guides now feature accompanying videos and brief summaries that demonstrate recommended practices and highlight the meaning behind the levels of evidence. Practice guide content is also actively disseminated during Regional Educational Laboratory (REL) Bridge events. For instance, REL Southwest held a webinar on Teaching Math to Young Children, which was based on a WWC practice guide. For more information, read a previously published blog post on practice guides.

5) We compile information by topic. Our “Special Features” pages focus on common themes in education, such as tips for college readiness, information for heading back to school, and guidance for what works in early childhood education. These Special Features provide a starting point to access a variety of WWC resources related to a topic.

In the coming months, we’ll post other blogs that explore different parts of the WWC and tell you about ongoing improvements. So keep visiting the What Works website or sign up to receive emails when we release new reports or resources. You can also follow us on Facebook and Twitter.

The What Works Clearinghouse is a part of the National Center for Education Evaluation and Regional Assistance in the Institute of Education Sciences (IES), the independent research, evaluation, and statistics arm of the U.S. Department of Education. You can learn more about IES’ other work on its website or follow IES on Twitter and Facebook.


VIDEO: See How ERIC Selects New Sources

By Erin Pollard, ERIC Project Officer, NCEE

ERIC builds a strong education research collection by continuously seeking out new sources of rigorous content and adding them to the collection. But how does ERIC select publications for the online library?

A new video (embedded below) provides the answer to how ERIC selects new sources, including education-focused journals, grey literature reports, and conference papers. The video was developed to help answer one of the most frequently asked questions by ERIC users and to help publishers and organizations producing materials in the field of education understand what ERIC considers when evaluating potential new sources. Watch this video if you want to learn about the types of resources ERIC will and will not index, the source selection process, and how to recommend a new resource.

Twice a year, in the spring and fall, ERIC reviews journals and producers of conference papers, reports, and books as potential candidates for inclusion in ERIC, using a revised selection policy as a guide when evaluating recommended content. The revised policy was released in January 2016 to clarify the types of materials ERIC is seeking for the collection. ERIC considers resources that are education research focused and include citations, original analyses of data, and well-formed arguments. ERIC also considers collection priorities, such as peer-reviewed and full-text materials.

We are continuously working to build a strong education research collection that includes the latest and best resources in the field. If you are a publisher of high-quality education research, have a favorite journal, or know of a source of conference papers or reports not currently in ERIC, please send us your recommendations.

To stay up-to-date on ERIC, follow us on Facebook, YouTube, or Twitter, and check out the Notes area of the ERIC website at eric.ed.gov.

Sustaining School Improvement

By Thomas Wei, Evaluation Team Leader, NCEE

NOTE: In an effort to turn around the nation’s chronically low-performing schools, the Department of Education injected more than $6 billion into the federal School Improvement Grants (SIG) program over the past several years. SIG schools received a lot of money for a short period of time—up to $6 million over three years—to implement a number of prescribed improvement practices.

What is the prognosis for low-performing schools now that many federal School Improvement Grants (SIG) are winding down? This is an important question that the National Center for Education Evaluation and Regional Assistance (NCEE) addressed through its Study of School Turnaround.

The second and final report from this study was released on April 14 and describes the experiences of 12 low-performing schools as they implemented SIG from 2010 to 2013. (Read a blog post on the first report.) Findings are based on analyses of teacher surveys and numerous interviews with other school stakeholders, such as district administrators, principals, assistant principals, members of the school improvement team, instructional coaches, and parents.

After three years of trying a diverse array of improvement activities, ranging from replacing teachers to extending learning time to installing behavioral support systems, most of the 12 schools felt they had changed in primarily positive ways (see the chart below, from the report).

The report also found that schools with lower organizational capacity in the first year of SIG appeared to boost their capacity by the final year of SIG. At the same time, schools with higher capacity appeared generally able to maintain that capacity.

Many experts believe that organizational capacity is an important indicator of whether a low-performing school can improve (see the chart below, showing that schools with higher organizational capacity also appeared more likely to sustain improvements). Organizational capacity is indicated by, for example, how strong a leader the principal is, how consistent school policies are with school goals, how much school leaders and staff share clear goals, how much collaboration and trust there is among teachers, and how safe and orderly the school climate is.

Despite these promising results, the report found that the overall prospects for sustaining any improvements appeared to be fragile in most of these 12 schools. The report identified four major risk factors, including (1) anticipated turnover or loss of staff; (2) leadership instability; (3) lack of district support, particularly with regard to retaining principals and teachers; and (4) loss of specific interventions such as professional learning or extended day programs. Most of the case study schools had at least one of these major risk factors, and a number of schools had multiple risk factors.

It is important to note that this study cannot draw any causal conclusions and that it is based on surveys and interviews at a small number of schools that do not necessarily reflect the experiences of all low-performing schools. Still, it raises interesting questions for policymakers as they consider how best to deploy limited public resources in support of future school improvement efforts that will hopefully be long-lasting.

NCEE has a larger-scale study of SIG underway that is using rigorous methods to estimate the impact of SIG on student outcomes. The findings from the case studies report released last week may yield important contextual insights for interpreting the overall impact findings. These impact findings are due out later this year, so stay tuned.

How to Help Low-performing Schools Improve

By Thomas Wei, Evaluation Team Leader

NOTE: Since 2009, the Department of Education has invested more than $6 billion in School Improvement Grants (SIG). SIG provided funds to the nation’s persistently lowest-achieving schools to implement one of four improvement models. Each model prescribed a set of practices, for example: replacing the principal, replacing at least 50 percent of teachers, increasing learning time, instituting data-driven instruction, and using “value-added” teacher evaluations.

Other than outcomes, how similar are our nation’s low-performing schools? The answers to this question could have important implications for how best to improve these, and other, schools. If schools share similar contexts, it may be more sensible to prescribe similar improvement practices than if they have very different contexts.

This is one of the central questions the National Center for Education Evaluation and Regional Assistance is exploring through its Study of School Turnaround. The first report (released in May 2014) described the experiences of 25 case study schools in 2010-2011, which was their first year implementing federal School Improvement Grants (SIG).

The report found that even though the 25 SIG schools all struggled with a history of low performance, they were actually quite different in their community and fiscal contexts, their reform histories, and the root causes of their performance problems. Some schools were situated in what the study termed “traumatic” contexts, with high crime, incarceration, abuse, and severe urban poverty. Other schools were situated in comparatively “benign” contexts with high poverty but limited crime, homes in good repair, and little family instability. All schools reported facing challenges with funding and resources, but some felt it was a major barrier to improvement while others felt it was merely a nuisance. Some schools felt their problems were driven by student behavior, others by poor instruction or teacher quality, and still others by the school’s external context such as crime or poverty.

Given how diverse low-performing schools appear to be, it is worth wondering whether they need an equally diverse slate of strategies to improve. Indeed, the report found that the 25 case study schools varied in their improvement actions even with the prescriptive nature of the SIG models (see the chart above, showing school improvement actions used by sample schools).

It is important to note that this study cannot draw any causal conclusions and that it is based on a small number of schools that do not necessarily reflect the experiences of all low-performing schools. Still, policymakers may wish to keep this finding in mind as they consider how to structure future school improvement efforts.

The first report also found that all but one of the 25 case study schools felt they made improvements in at least some areas after the first year of implementing SIG. Among the issues studied in the second report, released April 14, 2016, is whether these schools were able to build on their improvements in the second and third year of the grant. Read a blog post on the second report.

UPDATED APRIL 18 to reflect release of second report.

Responding to the Needs of the Field

By Chris Boccanfuso, Education Research Analyst, NCEE

One of the most commonly asked questions about the Regional Educational Laboratory, or REL, program is how we choose the applied research and evaluation studies, analytic technical assistance, and dissemination activities we provide for free to stakeholders every year. The answer is simple – we don’t!

Instead, the REL staff at the Institute of Education Sciences (IES) and our contractors who run the nation’s ten regional labs listen to the voices of teachers, administrators, policymakers, and students to identify and address high-leverage problems of practice and build the capacity of stakeholders. In other words, these groups determine where the RELs should use their resources to make the largest, most lasting impact possible on practice, policy and, ultimately, student achievement.

How do the RELs do this? Through a variety of activities we collectively refer to as needs sensing. The following are a few examples of how the RELs engage in needs sensing:

Research Alliances: Research Alliances are a type of researcher–practitioner partnership in which a group of education stakeholders convenes around a specific topic of concern to work collaboratively to investigate a problem and build capacity to address it. Alliances can be made up of many types of stakeholders, such as teachers, administrators, researchers, and members of community organizations. Alliances can vary in size and scope and address a variety of topics. For instance, alliances have formed to address topics as broad as dropout prevention and as specific as Hispanic students’ STEM performance. The vast majority of the RELs’ work is driven by these research alliances.

While the RELs’ 79 research alliances are incredibly diverse, one thing each alliance has in common is that they collectively develop a research agenda. These agendas can change, as the alliance continually weighs the questions and needs of various groups against the types of services and the resources available to address these needs. Not every need has to be addressed through a multi-year research study. Sometimes, it can be addressed through a workshop, a literature review, or a “Bridge Event,” where national experts on a given topic work with practitioners to provide the information that alliance members need, when they need it. Sometimes, a need is state or district-specific, is related to the impact of a specific program, or covers a topic where the existing research literature is thin. In these cases, a research study may be most appropriate.

Governing Boards: Another way that RELs determine their work is through their Governing Boards. By law, each REL is required to have a Governing Board that consists of the Chief State School Officers (or their designees) for each state, territory, or freely associated state in the region. The Board also includes carefully selected members who equitably represent each state, as well as a broad array of regional interests, such as educating rural and economically disadvantaged populations. (A 2013 REL Northeast and Islands Governing Board meeting is pictured here.)
 
Governing Boards typically include a mix of people with experience in research, policy, and teaching practice. Each Governing Board meets two to three times per year to discuss, direct, advise, and approve each REL project that occurs in that region. The intent is to ensure that the work being done by the REL is timely, high-leverage, equitably distributed across the region, and not redundant with existing efforts.

“Ask a REL”: A third way in which the RELs engage in needs sensing is through the Ask a REL service. Ask a REL is a publicly available reference desk service that functions much like a technical reference library. It provides requestors with references, referrals, and brief responses in the form of citations on research-based education questions. RELs are able to examine trends in the topics of Ask a REL requests to verify needs determined through other methods, as well as identify new topics that may warrant additional attention.   

RELs use many additional ways to explore the needs of their region, including scans of regional education news sites, reviews of recently published research, and Stakeholder Feedback Surveys that are filled out by alliance members and attendees at REL events.

It’s a thorough and ongoing process that RELs are engaging in to address authentic, high-leverage problems of practice in a variety of ways. In the coming months, we will share stories of the many projects that were informed by this needs sensing process. Stay tuned!


Should ESSA Evidence Definitions and What Works Study Ratings be the Same? No, and Here's Why!

By Joy Lesnick, Acting Commissioner, NCEE

The Every Student Succeeds Act (ESSA), the new federal education law, requires education leaders to take research evidence into account when choosing interventions or approaches. ESSA defines three “tiers” of evidence—strong, moderate, and promising—based on the type and quality of study that was done and its findings.

Are the ESSA definitions the same as those of Institute of Education Sciences’ What Works Clearinghouse (WWC)?  Not exactly.  ESSA definitions and WWC standards are more like cousins than twins.

Like ESSA, the WWC has three ratings for individual studies – meets standards without reservations, meets standards with reservations, and does not meet standards. The WWC uses a second set of terms to summarize the results of all studies conducted on a particular intervention. The distinction between one study and many studies is important, as I will explain below.

You may be wondering: now that ESSA is the law of the land, should the WWC revise its standards and ratings to reflect the tiers and terminology described in ESSA?  Wouldn’t the benefit of making things nice and tidy between the two sets of definitions outweigh any drawbacks?

The short answer is no.

The most basic reason is that the WWC’s standards come from a decision-making process that is based in science and vetted through scholarly peer review, all protected by the independent, non-partisan status of the Institute of Education Sciences (IES). This fact is central to the credibility of the WWC’s work.  We like to think of the WWC standards as an anchor representing the best knowledge in the field for determining whether a study has been designed and executed well, and how much confidence we should have in its findings.

WWC Standards Reflect the Most Current Scientific Knowledge – and are Always Evolving

WWC standards were developed by a national panel of research experts. After nearly two years of meetings, these experts came to a consensus about what a research study must demonstrate to give us confidence that an intervention caused the observed changes in student outcomes.

Since the first WWC standards were developed over a decade ago, there have been many methodological and conceptual advances in education research. The good news is that the WWC is designed to keep up with these changes in science. As science has evolved, the WWC standards have evolved, too.

One example is the WWC’s standards for reviewing regression discontinuity (RD) design studies.  The first version of RD standards was developed by a panel of experts in 2012.  Since then, the science about RD studies has made so much progress that the WWC recently convened another panel of experts to update the RD standards. The new RD standards are now on the WWC website to solicit scholarly comment.  

When it Comes to Evidence, More is Better

The evidence tiers in ESSA set a minimum bar, based on one study, to encourage states, districts, and schools to incorporate evidence in their decision making. This is a very important step in the right direction.  But a one-study minimum bar is not as comprehensive as the WWC’s approach.

In science, the collective body of knowledge on a topic is always better than the result of a single study or observation. This is why the primary function of the WWC is to conduct systematic reviews of all of the studies on a program, policy, practice, or approach (the results of which are published in Intervention Reports like the one pictured here).

The results of individual studies are important clues toward learning what works. But multiple studies, in different contexts, with different groups of teachers and students, in different states, and with different real-world implementation challenges tell us much more about how well a program, policy, practice or approach works. And that, really, is what we’re trying to find out.

An Improved WWC Search Tool and Ongoing Support for States and Districts

One area where WWC will make changes is in how users find studies that have certain characteristics described in ESSA’s evidence tiers.  For the past 16 months, the WWC team has been hard at work behind the scenes to develop, code, and user-test a dramatically improved Find What Works tool.  We expect to release this tool, along with other changes to the WWC website, in fall 2016. (More on that in another post, but the picture below offers a sneak preview!)

These changes should further increase the utility of the WWC website, which already gets more than 300,000 hits each month and offers products that are downloaded hundreds of thousands of times each year.

We know that providing information on a website about evidence from rigorous research is just a first step.  States and districts may need additional, customized support to incorporate evidence into their decision-making processes in ways that are much deeper than a cursory check-box approach.

To meet that need, other IES programs are ready to help. For example, IES supports 10 Regional Educational Laboratories (RELs) that provide states and districts with technical support for using, interpreting, and applying research. At least two researchers at every REL are certified as WWC reviewers (meaning they have in-depth knowledge of the WWC standards and how the standards are applied), and every REL has existing relationships with states and districts across the nation and outlying regions. Because the RELs are charged with meeting the needs of their regions, every chief state school officer (or designee) sits on a REL Governing Board, which determines the annual priorities of the REL in that area.

As states prioritize their needs and identify ways to incorporate evidence in their decisions according to the new law, the WWC database of reviewed studies will provide the information they need, and the RELs will be ready to help them use that information in meaningful ways.


Practice Guides: How to Use What Works in the Classroom

By Diana McCallum, NCEE

With new education research released every day, it can be difficult to know which teaching methods and classroom practices have been tested and shown to improve student outcomes. You want to know what really works and how to use evidence-based practices in your school or classroom.

What Works Clearinghouse practice guides help bridge the gap between research and practice by examining the findings from existing research studies and combining them with expert advice about applying these findings in the classroom. For each guide, a team of nationally-recognized practitioners and researchers work closely with the WWC to combine evidence from research with guidance from hands-on experience.

Practice guides offer specific recommendations that include a description of the supporting research, steps for carrying out the recommendation, and strategies you can use to overcome potential challenges. Many of the guides also feature supplementary materials, like videos and summaries, to help you quickly find what you need.

One example is our most recent practice guide, Teaching Strategies for Improving Algebra Knowledge in Middle and High School Students. Mastering algebra helps students move from arithmetic operations to understanding abstract concepts, and is a key to success in future mathematics courses, including geometry and calculus. The guide presents three evidence-based recommendations educators can use to help students develop a deeper understanding of algebra, promote process-oriented thinking, and encourage precise communication. These recommendations help address common challenges in algebra instruction and focus on:

  • Utilizing the structure of algebraic representations to make connections among problems, solution strategies, and representations; 
  • Incorporating solved problems into classroom instruction and activities to help students develop their algebraic reasoning skills; and
  • Comparing and selecting from alternative algebraic strategies to give students flexibility when solving problems. 

You can read the Practice Guide Summary for a quick overview of these recommendations, or spend a few minutes watching videos in which Jon Star, of Harvard University’s Graduate School of Education, explains the recommendations.

Teaching Strategies for Improving Algebra Knowledge in Middle and High School Students is just one of 19 practice guides available on the What Works Clearinghouse website. Others include:

  • Teaching Math to Young Children: Preschool and kindergarten teachers can get details on how to improve math lessons with this guide, including strategies to create a math-rich environment. You’ll find examples of classroom activities and games that can supplement lesson plans and provide opportunities for children to learn math.

You can find information and links to all 19 practice guides on our website. The collection covers a variety of other math and literacy topics, as well as dropout prevention, using data to monitor student progress and inform decisions, and preparing students for college.

Visit whatworks.ed.gov to find the practice guide that’s right for you or to suggest a topic you’d like us to explore.

Dr. McCallum is an education research analyst on the What Works Clearinghouse team.

About the What Works Clearinghouse (WWC)

For more than a decade, the goal of the WWC has been to provide educators with the information they need to make evidence-based decisions with the aim of improving student outcomes. Established by the U.S. Department of Education’s Institute of Education Sciences, the WWC strives to be a central and trusted source of scientific evidence on education programs, products, practices, and policies. Follow us on Twitter and Facebook.

 

Regional Educational Laboratories: Connecting Research to Practice

By Joy Lesnick, Acting Commissioner, NCEE

Welcome to the NCEE Blog! 

We look forward to using this space to provide information and insights about the work of the National Center for Education Evaluation and Regional Assistance (NCEE). A part of the Institute of Education Sciences (IES), NCEE’s primary goal is providing practitioners and policymakers with research-based information they can use to make informed decisions. 

We do this in a variety of ways, including large-scale evaluations of education programs and practices supported by federal funds; independent reviews and syntheses of research on what works in education; a searchable database of research citations and articles (ERIC); and reference searches from the National Library of Education. We will explore more of this work in future blogs, but in this post I’d like to talk about an important part of NCEE—the Regional Educational Laboratories (RELs).

It’s a timely topic. Last week, the U.S. Department of Education released a solicitation for organizations seeking to become REL contractors beginning in 2017 (the five-year contracts for the current RELs will conclude at the end of 2016). The REL program is an important part of the IES infrastructure for bridging education research and practice. Through the RELs, IES seeks to ensure that research does not “sit on a shelf” but rather is broadly shared in ways that are relevant and engaging to policymakers and practitioners. The RELs also involve state and district staff in collaborative research projects focused on pressing problems of practice. An important aspect of the RELs’ work is supporting the use of research in education decision making – a charge that the Every Student Succeeds Act has made even more critical.

The RELs and their staff must be able to navigate comfortably between the two worlds of education research and education practice, and understand the norms and requirements of both. As part of this navigating, RELs focus on: (1) balancing rigor and relevance; (2) differentiating support to stakeholders based on need; (3) providing information in the short term, and developing evidence over the long term; and (4) addressing local issues that can also benefit the nation.

While the RELs are guided by federal legislation, their work reflects – and responds to – the needs of their communities. Each REL has a governing board composed of state and local education leaders that sets priorities for REL work. Also, nearly all REL work is conducted in collaboration with research alliances, which are ongoing partnerships in which researchers and regional stakeholders work together over time to use research to address an education problem.

Since the current round of RELs was awarded in 2012, these labs and their partners have conducted meaningful research resulting in published reports and tools, held hundreds of online and in-person seminars and training events attended by practitioners across the country, and produced videos of their work that you can find on the REL Playlist on the IES YouTube site. Currently, the RELs have more than 100 projects in progress. RELs work on nearly every topic that is crucial to improving education—kindergarten readiness, parent engagement, discipline, STEM education, college and career readiness, teacher preparation and evaluation, and much more.

IES’s vision is that the 2017–2022 RELs will build on and extend the current priorities of high-quality research, genuine partnership, and effective communication, while also tackling high-leverage education problems. High-leverage problems are those that: (1) if addressed, could result in substantial improvements in education outcomes for many students or for key subgroups of students; (2) are priorities for regional policymakers, particularly at the state level; and (3) require research or research-related support to address well. Focusing on high-leverage problems increases the likelihood that REL support ultimately will contribute to improved student outcomes.

Visit the IES REL website to learn more about the 2012–2017 RELs and how you can connect with the REL that serves your region. Visit the FedBizOpps website for information about the competition for the 2017–2022 RELs.