IES Blog

Institute of Education Sciences

Companion Guidelines on Replication and Reproducibility in Education Research

Just over five years ago, the Institute of Education Sciences (IES) and the National Science Foundation (NSF) released the Common Guidelines for Education Research and Development. The Guidelines described the expected purposes, justifications, and contributions of various types of research aimed at building our knowledge of interventions and strategies for improving teaching and learning. Since 2013, there has been increased attention to replication and reproducibility studies and their role in building the evidence base. In response to this interest and the importance of this work, the two organizations have jointly issued the new Companion Guidelines on Replication and Reproducibility in Education Research to supplement the Common Guidelines. The companion document provides guidance on the steps researchers can take to promote corroboration, ensure the integrity of research, and extend the evidence base.

The Companion Guidelines identify principles to help education stakeholders design and report reproducibility and replication studies. These principles are consistent with and draw from guidelines issued by scientific and professional organizations and advisory committees, and they have emerged in consultation with the field (e.g., Dettmer, Taylor, and Chhin, 2017; Subcommittee on Replicability in Science, 2015). The principles address three main areas: (1) replication and reproducibility at the proposal stage, (2) promoting transparency and openness in designing studies, and (3) considerations in the reporting of results.

Although the importance of reproducibility and replication studies for advancing scientific knowledge has been widely acknowledged, there are several challenges for researchers in our field, including actual or perceived disincentives (e.g., publication bias; reputation and career advancement norms; emphases on novel, potentially transformative lines of inquiry), implementation difficulties (especially for direct replications), and complexities of interpreting results (e.g., lack of consensus on what it means to “replicate” findings, low statistical power for replications). Grant funding agencies such as IES and NSF as well as education researchers have a role to play in addressing these challenges, promoting reproducibility and replication studies, and ultimately moving the field forward.

Why focus on replication and reproducibility?

The original Common Guidelines document did not substantively address issues pertaining to the replication and reproducibility of research. Given the interest in and importance of this work, IES and NSF are providing additional clarity to the field in terms of common definitions and principles around replication and reproducibility.

Who is the audience for the Companion Guidelines on Replication and Reproducibility? 

The primary audience for this document is education researchers; however, education research funding agencies and reviewers of grant applications are additional audiences for this document.

How should this document be used by researchers intending to apply for grants to conduct a reproducibility or replication study?

This document is meant to highlight the importance of replication and reproducibility studies and to offer guidelines to education stakeholders for thinking about and promoting reproducibility and replication in education research. It does not supersede the guidance provided in the requests for applications provided by IES and NSF. 

What are the guiding principles for proposing replication and reproducibility studies?

The overarching principles at the grant proposal stage are as follows:

  1. Clarify how reproducibility or replication studies would build on prior studies and contribute to the knowledge base.
  2. Clearly specify any variations from prior studies and the rationale for such variations.
  3. Ensure objectivity (e.g., by conducting an independent investigation, or by putting safeguards in place if the original investigator(s) is involved).

In addition to these principles, the document also lays out principles for promoting transparency, open science, and reporting results.

Read the full Companion Guidelines here.



Basic Science of Learning and Development Within Education: The IES Investment

I came to the Institute of Education Sciences (IES) in 2002 to build connections between education and the basic science of learning and development. The weak links between these two fields were surprising to me, given how foundational such science is to the very purpose of education.

IES had just launched the Cognition and Student Learning research program[1], and researchers were invited to submit applications examining whether principles of learning established in basic science were robust when examined in education settings. Six years later, we launched the Social and Behavioral Context for Academic Learning research program to understand the ways in which the social environment of classrooms and schools affects learning. Through these two programs, IES has invested over $445 million, an investment that has contributed substantially to our foundational knowledge of teaching and learning.

I was surprised by a recent blog post by Bob Pianta and Tara Hofkens. While they acknowledge the research that IES has supported to transform education practice, they do not seem to recognize our substantial, ongoing investments in the basic science of teaching and learning, both in and out of classrooms.

In part, this may reflect their perception of what types of work we support under our Exploration goal – which is not limited to “scouring databases” but instead involves all types of research, including small-scale experiments and longitudinal studies. These projects generate foundational knowledge about what factors are associated with learning outcomes and can potentially be changed through education. In fact, the questions that Pianta and Hofkens want answered by the basic science of education are the same questions that some IES grantees have been examining over the course of the last 15 years. 

Here are just a few examples.

  1. What factors regulate children's attention in a classroom setting? Anna Fisher and her team found that cluttered classroom walls in kindergarten led to greater distraction and less learning, a finding that captured the imagination of the nation and its educators.
  2. What roles do the capabilities of peers play in advancing children's cognitive capabilities? A new study led by Adrienne Nishina is examining how students' ability to think about situations from different perspectives is related to their day-to-day interactions with peers from diverse backgrounds.
  3. What factors promote or inhibit teachers' responses to children's perceived misbehavior? Teachers’ expertise and teachers’ emotional competencies are two factors that IES-funded researchers have found to relate to their responses to children’s behavior.
  4. What role do social and emotional experiences and affective processes play in fostering learning? Shannon Suldo and her team found that the coping strategies high school students choose to manage their responses to stressors are linked to learning outcomes.
  5. What are the components of school climate that matter the most for different forms of student success? Two recent projects, one in Cleveland, and one in Virginia, are using survey data to explore the relationship between school climate, social behavioral competencies and academic outcomes. The teams are also exploring how those relationships vary within student subgroups.

Funding the basic science of teaching and learning, in and out of classrooms, has been and will continue to be a cornerstone of IES's work. The IES investment in this area is broad and is reflected in books such as Make It Stick: The Science of Successful Learning; Becoming Brilliant: What Science Tells Us About Raising Successful Children; and Educator Stress: An Occupational Health Perspective.

Importantly, IES is not the only funder in this area. The National Science Foundation invests substantially in its Science of Learning portfolio, the McDonnell Foundation's Understanding Human Cognition portfolio includes an explicit request for projects at the intersection of cognition and education, and the Child Development and Behavior Branch of the Eunice Kennedy Shriver National Institute of Child Health and Human Development (NICHD) supports a variety of relevant research programs. I agree that we need systematic investment in the basic science of teaching and learning. But we must build on what we have already learned.

We are grateful that Pianta and Hofkens recognize the importance of investing in this area. Perhaps the fact that they did not acknowledge the substantial investments and contributions IES has made in exploring the important questions they pose is an IES problem. While we have invested heavily in the science of learning, we have skimped on brand development and self-promotion. If someone as central to the field as Pianta, who has received several IES grants, including research training grants, doesn't know what IES has done, that is a red flag we will need to attend to.

In the meantime, we hope that this brief glimpse into our investment to date has illustrated some of the questions that the basic science of teaching and learning within education can answer. More importantly, here's where you can seek funding for this type of work.

Elizabeth Albro

Commissioner, National Center for Education Research


[1] IES was authorized in November 2002. The Cognition and Student Learning research program was launched by the Office of Educational Research and Improvement, the office from which IES was created.

Sharing strategies to increase research-based educational practices

By Cora Goldston, REL Midwest


How can states, districts, and schools identify effective practices to address challenges and achieve their goals? Education research can point the way, but sometimes finding and accessing relevant research can be a frustrating and time-consuming process. And even when practitioners can find research, it can be difficult to determine a study’s rigor and the strength of research evidence supporting interventions.

Equipping practitioners to use research evidence

Through the Midwest Alliance to Improve Knowledge Utilization (MAIKU), the Regional Educational Laboratory (REL) Midwest is partnering with practitioners to help states, districts, and schools use research to inform practice. The goal is to make it easier for educators to find research relevant to their priorities, assess the level of evidence that supports potential practices, and implement those practices that are based on strong evidence.

REL Midwest and MAIKU are supporting the use of research in education practice in several ways. For example, REL Midwest provided coaching sessions for the Ohio Department of Education (ODE) on understanding the Every Student Succeeds Act (ESSA) tiers of evidence. In addition, REL Midwest created a crosswalk that shows how the ESSA evidence tiers align with ratings from research clearinghouses, such as the What Works Clearinghouse. In turn, ODE is using this information to help Ohio districts that are applying for Striving Readers grants. To receive the grants, districts must demonstrate that they plan to use research-based practices to improve student literacy. As a result of REL Midwest’s support, ODE has strengthened its capacity to help districts determine the level of evidence supporting certain practices and, thus, to submit stronger grant applications.

REL Midwest is providing similar support across the region. In Michigan, we are conducting coaching sessions for the state Department of Education to help agency leadership choose priorities from the state’s Top 10 in 10 plan, identify research-based practices that support those priorities, and collaborate to implement new state-level practices. In Wisconsin, REL Midwest hosted a training series for the Department of Public Instruction to increase the agency’s capacity to collect, analyze, and use data to adjust state-level policies and practices. And in Illinois, REL Midwest is holding a training series for the State Board of Education on research methods, data collection, and data analysis and how to use the findings to inform agency practices.

June webinar on increasing evidence use

MAIKU is also working with researchers to support evidence use in education practice. On June 19, 2018, REL Midwest and MAIKU hosted a webinar to discuss how researchers can share evidence with practitioners in useful and accessible ways.

The webinar featured a presentation by Alan J. Daly, Ph.D., of the University of California at San Diego, and Kara Finnigan, Ph.D., of the University of Rochester. Dr. Daly and Dr. Finnigan discussed how information-sharing networks are structured among school and district staff and the challenges for practitioners in accessing and using research-based practices.   

Building on this context, Dr. Daly and Dr. Finnigan shared insights about the most effective ways to maximize the reach of research. One of their key findings is that the pattern of people’s social ties makes a difference for sharing and using research-based practices. Finnigan and Daly noted that the set of relationships we have can increase access to research evidence if the right ties are present but can constrain access to resources when those ties are not present. The quality of relationships also matters; high levels of trust are essential for more in-depth exchanges of information. The takeaway: fostering both the quantity and quality of social relationships is important for sharing research evidence.  

During the webinar, Jaime Singer, senior technical assistance consultant at REL Midwest, also shared actionable strategies that researchers can use to support evidence use in practice, including training and coaching sessions, checklists, blog posts, and clearinghouses of effective practices.

The webinar included a panel discussion about REL Midwest’s ESSA evidence tiers coaching sessions and crosswalk for ODE. REL Midwest researcher Lyzz Davis, Ph.D., provided a researcher perspective on developing resources to meet ODE’s needs. Heather Boughton, Ph.D., and Melissa Weber-Mayrer, Ph.D., at ODE provided practitioner perspectives on how REL Midwest’s work has strengthened the agency’s capacity to help districts find and use evidence-based interventions.

Introducing the NCES Ed Tech Equity Initiative

The 21st century American classroom continues to evolve, particularly through the incorporation of technology into K-12 learning. In response to these changes, the National Center for Education Statistics (NCES) consistently works to ensure our data collections include information on how these changes affect U.S. education.

Technology is changing how teachers teach, as well as what, how, and where students learn. As a tool, technology has the potential to improve our education system by creating more equitable circumstances for all. However, while technology has helped improve educational experiences and outcomes for some, inequities persist. That's why the NCES Ed Tech Equity Initiative was created: to better inform understanding of the condition of American education by giving greater focus to the relationship between technology and K-12 students' education.

Within the framework, we define technology as digital resources (e.g., internet, phones, laptops, tablets, and software). Ed Tech Equity, or education technology and equity, refers to fairness in the relationship between technology and students' educational experiences and outcomes.

THE FRAMEWORK

The Ed Tech Equity Framework serves as the conceptual anchor for the Initiative—it captures the most critical factors that influence ed tech equity as it relates to K-12 education. The framework was created following extensive research and feedback. NCES reviewed existing NCES data collections and reports, as well as relevant research external to NCES. Additionally, we consulted NCES staff and stakeholders, including teachers, principals, and researchers. Stay tuned for a more in-depth look at the framework in our next blog post.

EXISTING TECHNOLOGY-RELATED EFFORTS

Another critical step in advancing this work included completing a comprehensive internal review of NCES’ current tech-related efforts to understand what tech-related items are already collected, reported, and disseminated. Through this review, we found that a number of NCES surveys collect and report tech-related information. However, there is room for NCES to improve upon these existing efforts. As one of the first steps in this direction, NCES convened a panel of experts to share their insights and recommendations for ed tech equity data collection, reporting, and dissemination.

OUR VISION

It is important that NCES remains agile in its pursuit of comprehensive and timely data on the condition of education across the country. Through this Initiative, we intend to provide researchers, policymakers, educators, parents, and students with user-friendly data that shed light on the relationship between technology and K-12 education.

While we’ve accomplished a great deal thus far, we’re excited to continue to advance this Initiative and to share our results!


By Halima Adenegan, NCES and Emily Martin, Hager Sharp

Looking for evidence outside of the scope of the WWC?

by Chris Weiss and Erin Pollard, What Works Clearinghouse

The What Works Clearinghouse (WWC) strives to be a central and trusted source of research evidence for what works in education. But did you know that the WWC is one of several repositories of evidence produced by the federal government? Our mission at the WWC is to review the existing research on different programs, products, practices, and policies in education to provide educators with the information they need to make evidence-based decisions. However, several other government repositories also review evidence on interventions that affect children and schools, and those reviews may be of use and interest to WWC users.


Different Clearinghouses for Different Needs.

The missions of the different clearinghouses, and the reasons for their different reviews, stem from the unique mission of each agency and the specific focus of each clearinghouse. The Department of Education focuses primarily on prekindergarten through postsecondary education; however, many public health and crime prevention programs are implemented through schools. So, for example, you would find information about a school-based bullying prevention program on the National Institute of Justice's CrimeSolutions website. The WWC would not review the evidence of this program's effectiveness because its aim is to reduce bullying and victimization rather than to improve education-focused outcomes.


Some interventions are reviewed by multiple clearinghouses.

Users are often surprised that an intervention might be reviewed by multiple clearinghouses. For example, the WWC reviewed the evidence and created an intervention report on Career Academies, a school-within-a-school program in which students take both career-related and academic courses and acquire work experience. But reviews of the program are included in other clearinghouses. The Department of Labor's CLEAR reviewed the study because the intervention increased students' earnings. Similarly, the National Institute of Justice's CrimeSolutions reviewed the intervention because it showed an effect on increasing the earnings of young men, an economic factor linked to lowered risk of criminal activity. Each clearinghouse looked at different outcomes from the same study to highlight the domains most relevant to its own goals.


Each repository is different. The WWC may be your best bet – or others may fit your needs better.

We encourage users to look at the other clearinghouses to find information on outcomes that are outside of our scope. These sites have a lot of great information to offer. Here is a list of the other repositories for finding evidence:

  • Clearinghouse for Labor Evaluation and Research (CLEAR) – Department of Labor. CLEAR's mission is to make research on labor topics more accessible to practitioners, policymakers, researchers, and the public more broadly so that it can inform their decisions about labor policies and programs. CLEAR identifies and summarizes many types of research, including descriptive statistical studies, outcome analyses, implementation studies, and causal impact studies.
  • Compendium of Evidence-Based Interventions and Best Practices for HIV Prevention – Centers for Disease Control and Prevention. Evidence-based interventions and best practices in the Compendium are identified by CDC's Prevention Research Synthesis (PRS) Project through a series of ongoing systematic reviews. Each eligible intervention is evaluated against explicit a priori criteria and has shown sufficient evidence that the intervention works. Interventions may fall into one or more chapters, including: Risk Reduction, which includes PrEP-related outcomes and outcomes such as injection drug use, condom use, and HIV/STD/hepatitis infection; Linkage to, Retention in, and Re-engagement in HIV Care, which includes outcomes such as entering and staying in HIV care; Medication Adherence, which includes outcomes such as adhering to HIV medication and HIV viral load; and the most recently added Structural Interventions, which includes outcomes such as HIV testing, social determinants of health, and stigma. Information sheets are available for all identified evidence-based interventions and best practices on the PRS Compendium website.
  • CrimeSolutions – National Institute of Justice, Department of Justice. The clearinghouse, accessible via the CrimeSolutions.gov website, presents programs and practices that have undergone rigorous evaluations and meta-analyses. The site assesses the strength of the evidence about whether these programs achieve criminal justice, juvenile justice, and crime victim services outcomes in order to inform practitioners and policymakers about what works, what doesn't, and what's promising.
  • Evidence Exchange - Corporation for National and Community Service. A digital repository of sponsored research, evaluation reports, and data. These resources focus on national service, volunteering, and civic engagement.
  • Home Visiting Evidence of Effectiveness (HomVEE) – Administration for Children and Families, Department of Health and Human Services. HomVEE provides an assessment of the evidence of effectiveness for home visiting models that target families with pregnant women and children from birth to kindergarten entry (that is, up through age 5).
  • Teen Pregnancy Prevention (TPP) Evidence Review – Department of Health and Human Services. A transparent systematic review of the teen pregnancy prevention literature to identify programs with evidence of effectiveness in reducing teen pregnancy, sexually transmitted infections, and associated sexual risk behaviors.
  • The Community Guide - Community Preventive Services Task Force (CPSTF). A collection of evidence-based findings to help you select interventions to improve health and prevent disease in your state, community, community organization, business, healthcare organization, or school. The CPSTF issues findings based on systematic reviews of effectiveness and economic evidence that are conducted with a methodology developed by the CPSTF.
  • youth.gov – Interagency. The youth.gov Program Directory features evidence-based programs whose purpose is to prevent and/or reduce delinquency or other problem behaviors in young people.