Standards for Excellence in Education Research

Rigorous education research that is transparent, actionable, and focused on consequential outcomes has the potential to dramatically improve student achievement. Since 2002, the Institute of Education Sciences (IES) has supported rigorous evidence-building about education policy, programs, and practices. IES's Standards for Excellence in Education Research (SEER) Principles complement the What Works Clearinghouse's (WWC's) focus on internal validity by supporting a distinct, IES-wide effort that emphasizes additional factors that can make research transformative.

The SEER Principles were first introduced by IES Director Mark Schneider in two blog posts. IES has iterated on SEER and expects to continue doing so as it receives feedback from researchers, practitioners, and policymakers. To provide feedback, please contact NCEE.Feedback@ed.gov.

The SEER Principles encourage researchers to:

  • Pre-register studies
    • Did the researcher execute the research and analysis activities as originally proposed in a recognized study registry?
    • Did the researcher describe key elements of the study protocol, including a limited number of primary outcomes and plans for their analysis, in their registration?
    • Did the researcher clearly explain any deviations from those plans and offer a reasonable rationale for doing so?
    • Did the researcher report on each of the primary outcomes registered at the study's outset?

  • Make findings, methods, and data open
    • Did the researcher make research publications freely available to others via ERIC?
    • Did the researcher provide access to the final research data of all publications, while protecting the rights and privacy of human subjects at all times?

  • Identify interventions' core components
    • Did the researcher document the core components of an intervention, including its essential practices, structural elements, and the contexts in which it was implemented and tested?
    • Did the researcher offer a clear description of how the core components of an intervention are hypothesized to affect outcomes?
    • Did the researcher's analysis help us understand which components are most important in achieving impact?

  • Document treatment implementation and contrast
    • Did the researcher document the process by which the treatment was implemented, including:
      • change management approaches used to deploy the treatment;
      • how the treatment was—or was not—integrated into related programs and practices; and
      • potential accelerants of, or barriers to, instantiating the treatment that researchers and practitioners encountered?
    • Did the researcher measure the fidelity of an intervention's implementation?
    • Did the researcher document, and identify opportunities to learn from, adaptations of the intervention that were observed during implementation?
    • Did the researcher document the counterfactual condition(s), including measures of essential elements of the treatment contrast between participants in the treatment and control conditions as identified by an intervention's core components?

  • Analyze interventions' costs
    • Did the researcher measure the cost of components of the intervention relative to the control or comparison condition?
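Cost analyses of this kind are often summarized as an incremental cost per student: the total cost of the treatment condition, minus the cost of the control or comparison condition, divided by the number of students served. A minimal sketch of that arithmetic, using entirely hypothetical figures (not drawn from any IES-supported study):

```python
# Incremental cost per student:
#   (treatment-condition cost - control-condition cost) / students served.
# All figures below are hypothetical, for illustration only.

treatment_costs = {   # annual costs under the treatment condition
    "personnel": 120_000,
    "materials": 15_000,
    "training": 10_000,
}
control_costs = {     # business-as-usual costs under the control condition
    "personnel": 100_000,
    "materials": 5_000,
}

students_served = 500

# Difference in total cost between conditions, spread over students served.
incremental_cost = sum(treatment_costs.values()) - sum(control_costs.values())
cost_per_student = incremental_cost / students_served
print(cost_per_student)  # → 80.0
```

Reporting the cost of each component separately, as the principle asks, also lets decision-makers see which ingredients (personnel, materials, training) drive the incremental cost.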

  • Focus on meaningful outcomes
    • Did the researcher explore outcomes that are broadly recognized as useful for measuring students' learning, opportunities in education, or success from education, and that are appropriate to the goals and audiences of the research?
    • Did the researcher explore variation in outcomes by subpopulations of interest?
    • Did the researcher explore whether treatment effects were observed over time, particularly on distal measures an intervention might be thought to address?

  • Facilitate generalization of study findings
    • Did the researcher, through intentional sampling or other means, design the study to permit ready generalization of its findings to populations of interest?
    • If not done a priori, did the researcher make statistical adjustments to their analytic sample to support generalizing study findings to populations of interest?
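One common statistical adjustment of this kind is post-stratification reweighting: subgroups in the analytic sample are reweighted so that their shares match a target population, and the overall effect is then a population-weighted average of subgroup effects. A minimal sketch with hypothetical strata and numbers (illustrative only, not a prescribed IES method):

```python
# Post-stratification sketch: reweight sample strata to match a target
# population, then compute a population-generalized treatment effect.
# All shares and effects below are hypothetical.

population_shares = {"urban": 0.30, "suburban": 0.40, "rural": 0.30}
sample_shares     = {"urban": 0.50, "suburban": 0.35, "rural": 0.15}

# Post-stratification weight for each stratum = population share / sample share.
weights = {s: population_shares[s] / sample_shares[s] for s in population_shares}

# Hypothetical treatment effects estimated within each stratum.
stratum_effects = {"urban": 0.10, "suburban": 0.08, "rural": 0.05}

# Generalized effect: average of stratum effects weighted by the
# population's (not the sample's) stratum shares.
generalized_effect = sum(
    population_shares[s] * stratum_effects[s] for s in population_shares
)
print(round(generalized_effect, 3))  # → 0.077
```

The same logic underlies more formal generalization methods; the point the principle makes is that without such weighting (or a priori representative sampling), the raw sample average would over-represent whichever strata were easiest to recruit.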

  • Support scaling of promising results
    • Did the researcher execute the research in settings and with student populations such that it can inform extending the reach of promising interventions?
    • Did the researcher explore factors that can inform the efficacy and sustainability of the intervention at scale?
    • Did the researcher develop materials that could support the replication and/or scaling of an intervention by others, such as manuals, toolkits, or implementation guides?