Rigorous education research that is transparent, actionable, and focused on consequential outcomes has the potential to dramatically improve student achievement.
Since 2002, IES has supported rigorous evidence-building about education policy, programs, and practices. The Standards for Excellence in Education Research (SEER) complement the WWC's focus on internal validity by supporting a distinct, IES-wide effort that emphasizes additional factors that can make research transformational.
SEER encourages researchers to:
- Pre-register studies
- Make findings, methods, and data open
- Address inequities in learners' opportunities, access to resources, and outcomes
- Identify interventions' components
- Document treatment implementation and contrast
- Analyze interventions' costs
- Use high-quality outcome measures
- Facilitate generalization of study findings
- Support scaling of promising interventions
Pre-register studies
SEER Standards
- Causal impact studies must be pre-registered in a recognized study registry, documenting their confirmatory research questions and planned analytic activities.
- Researchers should execute research and analysis activities as proposed in their original study registration.
- When deviations from pre-registered plans occur, researchers must update their registry entries and explain why the change was made.
Resources
There are several options for pre-registration, including the Registry of Efficacy and Effectiveness Studies (REES), the Open Science Framework (OSF), ClinicalTrials.gov, the AEA RCT Registry, EGAP, AsPredicted, and trial registries in the WHO Registry Network.
Make findings, methods, and data open
SEER Standards
- Researchers must submit the electronic versions of final manuscripts to an archive site that will make them freely available to others.
- Note: Consistent with the IES Public Access to Research Policy, IES grantees and contractors must use ERIC for this purpose.
- Researchers must provide access to the final research data arising from their activities, while protecting the rights of study participants and the confidentiality of the data in a manner consistent with the requirements of the responsible institutional review board (IRB) as well as applicable state and Federal laws and regulations.
- Note: Consistent with the IES Public Access to Research Policy, grantees and contractors testing the causal impacts of policies, practices, and/or interventions are required to develop a data management plan (DMP). DMPs must specify how final research data will be shared.
Resources
- Read the NCEE Publication Sharing Study Data: A Guide for Education Researchers, by Neild, Robinson, and Agufa (2022)
- View the archived webinar on Sharing Study Data: A Guide for Education Researchers
- Learn more about IES's Public Access Policy
- Learn more about the government-wide mandate to provide open access to federal science information at science.gov
- Access the ERIC Grantee and Online Submission System
- Access ERIC FAQs for complying with the Public Access Policy
- Read the IES Implementation Guide for Public Access to Research Data and its related FAQ
- Watch recorded sessions from previous IES Principal Investigators' Meetings relevant to this topic.
Address inequities in learners' opportunities, access to resources, and outcomes
SEER Standard
- Researchers who are designing and testing interventions must clearly demonstrate how those interventions address education inequities, such as by improving learners' outcomes and/or their access to resources and opportunities.
Recommendations
- Researchers should discuss how their study conceptualizes education equity, and how the study's design, sample, measurement, analysis, and reporting align to that conceptualization.
- When feasible, researchers should design studies that allow valid estimates to be calculated for different groups within the sample to improve our understanding of the extent to which policies, practices, and interventions yield varying outcomes for different groups, especially those groups that have been historically underserved.
- Researchers should design interventions that take into account the contexts and systems in which they will be implemented.
- Researchers should describe how they will consider input from learners, educators, and/or other key stakeholders when conceptualizing, designing, and reporting the results of their research, and when considering issues critical for implementation and scaling of interventions.
Resources
- Read Best Practices and Challenges for Embedding Equity in Education Research Technical Working Group Summary (November/December 2022)
What We're Reading and Watching
Cerna, O., Conliffe, B., & Wilson, A. (2021). Guiding questions for supporting culturally responsive evaluation practices and an equity-based perspective. New York, NY: MDRC.
Woodson, T. T. (2021). Using a culturally responsive and equitable evaluation approach to guide research and evaluation. Princeton, NJ: Mathematica. Retrieved from https://www.mathematica.org/publications/using-a-culturally-responsive-and-equitable-evaluation-approach-to-guide-research-and-evaluation
Visit the NCER/NCSER blog "The 2022 IES PI Meeting: Advancing Equity & Inclusion in the Education Sciences," which includes meeting highlights and links to several presentations of interest.
Identify interventions' components
SEER Standards
- Researchers must document the components of an intervention, including its essential practices and structural elements.
- Researchers must offer a clear description of how the components of an intervention are hypothesized to affect outcomes.
Recommendation
- Researchers' analyses should explore which components are most important in achieving impact—that is, which components are core to the intervention's efficacy.
Resources
Read Developing a Core Components Nomenclature in Education: An Update on IES-funded work to learn about IES's initial efforts to develop a common framework for articulating the components of interventions in various topic areas.
What We're Reading
Blase, K. & Fixsen, D. (2013). Core intervention components: Identifying and operationalizing what makes programs work. ASPE Research Brief. Washington, DC: Office of the Assistant Secretary for Planning and Evaluation, Office of Human Services Policy, U.S. Department of Health and Human Services. Retrieved from https://aspe.hhs.gov/report/core-intervention-components-identifying-and-operationalizing-what-makes-programs-work
Ferber, T., Wiggins, M. E., & Sileo, A. (2019). Advancing the use of core components of effective programs. Washington, DC: The Forum for Youth Investment. Retrieved from https://forumfyi.org/knowledge-center/advancing-core-components/
Document treatment implementation and contrast
SEER Standards
- Researchers must document how, and the context within which, the treatment was implemented.
- Researchers must document the counterfactual condition(s), including its context.
- Researchers must measure the essential elements of the treatment contrast between the treatment and control conditions.
- Researchers must measure the fidelity of an intervention's implementation.
Recommendation
- Researchers should document, and identify opportunities to learn from, adaptations of the intervention that were observed during implementation.
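As a minimal sketch of what measuring implementation fidelity and treatment contrast can look like, here is an illustration with hypothetical classroom observation data (the focal practice, the minutes observed, and the 90-minute fidelity target are all assumptions for illustration, not drawn from any study):

```python
from statistics import mean

# Hypothetical observation data: minutes per week of the focal practice
# (e.g., small-group tutoring) observed in each classroom, by condition.
treatment_minutes = [95, 110, 80, 120, 100]
control_minutes = [30, 45, 20, 35, 40]  # "business as usual" classrooms

# Treatment contrast: difference in the essential element across conditions.
contrast = mean(treatment_minutes) - mean(control_minutes)

# Fidelity: share of treatment classrooms meeting an assumed 90-minute target.
fidelity = mean(m >= 90 for m in treatment_minutes)

print(f"Treatment contrast: {contrast:.0f} minutes/week")
print(f"Implementation fidelity: {fidelity:.0%}")
```

The point of the sketch is that contrast is a property of both conditions: a program can be implemented with high fidelity yet show little contrast if control classrooms already receive a similar practice.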
Resources
- Read the NCEE Publication Conducting Implementation Research in Impact Studies of Education Interventions: A Guide for Researchers, by Hill, Scher, Haimson, and Granito (2023).
- View the archived webinar on Conducting Implementation Research in Impact Studies of Education Interventions: A Guide for Researchers
- View the archived webinar on Prioritizing and Selecting Context Features in Education Impact Studies: A 4R Lenses Approach, and the webinar slides.
What We're Reading
Hamilton, G., & Scrivener, S. (2018). Measuring treatment contrast in randomized controlled trials. MDRC Working Paper.
Lemons, C. J., Fuchs, D., Gilbert, J. K., & Fuchs, L. S. (2014). Evidence-based practices in a changing world: Reconsidering the counterfactual in education research. Educational Researcher, 43(5), 242–252. https://doi.org/10.3102/0013189X14539189
Weiss, M., Bloom, H. S., & Brock, T. (2014). A conceptual framework for studying the sources of variation in program effects. Journal of Policy Analysis and Management, 33(3), 778–808. https://doi.org/10.1002/pam.21760
Analyze interventions' costs
SEER Standards
- Researchers must document the type and quantity of resources (e.g., personnel, materials and equipment, facilities, and other inputs) required to implement the interventions they study and document the economic cost of those resources.
- When cost analyses are conducted, researchers must:
- identify the perspective from which the cost analysis was conducted (e.g., societal perspective, district perspective) and provide the reason for taking this perspective;
- clearly state all the assumptions used in estimating costs and offer a justification for those assumptions;
- describe any adjustments made to prices to amortize costs or to account for inflation, regional price differences, and the time value of money and explain a rationale for those adjustments; and
- present cost metrics and cost breakdowns that help education decisionmakers understand the resource requirements and costs of implementing interventions.
Recommendations
- When feasible, researchers conducting a cost analysis should provide a "reference case analysis" in which they adopt a societal perspective (i.e., include costs to all stakeholders), use national average prices, and use a 3 percent discount rate to calculate present values.
- Even if not otherwise required, researchers should consider measuring the cost of components of the intervention relative to the control or comparison condition.
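The reference case conventions above can be illustrated with a small worked example. The cost figures, years, and student count below are hypothetical; only the 3 percent discount rate comes from the recommendation:

```python
# Hypothetical per-year resource costs for a 3-year intervention,
# valued at national average prices (illustrative numbers only).
annual_costs = [120_000, 95_000, 90_000]  # years 0, 1, 2

DISCOUNT_RATE = 0.03  # reference-case discount rate from the recommendation

# Present value: discount each year's cost back to year 0 to account
# for the time value of money.
present_value = sum(cost / (1 + DISCOUNT_RATE) ** year
                    for year, cost in enumerate(annual_costs))

n_students = 500  # hypothetical number of students served
cost_per_student = present_value / n_students

print(f"Present value of costs: ${present_value:,.0f}")
print(f"Cost per student: ${cost_per_student:,.2f}")
```

Cost-per-student figures like this are one example of the decisionmaker-friendly cost metrics the standards call for.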
Resources
- Download the IES Cost Analysis Starter Kit, version 1.0 (released April 1, 2020).
- Visit the IES-funded Cost Analysis in Practice (CAP) Project, its Resource Page, and the Online Modules.
- Read the American Institutes for Research's Standards for the Economic Evaluation of Education and Social Programs
- Visit the IES-funded E$timator tool, hosted by Teachers College, Columbia University.
- Read the REL Publication The Critical Importance of Costs for Education Decisions, by Hollands and Levin (2017).
- Read IES blogs related to cost analysis.
What We're Reading
Hollands, F. M., Kieffer, M. J., Shand, R., Pan, Y., Cheng, H., & Levin, H. M. (2016). Cost-effectiveness analysis of early reading programs: A demonstration with recommendations for future research. Journal of Research on Educational Effectiveness, 9, 30–53. https://doi.org/10.1080/19345747.2015.1055639
Levin, H.M., & Belfield, C. (2015). Guiding the development and use of cost-effectiveness analysis in education. Journal of Research on Educational Effectiveness, 8, 400–418. https://doi.org/10.1080/19345747.2014.915604
Use high-quality outcome measures
SEER Standards
- Researchers must clearly define the outcome constructs of interest to their study, selecting measures that are valid and reliable assessments of those constructs and are otherwise psychometrically sound.
- Researchers must use outcome measures that are appropriate to the context of the intervention and the research participants.
- Researchers must avoid outcome measures that are overaligned to the intervention being studied. (For more information on overalignment, consult the What Works Clearinghouse Standards Handbook.)
- Researchers must include among their measures those that have practical significance to educators, parents, or other decision-makers, when such measures exist.
- Researchers must examine both the immediate impact of their intervention on outcomes of interest and its impact on relevant distal outcomes, including the potential for initial impacts to fade over time.
Recommendation
- When feasible, researchers should pose, and design studies capable of developing rigorous answers to, questions about the impact of policies, practices, and interventions for different groups and subgroups of learners.
Resources
- Read the NCEE Publication The BASIE (BAyeSian Interpretation of Estimates) Framework for Interpreting Findings from Impact Evaluations: A Practical Guide for Education Researchers, by Deke, Finucane, and Thal (2022).
- View the archived webinar on The BASIE (BAyeSian Interpretation of Estimates) Framework for Interpreting Findings from Impact Evaluations.
- Visit EdInstruments, hosted by the Annenberg Institute at Brown University.
- Visit the Education Assessment Finder, hosted by The RAND Corporation.
- Visit the NIH Toolbox, hosted by Health Measures.
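The BASIE framework listed above rests on a normal-normal Bayesian update: a prior distribution over effect sizes (in practice drawn from a body of prior evidence, such as WWC study findings) is combined with a study's impact estimate and standard error. A minimal sketch with hypothetical numbers:

```python
from math import sqrt
from statistics import NormalDist

# Hypothetical prior on effect sizes, standing in for a synthesis of
# prior evidence in the topic area (illustrative values only).
prior_mean, prior_sd = 0.0, 0.10

# Hypothetical study result: estimated impact and its standard error.
estimate, std_error = 0.15, 0.08

# Normal-normal conjugate update: a precision-weighted average of the
# prior mean and the study estimate.
prior_prec = 1 / prior_sd**2
data_prec = 1 / std_error**2
post_var = 1 / (prior_prec + data_prec)
post_mean = post_var * (prior_prec * prior_mean + data_prec * estimate)

# Posterior probability that the true effect is positive.
p_positive = 1 - NormalDist(post_mean, sqrt(post_var)).cdf(0)

print(f"Posterior mean effect: {post_mean:.3f}")
print(f"P(effect > 0): {p_positive:.2f}")
```

Note how the posterior mean is pulled toward the prior: a single noisy estimate rarely warrants taking the point estimate at face value, which is the core interpretive shift BASIE encourages.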
What We're Reading
IES Director Mark Schneider's May 2020 blog "Making Common Measures More Common."
Facilitate generalization of study findings
SEER Standards
- Researchers must, through intentional sampling or other means, design and analyze studies to permit ready generalization of their findings to populations of interest.
- Researchers must document and report the baseline characteristics of their analytic sample.
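One common technique behind these standards is post-stratification: reweighting stratum-level results so the sample's composition matches a target population (Tipton and Olsen, cited below, cover more sophisticated propensity-score approaches). A sketch with hypothetical strata and numbers:

```python
# Hypothetical strata (illustrative shares and impacts, not real data):
# share of each stratum in the analytic sample vs. the target population.
sample_share = {"urban": 0.70, "rural": 0.30}
population_share = {"urban": 0.45, "rural": 0.55}

# Hypothetical estimated impact within each stratum (effect-size units).
stratum_impact = {"urban": 0.20, "rural": 0.05}

# Unweighted estimate: stratum impacts weighted by sample composition.
sample_ate = sum(sample_share[s] * stratum_impact[s] for s in sample_share)

# Post-stratified estimate: the same impacts weighted by population shares.
population_ate = sum(population_share[s] * stratum_impact[s]
                     for s in population_share)

print(f"Sample ATE: {sample_ate:.3f}")        # dominated by urban schools
print(f"Population ATE: {population_ate:.3f}")
```

Because the sample over-represents the stratum with the larger impact, the reweighted estimate is smaller, illustrating why documenting baseline sample characteristics matters for generalization.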
Resources
- Read the NCEE Publication Practical Strategies for Recruiting Districts and Schools for Education Impact Studies, by Robinson, Rosenberg, Espinoza, and Zeidman (2023).
- Read the NCEE Publication Enhancing the Generalizability of Impact Studies in Education, by Tipton and Olsen (2022).
- View the archived webinar on Enhancing the Generalizability of Impact Studies in Education
- Visit The Generalizer, hosted by Teachers College, Columbia University.
What We're Reading
Stuart, E. A., Bradshaw, C. P., & Leaf, P. J. (2014). Assessing the generalizability of randomized trial results to target populations. Prevention Science, 16, 475–485. https://doi.org/10.1007/s11121-014-0513-z
Tipton, E. & Olsen, R. B. (2018). A review of statistical methods for generalizing from evaluations of educational interventions. Educational Researcher, 47, 516–524. https://doi.org/10.3102/0013189x18781522
Support scaling of promising interventions
SEER Standards
- Researchers must carry out their research in settings and with student populations that allow the findings to inform efforts to extend the reach of promising interventions.
Recommendations
- Researchers should explore factors associated with the intervention and its implementation that can inform the efficacy and sustainability of the intervention at scale, such as its affordability and feasibility.
- Researchers should develop materials that could support the replication and/or scaling of an intervention by others, such as manuals, toolkits, or implementation guides.
Resources
- Read From Research to Market: Development of a Transition Process to Integrate Sustainable Scaling Methodologies into Education Innovation Research Design and Development from SRI International.
What We're Reading
Coburn, C. E. (2003). Rethinking scale: Moving beyond numbers to deep and lasting change. Educational Researcher, 32, 3–12. https://doi.org/10.3102/0013189x032006003
Krainer, K., Zehetmeir, S., Hanfstingl, Rauch, F., & Tscheinig, T. (2019). Insights into scaling up a nationwide learning and teaching initiative on various levels. Educational Studies in Mathematics, 102, 395–415. https://doi.org/10.1007/s10649-018-9826-3
McDonald, S., Keesler, V. A., Kauffman, N. J., & Schneider, B. (2006). Scaling-up exemplary interventions. Educational Researcher, 35, 15–24. https://doi.org/10.3102/0013189x035003015
Al-Ubaydli, O., List, J. A., & Suskind, D. (2019). The science of using science: Towards an understanding of the threats to scaling experiments. NBER Working Paper No. 25848. Retrieved from https://tmwcenter.uchicago.edu/wp-content/uploads/2017/10/The-Science-of-Using-Science-2019-Al-Ubaydli-List-Suskind.pdf