
IES Grant

Title: Developing Methodological Foundations for Replication Sciences
Center: NCER Year: 2019
Principal Investigator: Wong, Vivian Awardee: University of Virginia
Program: Statistical and Research Methodology in Education      [Program Details]
Award Period: 3 Years (09/01/19 – 08/31/22) Award Amount: $814,052
Type: Methodological Innovation Award Number: R305D190043
Description:

Co-Principal Investigator(s): Steiner, Peter

Purpose: Using the Causal Replication Framework, the project team developed tools and methods to improve the design, implementation, and analysis of replication studies. The project team focused on three aims: (1) develop and improve research designs for evaluating the replicability of causal results, (2) create planning tools and diagnostic measures to improve the implementation of replication designs, and (3) build analytic tools for evaluating replication success from well-implemented designs.

Project Activities: To achieve its aims, the project team –

  • Demonstrated how multiple research designs for replication can be combined to identify systematic sources of effect variation
  • Examined methods for assessing replication design assumptions and developed approaches (such as balance tables) that researchers can use to demonstrate the extent to which their studies meet those assumptions, including treatment stability across studies replicating a treatment
  • Compared the performance of the most commonly used statistical methods for assessing replication success across different contexts and introduced a new measure of replication success, the correspondence test
  • Provided closed-form power calculations that researchers can use to determine the minimum detectable effect size (and required sample sizes) for each study when planning prospective replication efforts
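To illustrate the kind of closed-form power calculation described above, the sketch below uses the standard normal-approximation formula for a two-arm trial and for the difference between two study effects. The function names and defaults are illustrative, not the project's published formulas:

```python
from statistics import NormalDist

def mdes_two_arm(n_treat, n_ctrl, alpha=0.05, power=0.80):
    """Approximate minimum detectable standardized effect size for a
    single two-arm study, via the usual normal-approximation formula."""
    z = NormalDist().inv_cdf
    se = (1.0 / n_treat + 1.0 / n_ctrl) ** 0.5  # approx. SE of a standardized mean difference
    return (z(1 - alpha / 2) + z(power)) * se

def mdes_replication_difference(se1, se2, alpha=0.05, power=0.80):
    """Approximate MDES for the *difference* between two study effects.
    The SE of the difference combines both studies' SEs, so detecting
    effect variation across replications requires substantially larger
    samples than detecting either effect alone."""
    z = NormalDist().inv_cdf
    se_diff = (se1 ** 2 + se2 ** 2) ** 0.5
    return (z(1 - alpha / 2) + z(power)) * se_diff
```

With 100 participants per arm, a single study can detect a standardized effect of roughly 0.4, but the detectable difference between two such studies is about 40 percent larger, which is one reason many replication efforts are underpowered.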

Key Outcomes: Key findings from this project include the following:

  • Many existing replication efforts yield uninterpretable results, because multiple replication assumptions are violated simultaneously and most replication efforts are underpowered for detecting replication success (Steiner, Wong, & Anglin, 2019; Wong, Anglin, & Steiner, 2021; Steiner, Sheehan, & Wong, 2023).
  • Systematic replication designs can be used to identify sources of effect heterogeneity (Steiner, Wong, & Anglin, 2019; Wong, Anglin, & Steiner, 2021). These approaches are ethical, feasible, and desirable in field settings.
  • Evaluating replication success requires well-defined metrics that are specified in advance through pre-registration; different metrics have different statistical properties and power requirements for achieving replication success. The correspondence test provides a unified framework for statistically testing both the difference and the equivalence of results. Most existing replication efforts are underpowered for detecting the replicability of results (Steiner, Sheehan, & Wong, 2023).
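One way to sketch the logic of combining a difference test with an equivalence test, as the correspondence test does, is a TOST-style approximation on the gap between two study effects. This is an illustrative sketch with hypothetical names and an assumed equivalence margin, not the published procedure:

```python
from statistics import NormalDist

def correspondence_sketch(est1, se1, est2, se2, margin, alpha=0.05):
    """Classify the gap between two effect estimates by running a
    two-sided difference test and a TOST equivalence test together."""
    norm = NormalDist()
    diff = est1 - est2
    se = (se1 ** 2 + se2 ** 2) ** 0.5
    # Difference test: is the gap distinguishable from zero?
    p_diff = 2 * (1 - norm.cdf(abs(diff / se)))
    differ = p_diff < alpha
    # TOST equivalence test: is the gap demonstrably inside (-margin, +margin)?
    p_lower = 1 - norm.cdf((diff + margin) / se)  # H0: diff <= -margin
    p_upper = norm.cdf((diff - margin) / se)      # H0: diff >= +margin
    equivalent = max(p_lower, p_upper) < alpha
    if equivalent and not differ:
        return "equivalent"
    if differ and not equivalent:
        return "different"
    if differ and equivalent:
        return "trivially different"  # significant, yet within the margin
    return "indeterminate"            # neither test rejects: underpowered
```

The "indeterminate" outcome makes the power problem concrete: with wide standard errors, neither the difference nor the equivalence test can reject, so the replication result is uninterpretable.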

Structured Abstract

Statistical/Methodological Product: The project team introduced methodological theory (the Causal Replication Framework); case study examples; code for designing, implementing, and analyzing systematic replication designs; and metrics for assessing replication success (including the correspondence test).

Development/Refinement Process: The project team refined the replication designs through case studies in which systematic replication studies were designed, conducted, and analyzed, and through simulation studies that examined the statistical properties of different metrics for assessing replication success.

User Testing: Substantive researchers in the fields of teacher preparation, reading, and special education collaborated with the project team to design the studies used in the replication efforts, to carry out the studies, and to interpret the results.

Related IES Projects: Iterative Replication of Read Well in First Grade (R324R200014), Developing Infrastructure and Procedures for the Special Education Research Accelerator (R324U190001), Special Education Research Accelerator Phase 2: Identifying Generalization Boundaries (R324U230001), Integrated Replication Designs for Identifying Generalizability Boundaries of Causal Effects (R305D220034)

Products and Publications

ERIC Citations: Find available citations for this award in ERIC.

Project Website: https://www.edreplication.org/


Select Publications:

Anglin, K. L., Wong, V. C., & Boguslav, A. (2021). A natural language processing approach to measuring treatment adherence and consistency using semantic similarity. AERA Open, 7, 23328584211028615.

Boguslav, A., & Cohen, J. (2024). Different Methods for Assessing Preservice Teachers' Instruction: Why Measures Matter. Journal of Teacher Education, 75(2), 168–185.

Cohen, J., Wong, V. C., Krishnamachari, A., & Erickson, S. (2024). Experimental evidence on the robustness of coaching supports in teacher education. Educational Researcher, 53(1), 19–35.

Steiner, P. M., Wong, V. C., & Anglin, K. (2019). A causal replication framework for designing and assessing replication efforts. Zeitschrift für Psychologie, 226(3).

Steiner, P. M., Sheehan, P., & Wong, V. C. (2023). Correspondence measures for assessing replication success. Psychological Methods. Advance online publication.

Wong, V. C., Anglin, K., & Steiner, P. M. (2022). Design-based approaches to causal replication studies. Prevention Science, 23(5), 723–738.

