Information on IES-Funded Research
Grant Closed

Developing Methodological Foundations for Replication Sciences

NCER
Program: Statistical and Research Methodology in Education
Program topic(s): Core
Award amount: $814,052
Principal investigator: Vivian Wong
Awardee: University of Virginia
Year: 2019
Project type: Methodological Innovation
Award number: R305D190043

Purpose

Using the Causal Replication Framework, the project team developed tools and methods to improve the design, implementation, and analysis of replication studies. The project team focused on three aims: (1) develop and improve research designs for evaluating the replicability of causal results, (2) create planning tools and diagnostic measures to improve the implementation of replication designs, and (3) build analytic tools for evaluating replication success from well-implemented designs.

Project Activities

To achieve its aims, the project team:

  • Demonstrated how multiple research designs for replication can be combined to identify systematic sources of effect variation
  • Examined methods for assessing replication design assumptions and developed approaches researchers can use to demonstrate the extent to which their studies meet those assumptions (for example, through balance tables; see the first sketch after this list) and maintain treatment stability across studies replicating a treatment
  • Examined the most commonly used statistical methods for assessing replication success, compared their performance in different contexts, and introduced a new measure of replication success (the correspondence test)
  • Provided closed-form power calculations that researchers can use to determine the minimum detectable effect size (and required sample sizes) for each study when planning prospective replication efforts (see the second sketch after this list)
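On the balance-table point above, here is a minimal sketch of the underlying diagnostic, assuming hypothetical data and a conventional 0.25 threshold (neither is the project's actual specification): for each baseline covariate, the two studies' samples are compared with a standardized mean difference (SMD), and large imbalances are flagged.

    import numpy as np
    import pandas as pd

    def standardized_mean_diff(x1, x2):
        """SMD: difference in means divided by the pooled standard deviation."""
        pooled_sd = np.sqrt((x1.var(ddof=1) + x2.var(ddof=1)) / 2)
        return (x1.mean() - x2.mean()) / pooled_sd

    def balance_table(df, covariates, study_col="study"):
        """One row per covariate; |SMD| > 0.25 (a common rule of thumb,
        assumed here) flags a potentially violated design assumption."""
        s1, s2 = df[df[study_col] == 1], df[df[study_col] == 2]
        rows = []
        for c in covariates:
            smd = standardized_mean_diff(s1[c], s2[c])
            rows.append({"covariate": c, "smd": round(smd, 3), "flag": abs(smd) > 0.25})
        return pd.DataFrame(rows)

    # Hypothetical data: two replication sites, two baseline covariates.
    rng = np.random.default_rng(0)
    df = pd.DataFrame({
        "study": np.repeat([1, 2], 200),
        "pretest": np.concatenate([rng.normal(0.0, 1, 200), rng.normal(0.1, 1, 200)]),
        "age": np.concatenate([rng.normal(9.0, 1, 200), rng.normal(9.5, 1, 200)]),
    })
    print(balance_table(df, ["pretest", "age"]))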
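On the power-calculation point, the sketch below uses the textbook normal-approximation formula, MDES = (z_{1-alpha/2} + z_{power}) * SE; it illustrates why contrasts between two studies demand larger samples than single-study impact estimates, but it is not necessarily the exact closed-form result derived in the project.

    from scipy.stats import norm

    def se_impact(n, p_treat=0.5):
        """SE of a two-arm impact estimate in effect-size (SD) units."""
        return (1.0 / (p_treat * (1.0 - p_treat) * n)) ** 0.5

    def mdes(se, alpha=0.05, power=0.80):
        """Normal-approximation MDES: (z_{1-alpha/2} + z_{power}) * SE."""
        return (norm.ppf(1.0 - alpha / 2.0) + norm.ppf(power)) * se

    # A single study with n = 400 (half treated) can detect about 0.28 SD:
    print(round(mdes(se_impact(400)), 2))

    # For the *difference* between two such studies' effects, the SEs add in
    # quadrature, so the detectable difference grows to about 0.40 SD. This is
    # one reason replication contrasts are so often underpowered.
    se_diff = (se_impact(400) ** 2 + se_impact(400) ** 2) ** 0.5
    print(round(mdes(se_diff), 2))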

Key outcomes

Key findings from this project include the following:

  • Many existing replication efforts yield uninterpretable results because multiple replication assumptions are violated simultaneously and because most replication efforts are underpowered for detecting replication success (Steiner, Wong, & Anglin, 2019; Wong, Anglin, & Steiner, 2022; Steiner, Sheehan, & Wong, 2023).
  • Systematic replication designs can be used to identify sources of effect heterogeneity, and these approaches are ethical, feasible, and desirable in field settings (Steiner, Wong, & Anglin, 2019; Wong, Anglin, & Steiner, 2022).
  • Evaluating replication success requires well-defined metrics that are specified in advance through preregistration; different metrics have different statistical properties and power requirements. The correspondence test unifies the statistical tests of difference and of equivalence in a single framework (a simplified sketch follows this list). Most existing replication efforts are underpowered for detecting the replicability of results (Steiner, Sheehan, & Wong, 2023).
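As a rough illustration of the correspondence test's logic, the sketch below combines a conventional difference test with a TOST-style equivalence test under normal theory and classifies a pair of effect estimates accordingly. The equivalence margin, alpha level, and labels are illustrative assumptions; the actual procedure is developed in Steiner, Sheehan, and Wong (2023).

    from scipy.stats import norm

    def correspondence_test(b1, se1, b2, se2, delta, alpha=0.05):
        """Difference test plus TOST equivalence test for two effect estimates;
        `delta` is the equivalence margin, in the same units as b1 and b2."""
        d = b1 - b2
        se_d = (se1 ** 2 + se2 ** 2) ** 0.5
        p_diff = 2.0 * (1.0 - norm.cdf(abs(d) / se_d))     # H0: d == 0
        p_equiv = max(1.0 - norm.cdf((d + delta) / se_d),  # H0: d <= -delta
                      norm.cdf((d - delta) / se_d))        # H0: d >= +delta
        differs, equivalent = p_diff < alpha, p_equiv < alpha
        if equivalent and not differs:
            return "replicated: results statistically equivalent"
        if differs and not equivalent:
            return "not replicated: results statistically different"
        if differs and equivalent:
            return "different, but within the equivalence margin"
        return "indeterminate: likely underpowered"

    # Hypothetical estimates: 0.25 vs 0.20 SD, SEs of 0.06, margin of 0.20 SD.
    print(correspondence_test(0.25, 0.06, 0.20, 0.06, delta=0.20))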

People and institutions involved

IES program contact(s)

Allen Ruby

Associate Commissioner for Policy and Systems
NCER

Products and publications

ERIC Citations: Available citations for this award can be found in ERIC.

Project Website: https://www.edreplication.org/

Additional Online Resources and Information:

  • Scurlock, A., McCharen, K., Hasan, L., & Xu, C. (2021). TranscriptSim: Automated NLP document similarity (Version 1.1). https://github.com/congxinxu0116/TranscriptSim (a generic sketch of the underlying similarity scoring follows this list)
  • Machita, J., Rohrich, T., Jiang, Y., & Zhen, Y. (2021). Designing a replicable data infrastructure for education research. https://github.com/taylorrohrich/MSDS_SERA_capstone
  • Erickson, S. (2023). Replication Code for Experimental Evidence on Robustness of Coaching Supports in Teacher Education. https://github.com/steffenerickson/collab_rep_lab/tree/a3a9112cdf3e6b7e9cfbd419150a29b3857b8963/nsf_v2
  • Erickson, S. (2023). Correspondence Test in STATA. https://github.com/steffenerickson/correspondence_test
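The TranscriptSim entry above concerns scoring how closely delivered sessions track an intended protocol (see Anglin, Wong, & Boguslav, 2021, below). As a generic illustration of that semantic-similarity idea, and not of TranscriptSim's actual interface, a TF-IDF cosine-similarity score might look like this:

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    # Hypothetical texts: an intended protocol script and two session transcripts.
    script = "introduce the lesson goal, model the strategy, then guide practice"
    sessions = [
        "the teacher introduced the lesson goal and modeled the strategy in guided practice",
        "the class watched a video and discussed an unrelated topic",
    ]

    vectors = TfidfVectorizer().fit_transform([script] + sessions)
    # Cosine similarity of each transcript to the script; higher = closer adherence.
    print(cosine_similarity(vectors[0], vectors[1:]).ravel().round(2))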

Select Publications:

Anglin, K. L., Wong, V. C., & Boguslav, A. (2021). A natural language processing approach to measuring treatment adherence and consistency using semantic similarity. AERA Open, 7, Article 23328584211028615.

Boguslav, A., & Cohen, J. (2024). Different Methods for Assessing Preservice Teachers' Instruction: Why Measures Matter. Journal of Teacher Education, 75(2), 168-185.

Cohen, J., Wong, V. C., Krishnamachari, A., & Erickson, S. (2024). Experimental evidence on the robustness of coaching supports in teacher education. Educational Researcher, 53(1), 19-35.

Steiner, P. M., Wong, V. C., & Anglin, K. (2019). A causal replication framework for designing and assessing replication efforts. Zeitschrift für Psychologie, 226(3).

Steiner, P. M., Sheehan, P., & Wong, V. C. (2023). Correspondence measures for assessing replication success. Psychological Methods. Advance online publication.

Wong, V. C., Anglin, K., & Steiner, P. M. (2022). Design-based approaches to causal replication studies. Prevention Science, 23(5), 723-738.

Related projects

Iterative Replication of Read Well in First Grade

R324R200014

Developing Infrastructure and Procedures for the Special Education Research Accelerator

R324U190001

Special Education Research Accelerator Phase 2: Identifying Generalization Boundaries

R324U230001

Integrated Replication Designs for Identifying Generalizability Boundaries of Causal Effects

R305D220034

Supplemental information

Co-Principal Investigator(s): Steiner, Peter


Statistical/Methodological Product: The project team produced methodological theory (the Causal Replication Framework); case study examples; code for designing, implementing, and analyzing systematic replication designs; and metrics for assessing replication success (including the correspondence test).

Development/Refinement Process: The project team refined the replication designs primarily through case studies, in which they designed, conducted, and analyzed systematic replication studies, and through simulation studies that examined the statistical properties of different metrics for assessing replication success.

User Testing: Substantive researchers in teacher preparation, reading, and special education collaborated with the project team to design the studies used in the replication efforts, to carry out those studies, and to interpret the results.

Questions about this project?

For additional questions about this project or to provide feedback, please contact the program officer.

Tags

Mathematics, Data and Assessments
