Title: Developing Methodological Foundations for Replication Sciences
Principal Investigator: Wong, Vivian
Awardee: University of Virginia
Program: Statistical and Research Methodology in Education
Award Period: 3 years (09/01/2019–08/31/2022)
Award Amount: $814,052
Type: Methodological Innovation
Award Number: R305D190043
Co-Principal Investigator: Steiner, Peter M.
Purpose: This project will develop methodological foundations for a "replication science." The research team will examine replication through a Causal Replication Framework, which formalizes the conditions under which replication success can be expected. Using potential outcomes notation, "replication" is defined as a research design that tests whether two (or more) studies produce the same causal effect, within the limits of sampling error. An important implication of the framework is that both prospective and post-hoc research designs may be used to assess the replicability of effects and to evaluate potential sources of treatment effect heterogeneity when study results do not replicate.
Project Activities: Using the framework, the project will develop easy-to-use tools and methods that improve the design, implementation, and analysis of replication studies. The proposed methodological tools are grounded in two ongoing applications of replication studies. The first is a series of prospectively planned, highly controlled replication studies conducted with first-year teacher preparation candidates at the Curry School of Education; the second is a post-hoc replication of an RCT field trial offering full- and half-day preschool to low-income, mostly Hispanic families in Westminster, Colorado, and Pomona, California. The researchers will assess replication success using measures of teacher candidate performance in a simulated classroom environment and measures of children's cognitive and social/emotional development in preschool settings. Data from the two replication studies will be used to demonstrate various research design approaches to replication, the feasibility of the proposed methods in real-world applications, and ways researchers may address plausible threats to replication design assumptions when they occur in field settings. The project will also demonstrate methods for assessing replication success using correspondence tests.
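Correspondence tests of the kind mentioned above typically combine a conventional difference test with an equivalence test, so that two effects can be classified as statistically equivalent, statistically different, or indeterminate. The following is a minimal illustrative sketch using standard two one-sided tests (TOST) logic; it is not the project's own correspondence measures, and the function name `correspondence_test` and the equivalence margin `delta` are hypothetical choices for illustration.

```python
import math

def _phi(z):
    # Standard normal CDF via the error function (stdlib only).
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def correspondence_test(b1, se1, b2, se2, delta, alpha=0.05):
    """Classify the correspondence between two independent effect
    estimates (b1, b2) with standard errors (se1, se2), using a
    two-sided difference test plus a TOST equivalence test with
    margin `delta`. All names here are illustrative."""
    diff = b1 - b2
    se_diff = math.sqrt(se1 ** 2 + se2 ** 2)
    # Two-sided z-test of H0: the two effects are equal.
    p_diff = 2.0 * (1.0 - _phi(abs(diff) / se_diff))
    # TOST: reject H0: |diff| >= delta only if BOTH one-sided tests reject.
    p_lower = 1.0 - _phi((diff + delta) / se_diff)  # H0: diff <= -delta
    p_upper = _phi((diff - delta) / se_diff)        # H0: diff >= +delta
    p_tost = max(p_lower, p_upper)
    equivalent = p_tost < alpha
    different = p_diff < alpha
    if equivalent and not different:
        return "equivalent"
    if different and not equivalent:
        return "different"
    if equivalent and different:
        return "trivially different"  # significant, but within the margin
    return "indeterminate"            # underpowered to decide either way
```

For example, two effects of 0.30 and 0.28 with standard errors of 0.05 and an equivalence margin of 0.20 would be classified as "equivalent," while effects of 0.50 versus 0.00 with the same precision would be classified as "different."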
Results from the methodological work will be used to build accessible products that will be widely disseminated to applied researchers. The products include: an online interactive research protocol for planning and pre-registering replication studies; example diagnostic measures for evaluating the extent to which replication design assumptions are met in field settings; and easy-to-use tools in Excel, R, and R Shiny for calculating statistical power in replication designs and for assessing replication success.
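For intuition about the planned power tools, the power of a two-sided z-test for a difference between two independent study effects can be approximated in a few lines. This is an illustrative stdlib-Python sketch, not the project's Excel/R tools; `replication_power` and its arguments are hypothetical names.

```python
from statistics import NormalDist

_N = NormalDist()  # standard normal distribution

def replication_power(true_diff, se1, se2, alpha=0.05):
    """Approximate power of a two-sided z-test for a difference
    `true_diff` between two independent effect estimates with
    standard errors se1 and se2 (illustrative sketch)."""
    se_diff = (se1 ** 2 + se2 ** 2) ** 0.5
    z_crit = _N.inv_cdf(1 - alpha / 2)   # two-sided critical value
    ncp = true_diff / se_diff            # noncentrality of the z statistic
    # Probability the test statistic falls outside +/- z_crit.
    return (1 - _N.cdf(z_crit - ncp)) + _N.cdf(-z_crit - ncp)
```

When the true difference is zero, the function returns the nominal alpha level (the false-rejection rate), and power rises toward 1 as the true difference grows relative to the combined standard error.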
Related IES projects: Developing Infrastructure and Procedures for the Special Education Research Accelerator (R324U190001); Integrated Replication Designs for Identifying Generalizability Boundaries of Causal Effects (R305D220034)
Publications and Products
Anglin, K. L., & Wong, V. C. (2020). Using semantic similarity to assess adherence and replicability of intervention delivery (EdWorkingPaper No. 20-312). Annenberg Institute at Brown University. https://doi.org/10.26300/n5qj-7310
Anglin, K. L., Wong, V. C., & Boguslav, A. (2021). A natural language processing approach to measuring treatment adherence and consistency using semantic similarity. AERA Open, 7, 23328584211028615.
Anglin, K. L., Wong, V. C., Wing, C., Miller-Bains, K., & McConeghy, K. (2023). The validity of causal claims with repeated measures designs: A within-study comparison evaluation of differences-in-differences and the comparative interrupted time series. Evaluation Review, 0193841X231167672.
Cohen, J., Krishnamachari, A., & Wong, V. C. (2021). Experimental evidence on the robustness of coaching supports in teacher education (EdWorkingPaper No. 21-468). Annenberg Institute at Brown University. https://doi.org/10.26300/dgf9-ca95
Steiner, P. M., Sheehan, P., & Wong, V. C. (2023). Correspondence measures for assessing replication success. Psychological Methods.
Steiner, P. M., & Wong, V. C. (2019, December). Ways Out of the Replication Crisis. In ZPID-Kolloquium 2019, Trier, Germany. ZPID (Leibniz Institute for Psychology Information).
Steiner, P. M., Wong, V. C., & Anglin, K. (2019). A causal replication framework for designing and assessing replication efforts. Zeitschrift für Psychologie, 227(4), 280–292.
Wong, V. C., Anglin, K., & Steiner, P. M. (2021). Design-based approaches to causal replication studies. Prevention Science, 1–16.