Information on IES-Funded Research
Grant Closed

Assessing Generalizability and Variability of Single-Case Design Effect Sizes Using Multilevel Modeling Including Moderators

NCER
Program: Statistical and Research Methodology in Education
Program topic(s): Early Career
Award amount: $224,997
Principal investigator: Mariola Moeyaert
Awardee:
State University of New York (SUNY), Albany
Year: 2019
Project type:
Methodological Innovation
Award number: R305D190022

Purpose

The increasing number of published single-case experimental design (SCED) studies in the education sciences can be used to inform policy, research, and practice decisions. Multilevel meta-analysis is one technique for summarizing the SCED literature in a standardized, objective, reliable, and valid manner. A three-level meta-analysis is recommended to address the hierarchical structure of SCED data, with effect sizes clustered within cases and cases clustered within studies.
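Under this three-level structure, each observed effect size can be thought of as an overall effect plus a study-level deviation, a case-level deviation, and sampling error. The following Python sketch is illustrative only: the variance components and sample sizes are assumed values, not estimates from this project. It simulates effect sizes with that nesting and recovers the overall effect with a naive pooled mean:

```python
import random
import statistics

random.seed(42)

# Assumed (hypothetical) parameters for the three-level structure:
# overall effect GAMMA, between-study SD TAU_STUDY,
# between-case SD TAU_CASE, within-case sampling SD SIGMA.
GAMMA, TAU_STUDY, TAU_CASE, SIGMA = 0.6, 0.2, 0.15, 0.1

def simulate_effect_sizes(n_studies=30, cases_per_study=4, es_per_case=3):
    """Generate effect sizes nested in cases nested in studies."""
    data = []
    for _ in range(n_studies):
        u_study = random.gauss(0, TAU_STUDY)        # study-level deviation
        for _ in range(cases_per_study):
            u_case = random.gauss(0, TAU_CASE)      # case-level deviation
            for _ in range(es_per_case):
                e = random.gauss(0, SIGMA)          # sampling error
                data.append(GAMMA + u_study + u_case + e)
    return data

effects = simulate_effect_sizes()
pooled = statistics.mean(effects)   # naive pooled estimate of GAMMA
```

A full three-level meta-analysis would additionally estimate the study- and case-level variance components (and, with moderators, explain them), but the nesting generated here is the structure the model is built around.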

Project Activities

This project built on two previous IES grants that focused on empirically validating the multilevel framework for synthesizing SCED effect sizes and provided evidence that the multilevel approach is a promising way to estimate treatment effects across SCED studies. This project focused on practical questions about how to design a multilevel model that is powerful enough to generalize SCED treatment effects and to explain variability in treatment effects through moderators. The research team empirically validated the multilevel meta-analytic model with moderators that explain variability among effect sizes at the case and study levels. The team also developed and empirically validated power calculations for detecting meaningful moderator effects by conducting large-scale Monte Carlo simulation studies, and it provided guidance for study design.

People and institutions involved

IES program contact(s)

Allen Ruby

Associate Commissioner for Policy and Systems
NCER

Products and publications

ERIC Citations: Available citations for this award can be found in ERIC.

Additional online resources and information:

  • Cacciotti, D., Moeyaert, M., Ferron, J., & Bursali, S. (2021). Single Case Design MVA (Version 1.0) [Mobile application]. https://singlecasemva.app/
  • *Fingerhut, J., Marbou, K., & Moeyaert, M. (2020). Single-Case Metric Ranking Tool (Version 1.2) [Microsoft Excel tool]. https://www.doi.org/10.17605/OSF.IO/7USBJ
  • *Xu, X., & Moeyaert, M. (2021). PowerSCED (Version 1.0) [RStudio Shiny Tool]. https://okq9ht-mariolamoeyaert.shinyapps.io/PowerSCED/

Select Publications:

Declercq, L., Jamshidi, L., Fernández Castilla, B., Moeyaert, M., Beretvas, S. N., Ferron, J. M., & Van den Noortgate, W. (2022). Multilevel meta-analysis of individual participant data of single-case experimental designs: One-stage versus two-stage methods. Multivariate Behavioral Research, 57(2-3), 298-317.

Epstein, L. H., Bickel, W. K., Czajkowski, S. M., Paluch, R. A., Moeyaert, M., & Davidson, K. W. (2021). Single case designs for early phase behavioral translational research in health psychology. Health Psychology, 40(12), 858-874.

Fingerhut, J., & Moeyaert, M. (2022). Selecting quantitative analysis techniques in single-case research through a user-friendly open-source tool. Frontiers in Education.

Fingerhut, J., & Moeyaert, M. (2022). Training individuals to implement discrete trials with fidelity: A meta-analysis. Focus on Autism and Other Developmental Disabilities, 37(4), 239-250.

Fingerhut, J., Moeyaert, M., Manolov, R., Xu, X., & Park, K. H. (2023). Systematic Review of Descriptions and Justifications Provided for Single-Case Quantification Techniques. Behavior Modification, 01454455231178469.

Fingerhut, J., Xu, X., & Moeyaert, M. (2021a). Selecting the proper Tau-U measure for single-case experimental designs: Development and application of a decision flowchart. Evidence-Based Communication Assessment and Intervention, 15(3), 99-114.

Fingerhut, J., Xu, X., & Moeyaert, M. (2021b). Impact of within-case variability on Tau-U indices and the hierarchical linear modeling approach for multiple-baseline design data: A Monte Carlo simulation study. Evidence-Based Communication Assessment and Intervention, 15(3), 115-141.

Jamshidi, L., Fernández-Castilla, B., Ferron, J. M., Moeyaert, M., Beretvas, S. N., & Van den Noortgate, W. (2020). Multilevel meta-analysis of multiple regression coefficients from single-case experimental studies. Behavior Research Methods, 52(5), 2008-2019.

Manolov, R., Moeyaert, M., & Fingerhut, J. (2021). A priori metric justification for the quantitative analysis of single-case experimental data. Perspectives on Behavior Science.

Miocevic, M., Klaassen, F., Geuke, G., Moeyaert, M., & Maric, M. (2020). Using Bayesian methods to test mediators of intervention outcomes in single case experimental designs. Evidence-Based Communication Assessment and Intervention, 14(1-2), 52-68. https://doi.org/10.1080/17489539.2020.1732029

Miocevic, M., Klaassen, F., Moeyaert, M., & Geuke, G. G. (2023). Optimal Practices for Mediation Analysis in AB Single Case Experimental Designs. The Journal of Experimental Education, 1-18.

Moeyaert, M. (2022). Quantitative synthesis of personalized trials studies: Meta-analysis of aggregated data versus individual patient data. Harvard Data Science Review, (Special Issue 3).

Moeyaert, M., Akhmedjanova, D., Ferron, J., Beretvas, S. N., & Noortgate, W. V. den. (2020). Effect size estimation for combined single-case experimental designs. Evidence Based Communication Assessment and Intervention, 14(1-2), 28-51.

Moeyaert, M., Bursali, S., & Ferron, J. (2021). SCD-MVA: A mobile application for conducting single-case experimental design research during the pandemic. Special issue "COVID-19 and Human Behavior with Emerging Technologies". Human Behavior and Emerging Technologies, 3(1), 75-96.

Moeyaert, M., & Dehghan-Chaleshtori, M. (2023). Evaluating the design and evidence of single-case experimental designs using the What Works Clearinghouse standards. Evidence-Based Communication Assessment and Intervention, 17(1), 78-96.

Moeyaert, M., & Yang, P. (2021). Assessing generalizability and variability of single-case design effect sizes using two-stage multilevel modeling including moderators. Behaviormetrika, 48, 207-229.

Moeyaert, M., Yang, P., & Xu, X. (2022). The power to explain variability in intervention effectiveness in single-case research using hierarchical linear modeling. Special issue. Perspectives on Behavior Science, 45(1), 13-35.

Moeyaert, M., Yang, P., Xu, X., & Kim, E. (2020). Characteristics of moderators in meta-analyses of single-case experimental design studies: A systematic review. Behavior Modification.

Virues-Ortega, J., Moeyaert, M., Sivaraman, M., & Fernández Castilla, B. (2021). Quantifying outcomes in applied behavior analysis. In Handbook of Applied Behavior Analysis: Integrating Research into Practice. New York: Springer.

Related projects

Multilevel Synthesis of Single-Case Experimental Data: Further Developments and Empirical Validation

R305D110024

Multilevel Modeling of Single-subject Experimental Data: Handling Data and Design Complexities

R305D150007

Supplemental information

  • The general multilevel approach without moderators was validated, and a comparison was made between the multilevel model combining raw single-case data and the multilevel meta-analytic model combining single-case effect sizes (Declercq et al., 2022; Jamshidi et al., 2020; Moeyaert, 2022; Virues-Ortega et al., 2021).
  • The most appropriate multilevel meta-analytic framework for synthesizing SCED effect sizes with moderators was identified, both conceptually and through an empirical illustration; the framework can include participant characteristics at the second level of the model and study-level characteristics at the third level (Moeyaert & Yang, 2021).
  • Two systematic reviews were completed to characterize the moderators used in meta-analyses of SCEDs (Moeyaert, Yang, Xu, & Kim, 2020).
  • An investigation of the power of a two-level HLM model to estimate intervention and moderator effects found that (1) larger true effects and larger numbers of participants result in higher power, and (2) the more moderators added to the model, the more participants are needed to detect the effects with sufficient power. For example, when a model includes three moderators, at least 20 participants are required to capture the intervention and moderator effects with sufficient power, whereas when one moderator is included, seven participants are sufficient (Moeyaert, Yang, & Xu, 2022).
  • The researchers developed a Shiny App (called PowerSCED) to assist applied SCED researchers in conducting an a priori power analysis for the design of their single-case study (Xu & Moeyaert, 2022).
  • Regression-based effect-size estimation for combined single-case experimental designs was examined (Moeyaert, Ferron, Beretvas, & Noortgate, 2020).
  • The use of Bayesian piecewise regression analysis for mediation analysis in SCEDs was illustrated (Miocevic, Klaassen, Geuke, Moeyaert, & Maric, 2020).
  • Guidelines for metric selection were released (Manolov, Moeyaert, Fingerhut, 2021).
  • A point and click tool for selecting quantitative analysis techniques in single-case research was released (Fingerhut & Moeyaert, 2022).
  • A mobile application, SCD-MVA, was released to assist with SCED design, data gathering, data analysis, and remote collaboration (Moeyaert, Bursali, & Ferron, 2021).
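The power findings above can be illustrated with a small Monte Carlo sketch. This is not the project's PowerSCED code: it is a deliberately simplified two-group moderator comparison with assumed effect and variance values, showing only the general mechanism the project's simulations rely on (simulate data under known parameters, test the moderator effect, count rejections):

```python
import random
import statistics

random.seed(1)

def power_for_moderator(n_participants, true_mod=0.3, tau=0.25,
                        n_reps=500, alpha_z=1.96):
    """Monte Carlo power: share of replications in which a simple
    two-group comparison of case effect sizes on a binary moderator
    rejects the null of no moderator effect (approximate z test)."""
    hits = 0
    for _ in range(n_reps):
        xs = [i % 2 for i in range(n_participants)]        # binary moderator
        ys = [0.5 + true_mod * x + random.gauss(0, tau) for x in xs]
        g0 = [y for x, y in zip(xs, ys) if x == 0]
        g1 = [y for x, y in zip(xs, ys) if x == 1]
        diff = statistics.mean(g1) - statistics.mean(g0)
        se = (statistics.variance(g0) / len(g0)
              + statistics.variance(g1) / len(g1)) ** 0.5
        if abs(diff / se) > alpha_z:
            hits += 1
    return hits / n_reps

# Power grows with the number of participants, as reported above.
low, high = power_for_moderator(8), power_for_moderator(40)
```

Running the sketch with more moderators (a multiple-regression test rather than a two-group comparison) would show the second finding as well: each added moderator raises the number of participants needed to keep power at the same level.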

Questions about this project?

For additional questions about this project or to provide feedback, please contact the program officer.

 

Tags

Mathematics

