Information on IES-Funded Research
Grant status: Open

Innovative, Translational, and User-Friendly Tools for Comprehensive Statistical Model Evaluation

Center: NCER
Program: Statistical and Research Methodology in Education
Program topic(s): Core
Award amount: $900,000
Principal investigator: Wesley Bonifay
Awardee: University of Missouri, Columbia
Year: 2021
Project type: Methodological Innovation
Award number: R305D210032

Purpose

A key component of any scientific undertaking is the construction of a model that explains the data. No model is an exact representation of the phenomena under investigation, and, especially in the education sciences, useful models are often simplified approximations of immensely complex processes. Goodness-of-fit (GOF) assessment therefore provides only a limited view of a model's usefulness: GOF addresses how closely the model reproduces the observed data, whereas generalizability measures the potential fit of the model to unseen data samples that have been or will be generated by the same underlying processes that produced the observed data. To make model-based inferences more informative, defensible, and replicable, applied education researchers should contextualize GOF by also quantifying the generalizability and complexity of their models.
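As a purely illustrative sketch (not project code), the following base R example contrasts the three perspectives: a highly flexible model achieves better in-sample fit than a parsimonious one, while a complexity-penalized information criterion and prediction error on a fresh sample typically favor the simpler, more generalizable model.

```r
# Illustrative sketch only (not project code): in-sample GOF vs. generalizability.
set.seed(1)
n <- 100
x <- runif(n)
y <- 2 * x + rnorm(n, sd = 0.5)            # true data-generating process is linear

simple  <- lm(y ~ x)                        # parsimonious model
complex <- lm(y ~ poly(x, 9))               # highly flexible model

# In-sample GOF: the flexible model always fits the observed sample at least as well.
c(simple = summary(simple)$r.squared, complex = summary(complex)$r.squared)

# Complexity-penalized (information-theoretic) comparison.
AIC(simple, complex)

# Generalizability: prediction error on a new sample from the same process.
x_new <- runif(n); y_new <- 2 * x_new + rnorm(n, sd = 0.5)
rmse <- function(fit) sqrt(mean((y_new - predict(fit, data.frame(x = x_new)))^2))
c(simple = rmse(simple), complex = rmse(complex))
```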

Project Activities

The purpose of this project is twofold: first, to develop the theoretical basis for an eventual software package by combining Bayesian, classical, and information-theoretic perspectives into a unified approach to statistical modeling; and second, after extensive Monte Carlo simulations to develop and test the unified approach, to build a user-friendly R package and Shiny user interface for CoSME (comprehensive statistical model evaluation).
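The specific generating models and evaluation indices for those simulations are not detailed here; as a generic, hypothetical illustration of a Monte Carlo design in R, the sketch below repeatedly draws data from a known process and records how often a complexity-penalized criterion selects the correctly specified model over a misspecified alternative.

```r
# Generic Monte Carlo sketch (not the project's simulation code).
run_once <- function(n = 200) {
  x <- rnorm(n)
  y <- 0.5 * x + 0.3 * x^2 + rnorm(n)     # true process is quadratic
  m_lin  <- lm(y ~ x)                      # misspecified candidate
  m_quad <- lm(y ~ x + I(x^2))             # correctly specified candidate
  AIC(m_quad) < AIC(m_lin)                 # TRUE if the criterion picks correctly
}

set.seed(42)
mean(replicate(1000, run_once()))          # proportion of correct selections
```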

Products and publications

Products: In addition to the CoSME R package and Shiny interface, products will include journal manuscripts, vignettes, and interactive tutorials, along with training workshops offered at conferences.

Publications:

Bonifay, W. (2022). Increasing generalizability via the principle of minimum description length. Behavioral and Brain Sciences, 45, Article e5.

Bonifay, W., & Depaoli, S. (2023). Model evaluation in the presence of categorical data: Bayesian model checking as an alternative to traditional methods. Prevention Science, 24(3), 467-479.

Bonifay, W., Winter, S. D., Skoblow, H. F., & Watts, A. L. (2024). Good fit is weak evidence of replication: Increasing rigor through prior predictive similarity checking. Assessment, 10731911241234118.

Davis-Stober, C. P., Dana, J., Kellen, D., McMullin, S. D., & Bonifay, W. (2024). Better accuracy for better science... through random conclusions. Perspectives on Psychological Science, 19(1), 223-243.

Watts, A. L., Greene, A. L., Bonifay, W., et al. (2024). A critical evaluation of the p-factor literature. Nature Reviews Psychology, 3, 108-122.

Supplemental information

Co-Principal Investigator: Li Cai

Questions about this project?

For additional questions about this project, or to provide feedback, please contact the program officer.

 
