IES Grant

Title: Bayesian Inference for Experimental and Observational Studies in Education
Center: NCER
Year: 2011
Principal Investigator: Kaplan, David
Awardee: University of Wisconsin, Madison
Program: Statistical and Research Methodology in Education
Award Period: 3 years
Award Amount: $566,397
Type: Methodological Innovation
Award Number: R305D110001
Description:

Purpose: The purpose of this project was to develop, apply, and disseminate Bayesian statistical tools for designs and analytic strategies used in empirical education research. These included: (1) randomized experimental designs; (2) quasi-experimental/observational designs; and (3) longitudinal studies. In addition, the project compared different types of prior information that may be encountered in practical circumstances and provided guidance on best practices in Bayesian statistical modeling for the education sciences. Finally, the project used or developed open-source software to carry out this work and made this software readily available to future researchers.

Project Activities: The project had three parts. In Part I, the researchers examined the utility of Bayesian inference for randomized experiments in education. From a Bayesian perspective, experiments are dynamic, involving decisions that often rest on prior knowledge gleaned from previous studies (e.g., meta-analyses) or on the subjective judgments of experts. The project reviewed a select set of randomized experiments in education and reanalyzed previous studies within a Bayesian framework. Special attention was paid to the use of Bayesian informative hypotheses, which allow for tests of directional hypotheses based on prior research and can mitigate the classical concern over Type I error control. Bayesian analysis of variance falls into the class of procedures that can be used to test informative hypotheses. A focused set of simulation studies was conducted to provide direct comparisons between Bayesian and classical approaches to issues of design and hypothesis testing. In addition, Bayesian analysis of variance was extended to cluster randomized designs. Existing data from a randomized field trial of a large-scale reform that provided professional development in Los Angeles and from an evaluation of the Success-for-All intervention were used to supplement the simulation studies.
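
As a rough illustration of the Bayesian reasoning described above (and not the project's own software), the following Python sketch contrasts a classical two-sample test with a conjugate normal update for a treatment effect and evaluates a directional (informative) hypothesis as a posterior probability. The simulated outcomes and the prior mean and standard deviation are hypothetical stand-ins for information that might be gleaned from earlier studies or a meta-analysis.

    # Minimal sketch: Bayesian vs. classical analysis of a two-arm experiment.
    # The prior on the treatment effect (mean 0.20, sd 0.10) is a hypothetical
    # stand-in for information from prior studies or a meta-analysis.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    treat = rng.normal(0.25, 1.0, 120)   # simulated treatment-group outcomes
    ctrl = rng.normal(0.00, 1.0, 120)    # simulated control-group outcomes

    # Classical two-sample t-test (two-sided).
    t, p = stats.ttest_ind(treat, ctrl)

    # Conjugate normal-normal update for the mean difference, treating the
    # sampling variance of the observed difference as known.
    diff = treat.mean() - ctrl.mean()
    se2 = treat.var(ddof=1) / len(treat) + ctrl.var(ddof=1) / len(ctrl)
    prior_mean, prior_var = 0.20, 0.10 ** 2
    post_var = 1.0 / (1.0 / prior_var + 1.0 / se2)
    post_mean = post_var * (prior_mean / prior_var + diff / se2)

    # The informative (directional) hypothesis "effect > 0" is evaluated as a
    # posterior probability rather than through a Type I error rate.
    p_positive = 1.0 - stats.norm.cdf(0.0, post_mean, np.sqrt(post_var))
    print(f"t = {t:.2f}, p = {p:.3f}")
    print(f"posterior mean = {post_mean:.3f}, P(effect > 0 | data) = {p_positive:.3f}")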

In Part II, the researchers examined a Bayesian approach to quasi-experimental/observational studies. The researchers focused particularly on the use of propensity score adjustment procedures for addressing nonequivalence in such studies. Classical approaches to propensity score adjustment cannot account for uncertainty in model parameters or model choice, either of which can affect causal inferences after propensity score adjustment. The project compared Bayesian propensity score adjustments under varying prior distributions on model parameters that reflect reasonable choices found in education settings, including situations in which little or no prior information is available. Specifically, a Bayesian approach to propensity score stratification, weighting, and optimal matching was developed. The researchers also examined Bayesian model averaging as a means of drawing strength from various realistic models for the propensity score. Analyses of data from the Early Childhood Longitudinal Study were used to supplement the simulation studies.
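
The sketch below is a generic illustration of the two-step logic behind a Bayesian propensity score analysis, not the project's actual estimation machinery: uncertainty in the propensity (logistic) model parameters is propagated into an inverse-probability-weighted treatment effect. It uses a Laplace (normal) approximation to the posterior, and the simulated covariates, prior scale, and sample size are assumptions chosen only for illustration.

    # Rough sketch of a two-step Bayesian propensity score analysis:
    # (1) approximate the posterior of the propensity (logistic) model,
    # (2) recompute an inverse-probability-weighted treatment effect for each
    #     posterior draw so parameter uncertainty reaches the causal estimate.
    # Data generation, prior scale, and sample size are illustrative assumptions.
    import numpy as np
    from scipy import optimize

    rng = np.random.default_rng(7)
    n = 500
    X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])   # intercept + 2 covariates
    true_b = np.array([-0.3, 0.8, -0.5])
    z = rng.binomial(1, 1 / (1 + np.exp(-X @ true_b)))           # treatment assignment
    y = 0.5 * z + X[:, 1] - 0.7 * X[:, 2] + rng.normal(size=n)   # outcome, true effect 0.5

    prior_var = 4.0  # weakly informative normal prior on the logistic coefficients

    def neg_log_post(b):
        eta = X @ b
        loglik = np.sum(z * eta - np.logaddexp(0.0, eta))
        logprior = -0.5 * np.sum(b ** 2) / prior_var
        return -(loglik + logprior)

    # Step 1: Laplace approximation -- MAP estimate plus a normal approximation
    # to the posterior with covariance given by the inverse Hessian.
    map_fit = optimize.minimize(neg_log_post, np.zeros(3))
    p_hat = 1 / (1 + np.exp(-X @ map_fit.x))
    hessian = X.T @ (X * (p_hat * (1 - p_hat))[:, None]) + np.eye(3) / prior_var
    cov = np.linalg.inv(hessian)

    # Step 2: for each posterior draw, recompute propensity scores and the
    # weighted difference in means (IPW estimate of the average treatment effect).
    draws = rng.multivariate_normal(map_fit.x, cov, size=2000)
    ates = []
    for b in draws:
        ps = np.clip(1 / (1 + np.exp(-X @ b)), 0.01, 0.99)
        w1, w0 = z / ps, (1 - z) / (1 - ps)
        ates.append(np.sum(w1 * y) / np.sum(w1) - np.sum(w0 * y) / np.sum(w0))

    ates = np.array(ates)
    print(f"posterior mean ATE = {ates.mean():.3f}, 95% interval = "
          f"({np.percentile(ates, 2.5):.3f}, {np.percentile(ates, 97.5):.3f})")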

Part III extended recent developments in Bayesian growth mixture modeling to the general growth mixture modeling case, with particular attention to piecewise growth mixture modeling. The researchers examined the properties of the Bayesian piecewise growth mixture model under a variety of realistic conditions through simulation studies. Bayesian piecewise growth mixture modeling was then applied to samples of data drawn from the World-Class Instructional Design and Assessment (WIDA) Consortium addressing response to intervention for English language learners.
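
For readers unfamiliar with the model in Part III, the short sketch below simulates repeated measures from a two-class piecewise (bilinear) growth mixture model with a single knot; the class proportions, growth coefficients, knot location, and variance components are hypothetical values chosen only to show the model's structure, not estimates from the project.

    # Illustrative simulation of a two-class piecewise growth mixture model:
    # each latent class has its own intercept and pre-/post-knot slopes, and
    # each person gets a random intercept. All parameter values are hypothetical.
    import numpy as np

    rng = np.random.default_rng(42)
    n, times, knot = 300, np.arange(6), 2          # 6 waves, knot after wave 2
    phase1 = np.minimum(times, knot)               # growth before the knot
    phase2 = np.maximum(times - knot, 0)           # growth after the knot

    # Class-specific fixed effects: [intercept, slope before knot, slope after knot]
    betas = np.array([[10.0, 2.0, 0.5],            # class 1: early growth, then flattening
                      [ 8.0, 0.5, 2.0]])           # class 2: delayed growth after the knot
    mix = np.array([0.6, 0.4])                     # latent class proportions

    cls = rng.choice(2, size=n, p=mix)             # latent class membership
    b0 = rng.normal(0.0, 1.0, size=n)              # person-specific random intercepts
    design = np.column_stack([np.ones_like(times), phase1, phase2])   # waves x 3
    mu = design @ betas[cls].T                     # waves x persons mean trajectories
    y = (mu + b0 + rng.normal(0.0, 1.0, size=(len(times), n))).T      # persons x waves

    print("mean trajectory, class 1:", y[cls == 0].mean(axis=0).round(2))
    print("mean trajectory, class 2:", y[cls == 1].mean(axis=0).round(2))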

Products and Publications

Book chapter

Kaplan, D., and Depaoli, S. (2013). Bayesian Statistical Methods. In T.D. Little (Ed.), The Oxford Handbook of Quantitative Methods, Volume 1: Foundations, (pp. 407–437). Oxford: Oxford University Press.

Kaplan, D., and Depaoli, S. (2012). Bayesian Structural Equation Modeling. In R. Hoyle (Ed.), Handbook of Structural Equation Modeling (pp. 650–673). New York: Guilford Publications, Inc.

Kaplan, D., and Park, S. (2014). Analyzing International Large-Scale Assessment Data Within a Bayesian Framework. In L. Rutkowski, M. von Davier, and D. Rutkowski (Eds.), A Handbook of International Large-Scale Assessment: Background, Technical Issues, and Methods of Data Analysis (pp. 547–579). New York: CRC Press.

Journal article, monograph, or newsletter

Chen, J., and Kaplan, D. (2015). Covariate Balance in Bayesian Propensity Score Approaches for Observational Studies. Journal of Research on Educational Effectiveness, 8(2), 280–302.

Kaplan, D., and Chen, J. (2012). A Two-Step Bayesian Approach for Propensity Score Analysis: Simulations and Case Study. Psychometrika, 77(3), 581–609.

Kaplan, D., and Chen, J. (2014). Bayesian Model Averaging for Propensity Score Analysis. Multivariate Behavioral Research, 49(6), 505–517.

Kaplan, D., and Lee, C. (2016). Bayesian Model Averaging Over Directed Acyclic Graphs With Implications for the Predictive Performance of Structural Equation Models. Structural Equation Modeling: A Multidisciplinary Journal, 23(3), 343–353.

Kuger, S., Kluczniok, K., Kaplan, D., and Rossbach, H.G. (2016). Stability and Patterns of Classroom Quality in German Early Childhood Education and Care. School Effectiveness and School Improvement, 27(3), 418–440.

Park, S., and Kaplan, D. (2015). Bayesian Causal Mediation Analysis for Group Randomized Designs With Homogeneous and Heterogeneous Effects: Simulation and Case Study. Multivariate Behavioral Research, 50(3), 316–333.

van de Schoot, R., Kaplan, D., Denissen, J., Asendorpf, J.B., Neyer, F.J., and van Aken, M.A. (2014). A Gentle Introduction to Bayesian Analysis: Applications to Developmental Research. Child Development, 85(3), 842–860.

