Research insights

IES Makes 13 Awards to Statistical and Research Methodology in Education Projects

NCER
October 25, 2024
By: IES Staff

IES is pleased to announce the newest set of Statistical and Research Methodology in Education (Stats/Methods) investments: 13 projects, nine of which will create innovative methodological products and four of which will develop toolkits to help education scientists understand and apply recently developed methods in their work. This set of Stats/Methods projects will receive more than $9.3 million in funding over the next three years.   

Projects funded under the Stats/Methods program support the development of products (for example, new and improved methods, toolkits, guidelines, review papers, compendia, curated data resources, and software) that help education scientists as they strive for rigor in their research. The Stats/Methods program has funded several widely used statistical software packages, such as Stan, HLM, and Blimp. Stats/Methods projects have also produced papers and presentations that have advanced the theory and practice of randomized trials, psychometrics, and Bayesian statistics in education.

The latest awards from the Stats/Methods program focus on four different areas to support education research. Collectively, these 13 projects will result in innovative products, including templates, new methods, software, tools, practice guides, visual displays, databases, and language models that researchers can use to improve the rigor of education research.

  • Measurement and Value-Added Analyses Projects: These projects help education researchers develop better measurement models of learner, teacher, and school outcomes, so that they can better understand the effectiveness of educational interventions.
    • Resources to Standardize Cost Analysis and Cost-effectiveness Analysis of Educational Interventions is developing a template that will make it simple for education researchers to follow standards when analyzing cost outcomes of their interventions.  
    • Variational Methods and Factor Regularization: Analyzing Complex Large-Scale Assessments with High-Dimensional Covariates is developing methods for analyzing outcomes of national-scale assessments, such as NAEP, within subpopulations and in settings where test items have complex dependencies.
    • Expanding the Functionality and Accessibility of Software for Diagnostic Measurement is developing user-friendly software to score educational tests using diagnostic classification models, in which test-takers are classified as proficient or non-proficient on a set of skills (a toy sketch of this idea appears after this list).
  • Causal Inference Projects: These projects help education researchers better estimate the impact of interventions on learner, teacher, and school outcomes.
    • Design-Comparable Incidence Rate Ratio (IRR) Effect Sizes for Count Data in Single-Case Experimental Designs is developing methods to help researchers include single-case studies with count data in meta-analyses (a worked IRR example appears after this list).
    • Methods for Investigating Causal Mechanisms in Single-Case Experimental Designs is developing methods for meta-analyzing contextual effects from studies with data on single participants.
    • Statistical Innovations for Balancing Weight Methods in Education Research is developing methods for analyzing quasi-experimental data that control for bias from pre-intervention differences while properly accounting for the tendency of students from the same school to be more similar to one another than to students from other schools.
    • What, when, and for whom? Principled estimation of effect heterogeneity across multiple treatments, outcomes, and groups is developing tools that use Bayesian tree-based models to help education researchers estimate how much a replication of their results is likely to differ from the original study. Because these models are highly flexible, they help researchers avoid underestimating how much results will vary in a replication.
    • Improving the Estimation of Site-Specific Effects and their Distribution in Multisite Trials: Practical Tools and Guidelines is developing methods for studying the statistical distribution of the effects of interventions across different sites in studies that sample from a large geographical area. Previously, this kind of analysis was possible only under very restrictive assumptions about the data. This project will explore how changing those assumptions affects interpretation and will identify the best alternative assumptions.
  • Experimental and Interventional Design and Planning in Context Projects: These projects help education researchers plan and design better experiments.
    • Propensity Score Matching in Multilevel Educational Settings: Review of Design Considerations and Approaches is developing a practice guide that will communicate best practices in applying propensity scores when designing or analyzing quasi-experiments.
    • The Higher Education Randomized Controlled Trials (THE-RCT) – Supporting High Quality and Efficient Postsecondary RCTs and Open Science Practices is developing a set of resources to enable education researchers to plan and design high-quality randomized controlled trials, from outcome selection and statistical power analysis to templates for Institutional Review Board forms and data agreements.
  • Data Tools, Computational Statistics, and Machine Learning Projects: These projects help education researchers to use novel methods from computational and data science so that they can better understand large, complex data sets, such as those gathered from online learning platforms.
    • Developing and Evaluating Tools for Communicating Statistical Evidence to Education Decision-Makers is developing and validating a novel, rigorous way to visually summarize studies for education clearinghouses by applying techniques from human-computer interaction and behavioral economics.
    • The Item Response Warehouse: Empowering data-driven psychometrics research is developing a database of education assessment data that will support development of new kinds of psychometric models, such as those that give more relevant scores by incorporating process data (for example, the order in which a student answered the test questions and the time spent on each answer).
    • Methods and Software to Classify College Courses at Scale is developing language models that will greatly increase the efficiency of institutional research projects by automatically coding course titles, numbers, and descriptions to the College Course Map. Harmonizing transcript data across institutions currently depends on human coders, who require substantial training and effort to align course descriptions to a common codebook; the new models will minimize that need.
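
To make the idea of diagnostic classification concrete, here is a minimal sketch in Python of the simplest such model, a DINA-style rule: a test-taker is expected to answer an item correctly only if they have mastered every skill the item requires. This is an illustration only, not the project's software; the item names, skill names, and responses are invented.

```python
# Toy illustration of a diagnostic classification model (DINA-style rule).
# All item names, skills, and responses below are invented for illustration.

# Q-matrix: which skills each item requires (1 = required)
Q = {
    "item1": {"fractions": 1, "decimals": 0},
    "item2": {"fractions": 1, "decimals": 1},
    "item3": {"fractions": 0, "decimals": 1},
}

# Candidate skill profiles for one test-taker (1 = proficient)
profiles = [
    {"fractions": 0, "decimals": 0},
    {"fractions": 1, "decimals": 0},
    {"fractions": 0, "decimals": 1},
    {"fractions": 1, "decimals": 1},
]

responses = {"item1": 1, "item2": 0, "item3": 0}  # observed answers

def expected(profile, item):
    """Ideal DINA response: correct only if every required skill is mastered."""
    return int(all(profile[s] for s, req in Q[item].items() if req))

# Pick the profile that reproduces the most observed responses;
# a real DCM would instead fit a likelihood with slip/guess parameters.
best = max(profiles,
           key=lambda p: sum(expected(p, i) == r for i, r in responses.items()))
print("Most consistent skill profile:", best)
```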
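
And to illustrate the quantity at the heart of the IRR project, here is a toy Python calculation of an incidence rate ratio: the rate of a counted behavior during the intervention phase divided by its rate during baseline. The session counts are invented, and the project's actual design-comparable methods involve much more than this simple ratio.

```python
# Toy incidence rate ratio (IRR) for count data from a single-case design.
# Session counts below are invented for illustration.

baseline_counts = [6, 8, 7, 9]         # behaviors per baseline session
intervention_counts = [3, 2, 4, 2, 1]  # behaviors per intervention session

baseline_rate = sum(baseline_counts) / len(baseline_counts)          # 7.5
intervention_rate = sum(intervention_counts) / len(intervention_counts)  # 2.4

irr = intervention_rate / baseline_rate
print(f"IRR = {irr:.2f}")  # values below 1 indicate a reduction in the behavior
```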

IES is looking forward to partnering with these project teams to advance education research, policy, and practice through the development and dissemination of innovative methods.


This blog was written by Charles Laurin (Charles.Laurin@ed.gov), NCER program officer.
