Information on IES-Funded Research
Grant Closed

Psychometric Models for 21st Century Educational Survey Assessments

NCER
Program: Statistical and Research Methodology in Education
Program topic(s): Core
Award amount: $1,197,301
Principal investigator: Frank Rijmen
Awardee: Educational Testing Service (ETS)
Year: 2011
Project type: Methodological Innovation
Award number: R305D110027

Purpose

The project developed models for the statistical analysis of data from large-scale survey assessments, including the National Assessment of Educational Progress (NAEP), the Trends in International Mathematics and Science Study (TIMSS), the Progress in International Reading Literacy Study (PIRLS), and the Programme for International Student Assessment (PISA). These models reflect recent changes in assessment frameworks, such as the increased use of technology and the increased integration of tasks. The psychometric models are characterized by a structured high dimensionality that closely mirrors how recent assessment frameworks specify the relationship between tasks, the underlying content domains, and the cognitive processes required to solve the tasks. For example, reading item clusters are nested within text types (content), which are crossed with reading processes. As a result, the dimensions capturing individual differences related to reading item clusters are nested within the dimensions for text types, which in turn are crossed with the dimensions corresponding to reading processes. In addition, the models take into account the multilevel structure of the samples.
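The nested-and-crossed loading structure described above can be illustrated with a small sketch. The item classifications below (text types, reading processes, and item clusters) are hypothetical, not the project's actual NAEP or PIRLS design; the point is only how a confirmatory loading matrix encodes which dimensions each item measures:

```python
import numpy as np

# Hypothetical classification of 6 reading items: each item belongs to one
# text type (content) and one reading process, and sits inside an item
# cluster that is nested within its text type.
text_types = ["literary", "literary", "literary", "info", "info", "info"]
processes  = ["retrieve", "interpret", "interpret", "retrieve", "retrieve", "interpret"]
clusters   = ["L1", "L1", "L2", "I1", "I2", "I2"]

def design_matrix(labels):
    """0/1 indicator matrix mapping items to the dimensions they load on."""
    levels = sorted(set(labels))
    return np.array([[1 if lab == lev else 0 for lev in levels] for lab in labels])

# One column block per classification scheme; stacking them side by side
# gives the confirmatory loading structure of the cross-classified model.
Q = np.hstack([design_matrix(text_types),   # content dimensions
               design_matrix(processes),    # process dimensions (crossed)
               design_matrix(clusters)])    # cluster dimensions (nested in content)

print(Q.shape)  # (6 items, 2 content + 2 process + 4 cluster dimensions)
```

Each row (item) loads on exactly one content dimension, one process dimension, and one cluster dimension, so the dimensionality is high but structured rather than exploratory.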

Project Activities

The project formulated multidimensional item response theory models within the generalized linear and nonlinear mixed model framework. Graphical model theory was used to assess the computational complexity of the models, and efficient maximum likelihood estimation methods were derived for those models whose computational burden could be reduced by exploiting the conditional independence relations implied by the model. Models of varying complexity were formulated: a confirmatory structure reflecting one item classification scheme (either content- or process-based), a confirmatory structure reflecting the cross-classification of items along both content domains and cognitive processes, and a confirmatory structure reflecting both the cross-classification of items and the effect of item clusters. The models that allowed for efficient maximum likelihood estimation were applied to NAEP or another large-scale educational survey assessment. The results were disseminated, and the research software used to estimate the models was made available on a website dedicated to this project. The software was developed within a general framework that integrates mixed models with graphical models.
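To make the estimation idea concrete, here is a minimal sketch of marginal maximum likelihood for a one-dimensional Rasch model, using Gauss-Hermite quadrature to integrate out the latent trait. This is the textbook starting point, not the project's estimator; the project's contribution is extending this kind of likelihood to high-dimensional structured models by exploiting conditional independence via graphical models. The item difficulties and sample sizes below are invented for illustration:

```python
import numpy as np
from numpy.polynomial.hermite_e import hermegauss
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Simulate Rasch responses: P(correct) = sigmoid(theta - b)
n_persons, n_items = 500, 5
true_b = np.linspace(-1, 1, n_items)          # invented item difficulties
theta = rng.standard_normal(n_persons)        # latent trait ~ N(0, 1)
y = (rng.random((n_persons, n_items)) <
     1 / (1 + np.exp(-(theta[:, None] - true_b)))).astype(int)

# Probabilists' Gauss-Hermite nodes/weights for integrating over N(0, 1)
nodes, weights = hermegauss(21)
weights = weights / weights.sum()

def marginal_loglik(b):
    # p[q, j] = P(correct | theta = node q, item j)
    p = 1 / (1 + np.exp(-(nodes[:, None] - b[None, :])))
    # Conditional independence of items given theta lets the likelihood
    # factor into a product over items at each quadrature node.
    lik_at_node = np.prod(np.where(y[:, None, :] == 1, p, 1 - p), axis=2)
    return np.log(lik_at_node @ weights).sum()

fit = minimize(lambda b: -marginal_loglik(b), np.zeros(n_items))
print(np.round(fit.x, 2))  # estimated difficulties, close to true_b
```

With one latent dimension a 21-node quadrature is cheap; with the crossed and nested dimensions described above, naive quadrature grows exponentially in the number of dimensions, which is what motivates the graphical-model factorizations and the approximation methods the project developed.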

People and institutions involved

IES program contact(s)

Allen Ruby

Associate Commissioner for Policy and Systems
NCER

Products and publications

Journal article, monograph, or newsletter

Jeon, M., and De Boeck, P. (2016). A Generalized Item Response Tree Model for Psychological Assessments. Behavior Research Methods, 48(3), 1070-1085.

Jeon, M., and Rabe-Hesketh, S. (2016). An Autoregressive Growth Model for Longitudinal Item Analysis. Psychometrika, 81(3), 830-850.

Jeon, M., and Rijmen, F. (2016). A Modular Approach for Item Response Theory Modeling With the R Package Flirt. Behavior Research Methods, 48(2), 742-755.

Jeon, M., and Rijmen, F. (2014). Recent Developments in Maximum Likelihood Estimation of MTMM Models for Categorical Data. Frontiers in Psychology, 5, 269.

Jeon, M., Rijmen, F., and Rabe-Hesketh, S. (2013). Modeling Differential Item Functioning Using a Generalization of the Multiple-Group Bifactor Model. Journal of Educational and Behavioral Statistics, 38(1), 32-60.

Jeon, M., Rijmen, F., and Rabe-Hesketh, S. (2014). Flexible Item Response Theory Modeling With FLIRT. Applied Psychological Measurement, 38(5), 404-405.

Rijmen, F. (2011). The Latent Class Model as a Measurement Model for Situational Judgment Tests. Psychologica Belgica, 51(3), 197-212.

Rijmen, F. (2011). Hierarchical Factor Item Response Theory Models for PIRLS: Capturing Clustering Effects at Multiple Levels. IERI Monograph Series: Issues and Methodologies in Large-Scale Assessments, 4, 59-74.

Rijmen, F., and Jeon, M. (2013). Fitting an Item Response Theory Model With Random Item Effects Across Groups by a Variational Approximation Method. Annals of Operations Research, 206(1), 647-662.

Rijmen, F., Jeon, M., von Davier, M., and Rabe-Hesketh, S. (2014). A Third-Order Item Response Theory Model for Modeling the Effects of Domains and Subdomains in Large-Scale Educational Assessment Surveys. Journal of Educational and Behavioral Statistics, 39(4), 235-256.

van Rijn, P. W., and Rijmen, F. (2015). On the Explaining-Away Phenomenon in Multivariate Latent Variable Models. British Journal of Mathematical and Statistical Psychology, 68(1), 1-22.

Related projects

Developing Enhanced Assessment Tools for Capturing Students' Procedural Skills and Conceptual Understanding in Math

R324A150035

Generalized, Multilevel, and Longitudinal Psychometric Models for Evaluating Educational Interventions

R305D220020

Supplemental information

Co-principal investigator: Matthias von Davier

The project also developed alternatives to numerical integration for those models whose computational burden remains high even after exploiting the conditional independence relations. Both stochastic and variational approximation techniques were developed and evaluated using simulated data. Successful estimation methods were applied to NAEP or another large-scale educational survey assessment. The results were disseminated through a presentation at a statistical or psychometric conference, and research software for the sampling-based and variational methods was made available on the project website.
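The stochastic-approximation idea can be illustrated in miniature: replace a deterministic quadrature over the latent trait with Monte Carlo draws from its distribution. This is a generic sketch of the principle (plain Monte Carlo, with invented difficulties and a single response pattern), not the project's actual sampling-based estimator:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy illustration: estimate the marginal probability of a response pattern
# by averaging the conditional likelihood over random draws of theta ~ N(0, 1),
# instead of summing over fixed quadrature nodes.
b = np.array([-0.5, 0.0, 0.5])   # hypothetical item difficulties
y = np.array([1, 0, 1])          # one person's responses (correct/incorrect)

draws = rng.standard_normal(100_000)                  # theta samples
p = 1 / (1 + np.exp(-(draws[:, None] - b)))           # (draws, items)
lik = np.prod(np.where(y == 1, p, 1 - p), axis=1)     # per-draw likelihood
marginal = lik.mean()                                 # Monte Carlo estimate of P(y)
print(round(float(marginal), 3))
```

The appeal for high-dimensional models is that the cost of sampling grows with the number of draws, not exponentially with the number of latent dimensions, at the price of Monte Carlo error that must be controlled.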

Questions about this project?

To answer additional questions about this project or provide feedback, please contact the program officer.

 

Tags

Data and Assessments, Cognition, Mathematics


