Information on IES-Funded Research
Grant Closed

Improving Education Policy Analysis and Practice by Addressing Test Score Measurement Error

NCER
Program: Statistical and Research Methodology in Education
Program topic(s): Core
Award amount: $751,674
Principal investigator: J. R. Lockwood
Awardee: Educational Testing Service (ETS)
Year: 2014
Project type: Methodological Innovation
Award number: R305D140032

People and institutions involved

IES program contact(s)

Allen Ruby

Associate Commissioner for Policy and Systems
NCER

Products and publications

Journal article, monograph, or newsletter

Lockwood, J.R., and Castellano, K.E. (2017). Estimating True Student Growth Percentile Distributions Using Latent Regression Multidimensional IRT Models. Educational and Psychological Measurement, 77(6), 917-944.

Lockwood, J.R., and Castellano, K.E. (2015). Alternative Statistical Frameworks for Student Growth Percentile Estimation. Statistics and Public Policy, 2(1), 1-9.

Lockwood, J.R., and McCaffrey, D.F. (2017). Simulation-Extrapolation With Latent Heteroskedastic Error Variance. Psychometrika, 82(3), 717-736.

Lockwood, J.R., and McCaffrey, D.F. (2016). Matching and Weighting With Functions of Error-Prone Covariates for Causal Inference. Journal of the American Statistical Association, 111(516), 1831-1839.

Lockwood, J.R., and McCaffrey, D.F. (2015). Simulation-Extrapolation for Estimating Means and Causal Effects With Mismeasured Covariates. Observational Studies, 1, 241-290.

Weiss, M.J., Lockwood, J.R., and McCaffrey, D.F. (2016). Estimating the Standard Error of the Impact Estimator in Individually Randomized Trials With Clustering. Journal of Research on Educational Effectiveness, 9(3), 421-444.

Supplemental information

Co-Principal Investigator: Daniel McCaffrey

Test scores are instrumental in education research and decision making. They are being used for a growing number of high-stakes decisions about students, teachers, and schools, and to measure the effectiveness of education policies and interventions. In observational studies where treatment and control groups of students typically differ on attributes related to achievement, test scores are often the key control variables used to adjust for pre-existing differences among groups. When test scores are used as covariates, test score measurement error can bias parameter estimates from statistical models. Observational analyses in education applications that use test scores as control variables, whether for the estimation of treatment effects, teacher value-added effects, student growth distributions, or peer effects, risk potentially large biases if measurement error is not addressed.
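The bias described above can be sketched numerically. The following snippet (an illustrative sketch with invented numbers, in Python rather than the project's R tools) shows classical measurement error in a covariate attenuating its regression coefficient toward zero, and the standard reliability-based correction recovering the true value:

```python
import numpy as np

# Hypothetical illustration, not taken from the project: classical
# measurement error in a covariate attenuates its OLS coefficient.
rng = np.random.default_rng(0)
n = 100_000
true_score = rng.normal(0.0, 1.0, n)           # latent "true" test score
outcome = 2.0 * true_score + rng.normal(0.0, 1.0, n)

sigma_e = 0.8                                  # measurement error SD (assumed)
observed = true_score + rng.normal(0.0, sigma_e, n)

def ols_slope(x, y):
    """Slope of a one-predictor OLS regression of y on x."""
    xc = x - x.mean()
    return float(np.dot(xc, y) / np.dot(xc, xc))

naive = ols_slope(observed, outcome)           # attenuated toward zero
reliability = 1.0 / (1.0 + sigma_e**2)         # Var(true) / Var(observed) here
corrected = naive / reliability                # classical disattenuation

print(f"naive: {naive:.2f}, corrected: {corrected:.2f}")
```

With these settings the naive slope lands near 1.22 rather than the true 2.0, which is the kind of large bias the project aims to help analysts avoid.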

Tools and guidance for properly adjusting for measurement error are lacking. This project fills that gap in several ways. The research team will conduct case studies of teacher value-added estimation that apply different measurement error correction approaches, both to understand their behavior in real data and to answer outstanding questions about the properties and feasibility of these approaches. The case studies will yield new knowledge and methodological recommendations that allow analysts to make more informed decisions when analyzing observational data. The researchers will also build a library for the popular statistical software package R that provides functions implementing all common approaches to measurement error correction using test scores and conditional standard error of measurement (CSEM) information. Education researchers will thus have relatively user-friendly access to these models and to the recommendations derived from this project.
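One of the correction approaches studied under this grant, simulation-extrapolation (SIMEX; see the Lockwood and McCaffrey publications above), can be sketched briefly. The idea: deliberately add extra measurement error at several multipliers of the known error variance, observe how the estimate degrades, and extrapolate the trend back to zero error. This is an illustrative Python sketch with invented data and names, not the project's R library:

```python
import numpy as np

# Hypothetical SIMEX sketch (all names and numbers are invented).
rng = np.random.default_rng(1)
n = 50_000
true_x = rng.normal(size=n)
y = 1.5 * true_x + rng.normal(size=n)
sigma_e = 0.7                                  # known measurement error SD
w = true_x + rng.normal(0.0, sigma_e, n)       # error-prone observed score

def ols_slope(x, y):
    xc = x - x.mean()
    return float(np.dot(xc, y) / np.dot(xc, xc))

lambdas = np.array([0.0, 0.5, 1.0, 1.5, 2.0])  # extra-noise variance multipliers
n_sims = 50
mean_slopes = []
for lam in lambdas:
    # Add noise with variance lam * sigma_e**2 on top of the existing error,
    # and average the resulting slope over repeated simulations.
    sims = [ols_slope(w + rng.normal(0.0, np.sqrt(lam) * sigma_e, n), y)
            for _ in range(n_sims)]
    mean_slopes.append(float(np.mean(sims)))

# Fit a quadratic in lambda and extrapolate to lambda = -1 (zero total error).
coefs = np.polyfit(lambdas, mean_slopes, deg=2)
simex = float(np.polyval(coefs, -1.0))
print(f"naive: {mean_slopes[0]:.2f}, SIMEX: {simex:.2f}")
```

Here the naive estimate sits near 1.0 against a true slope of 1.5; the quadratic extrapolation recovers much, though not all, of the attenuation, which is the known trade-off of SIMEX's approximate extrapolant.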

Questions about this project?

To answer additional questions about this project or provide feedback, please contact the program officer.


Tags

Mathematics, Data and Assessments


