Information on IES-Funded Research
Grant Closed

Reducing Bias and Improving Efficiency of Estimated Teacher Effects from Value-Added Models

NCER
Program: Statistical and Research Methodology in Education
Program topic(s): Core
Award amount: $939,937
Principal investigator: J. R. Lockwood
Awardee: RAND Corporation
Year: 2009
Project type: Methodological Innovation
Award number: R305D090011

Purpose

Value-added models (VAMs) use longitudinal student-level test score data and complex statistical models to generate estimates of the effects of individual teachers on student learning. This holds the promise of providing fair comparisons of teachers who teach different types of students. These estimates have heightened interest in teachers' roles in student academic outcomes, and policymakers have shown rapidly growing interest in using these measures for high-stakes decisions about teachers, including performance pay and tenure. To support such uses, the measures must be accurate, unbiased, and precise; that is, they cannot systematically distort the performance of individual teachers, and they must be reliable enough to support decision-making. However, no consensus has been reached on the best VAM methods. Three sources of bias in VAM teacher effects have repeatedly emerged as potential concerns: (1) non-random assignment of students to schools and to teachers within schools; (2) mistaken assumptions about the persistence of teacher effects and other inputs to student achievement; and (3) inconsistency between achievement test scaling and modeling assumptions. This project entails three components aimed at improving VAM.
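The core idea can be illustrated with a minimal covariate-adjustment VAM. The Python sketch below simulates classrooms, fits a mixed model of current scores on prior scores with a random intercept for each teacher, and treats the predicted random effects as estimated teacher effects. The column names, simulated data, and use of statsmodels are illustrative assumptions, not the project's actual models or data.

# Minimal sketch of a covariate-adjustment value-added model (illustrative only).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)

# Simulate 40 classrooms of 25 students, each with a true teacher effect.
n_teachers, n_students = 40, 25
true_effect = rng.normal(0.0, 0.2, n_teachers)
rows = []
for j in range(n_teachers):
    prior = rng.normal(0.0, 1.0, n_students)
    score = 0.7 * prior + true_effect[j] + rng.normal(0.0, 0.5, n_students)
    rows.append(pd.DataFrame({"teacher_id": j, "prior_score": prior, "score": score}))
data = pd.concat(rows, ignore_index=True)

# Covariate-adjustment VAM: regress current score on prior score with a random
# intercept per teacher; the predicted random effects serve as (shrunken)
# estimates of individual teacher effects.
fit = smf.mixedlm("score ~ prior_score", data, groups=data["teacher_id"]).fit()
estimates = np.array([fit.random_effects[j].iloc[0] for j in range(n_teachers)])
print("Correlation with simulated true effects:",
      round(float(np.corrcoef(estimates, true_effect)[0, 1]), 3))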

Project Activities

The first component will combine econometric and statistical modeling approaches to VAM by developing modifications to each class of models to address their respective potential sources of bias. The modifications to the econometric models will relax their restrictive assumptions about persistence and scaling, while the modifications to the statistical models will relax their restrictive assumptions about the selection of students into schools and teachers' classrooms. The project will then test the modifications using both simulated and actual student achievement data and compare the four classes of models (standard econometric, modified econometric, standard statistical, modified statistical) in an extensive simulation study, assessing their relative performance under a wide array of empirically motivated data-generating models that vary in their specifications of non-random assignment, persistence, and scaling.
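A toy version of such a simulation study, under stated assumptions (a single prior-score covariate, students sorted into classrooms by prior achievement, and two deliberately simple estimators standing in for richer model classes), might look like the following Python sketch.

# Toy simulation: non-random assignment of students to teachers, then compare
# how well two simple estimators recover the true teacher effects.
import numpy as np

rng = np.random.default_rng(1)
n_teachers, n_students = 50, 20

def simulate():
    true_effect = rng.normal(0.0, 0.2, n_teachers)
    prior = rng.normal(0.0, 1.0, n_teachers * n_students)
    # Non-random assignment: sort students by prior score so that
    # higher-achieving students are concentrated in certain classrooms.
    order = np.argsort(prior)
    teacher = np.empty_like(order)
    teacher[order] = np.repeat(np.arange(n_teachers), n_students)
    score = 0.7 * prior + true_effect[teacher] + rng.normal(0.0, 0.5, prior.size)
    return prior, score, teacher, true_effect

def gain_score_estimates(prior, score, teacher):
    # Gain-score estimator: average (current score - prior score) per classroom.
    gains = score - prior
    return np.array([gains[teacher == j].mean() for j in range(n_teachers)])

def covariate_adjustment_estimates(prior, score, teacher):
    # Covariate adjustment: pooled regression of score on prior score,
    # then average the residuals within each classroom.
    slope, intercept = np.polyfit(prior, score, 1)
    resid = score - (slope * prior + intercept)
    return np.array([resid[teacher == j].mean() for j in range(n_teachers)])

prior, score, teacher, true_effect = simulate()
for name, est in [("gain score", gain_score_estimates(prior, score, teacher)),
                  ("covariate adjustment", covariate_adjustment_estimates(prior, score, teacher))]:
    err = np.abs((est - est.mean()) - (true_effect - true_effect.mean())).mean()
    print(f"{name}: corr with truth = {np.corrcoef(est, true_effect)[0, 1]:.3f}, "
          f"mean abs error (centered) = {err:.3f}")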

People and institutions involved

IES program contact(s)

Allen Ruby

Associate Commissioner for Policy and Systems
NCER

Products and publications

Book chapter

Lockwood, J. R., and McCaffrey, D. F. (2014). Should Nonlinear Functions of Test Scores Be Used as Covariates in a Regression Model? In Value-Added Modeling and Growth Modeling with Particular Application to Teacher and School Effectiveness. Information Age Publishing.

McCaffrey, D. F., Han, B., and Lockwood, J. R. (2014). Using Auxiliary Teacher Data to Improve Value-Added: An Application of Small Area Estimation to Middle School Mathematics Teachers. In Value-Added Modeling and Growth Modeling with Particular Application to Teacher and School Effectiveness. Information Age Publishing.

Journal article, monograph, or newsletter

Han, B. (2013). Conditional Akaike Information Criterion in the Fay-Herriot Model. Statistical Methodology, 11: 53-67.

Lockwood, J. R., and McCaffrey, D. F. (2014). Correcting for Test Score Measurement Error in ANCOVA Models for Estimating Treatment Effects. Journal of Educational and Behavioral Statistics, 39(1): 22-52.

McCaffrey, D. F., Lockwood, J. R., and Setodji, C. M. (2013). Inverse Probability Weighting with Error-Prone Covariates. Biometrika, 100(3): 671-680.

McCaffrey, D. F., Lockwood, J. R., Mihaly, K., and Sass, T. (2012). A Review of Stata Routines for Fixed Effects Estimation in Normal Linear Models. The Stata Journal, 12(3): 406-432.

McCaffrey, D. F., and Lockwood, J. R. (2011). Missing Data in Value-Added Modeling of Teacher Effects. Annals of Applied Statistics, 5(2): 773-797.

Supplemental information

Co-Principal Investigator: McCaffrey, Daniel

The project's second component will explore a novel approach to VAM that uses propensity scores to adjust for pre-existing differences among students taught by different teachers. This method has been used successfully in other disciplines to estimate causal effects from observational data but has not been explored in VAM. By relaxing the strict reliance on parametric regression models, propensity score approaches to VAM have the potential to mitigate all three sources of bias. To this end, the project will: (1) use theoretical and empirical information to build flexible models for the probabilities of individual students being in a given teacher's class; (2) examine the balance on observable characteristics these procedures provide across teachers; (3) estimate individual teacher effects using a variety of approaches, including propensity score weighted means, different forms of regression adjustment, and doubly robust methods; and (4) compare the estimated teacher effects and standard errors obtained from these approaches with those from other estimation approaches, exploring the sources of any substantial differences to understand the potential shortcomings of each approach (e.g., regression model misspecification), including cases in which the propensity score methods suggest classrooms cannot be made comparable.
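As an illustration of the weighting and doubly robust ideas (not the project's actual estimators), the following Python sketch estimates one hypothetical focal teacher's effect by modeling the probability of being in that teacher's class as a function of prior achievement, then comparing an unadjusted difference in means, an inverse-probability-weighted estimate, and an augmented (doubly robust) estimate. The variable names and the one-teacher-versus-rest framing are simplifying assumptions.

# Hedged sketch of propensity score weighting and doubly robust estimation.
import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression

rng = np.random.default_rng(2)

# Simulated data: prior score drives both classroom assignment and outcome.
n = 2000
prior = rng.normal(0, 1, n)
in_focal_class = rng.binomial(1, 1 / (1 + np.exp(-(0.8 * prior - 2.0))))
focal_effect = 0.25
score = 0.7 * prior + focal_effect * in_focal_class + rng.normal(0, 0.5, n)

X = prior.reshape(-1, 1)
t = in_focal_class

# (1) Propensity model: probability of being in the focal teacher's class.
ps = LogisticRegression(C=1e6).fit(X, t).predict_proba(X)[:, 1]

# (2) IPW estimate: weighted difference in means with weights 1/ps and 1/(1-ps).
ipw = (np.sum(t * score / ps) / np.sum(t / ps)
       - np.sum((1 - t) * score / (1 - ps)) / np.sum((1 - t) / (1 - ps)))

# (3) Doubly robust (AIPW): combine the propensity weights with outcome
# regressions fit separately in the focal and comparison classrooms.
m1 = LinearRegression().fit(X[t == 1], score[t == 1]).predict(X)
m0 = LinearRegression().fit(X[t == 0], score[t == 0]).predict(X)
aipw = np.mean(m1 - m0
               + t * (score - m1) / ps
               - (1 - t) * (score - m0) / (1 - ps))

print(f"naive difference in means: {score[t == 1].mean() - score[t == 0].mean():.3f}")
print(f"IPW estimate:              {ipw:.3f}")
print(f"doubly robust estimate:    {aipw:.3f}")
print(f"true focal-teacher effect: {focal_effect}")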

The project's third component will address the other known limitation of VAM estimates of teacher effects: low precision, due both to the small samples of students taught by each teacher and to the extensive adjustments required to control for potential biases arising from differences among classrooms in their students' background variables and prior achievement. The project will adapt small-area and shrinkage estimation methods to develop procedures that use observable teacher characteristics (such as experience and credentialing) to improve the precision of VAM teacher measures and that can be used across a broad array of VAM approaches. As part of this work, the project will (1) develop methods that use observable teacher characteristics to pool information across teachers as a means of increasing the precision of estimated teacher effects, and (2) use empirical analyses and simulation to assess the gains in precision these methods provide.
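In the spirit of the Fay-Herriot small-area model cited under the project's publications, the following Python sketch shows one hedged version of this idea: noisy per-teacher estimates with known sampling variances are shrunk toward predictions from an observable teacher characteristic (here, years of experience, an illustrative covariate), with the amount of shrinkage depending on each estimate's precision. The moment-based variance estimator is a simplification, not the project's method.

# Hedged sketch of covariate-based shrinkage for per-teacher estimates.
import numpy as np

rng = np.random.default_rng(3)
n_teachers = 200

# "Direct" per-teacher estimates y_j = theta_j + e_j with known sampling
# variances D_j, where the true effects theta_j are partly predicted by
# an observable teacher characteristic (years of experience).
experience = rng.integers(1, 25, n_teachers)
X = np.column_stack([np.ones(n_teachers), experience])
beta_true = np.array([-0.10, 0.01])
A_true = 0.03                               # variance of effects around X @ beta
theta = X @ beta_true + rng.normal(0, np.sqrt(A_true), n_teachers)
D = rng.uniform(0.02, 0.10, n_teachers)     # sampling variances of direct estimates
y = theta + rng.normal(0, np.sqrt(D))

# Step 1: estimate beta by weighted least squares and the between-teacher
# variance A by a simple method-of-moments update (truncated at zero).
def fit_shrinkage_model(y, X, D, n_iter=20):
    A = max(np.var(y) - D.mean(), 1e-6)
    for _ in range(n_iter):
        w = 1.0 / (A + D)
        beta = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))
        resid = y - X @ beta
        A = max(np.mean(resid**2 - D), 1e-6)
    return beta, A

beta_hat, A_hat = fit_shrinkage_model(y, X, D)

# Step 2: empirical Bayes estimates pull each direct estimate toward its
# covariate-based prediction, more strongly when its sampling variance is large.
gamma = A_hat / (A_hat + D)
theta_eb = gamma * y + (1 - gamma) * (X @ beta_hat)

print(f"RMSE of direct estimates:   {np.sqrt(np.mean((y - theta) ** 2)):.4f}")
print(f"RMSE of shrunken estimates: {np.sqrt(np.mean((theta_eb - theta) ** 2)):.4f}")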

The project will synthesize the findings from its three components in order to propose strategies for providing estimated individual teacher effects that best balance bias reduction and precision.

Questions about this project?

To ask additional questions about this project or to provide feedback, please contact the program officer.


Tags

Data and Assessments, Mathematics, Policies and Standards


