Information on IES-Funded Research
Grant Closed

Value-Added Models and the Measurement of Teacher Quality: Tracking or Causal Effects?

NCER
Program: Education Research Grants
Program topic(s): Teaching, Teachers, and the Education Workforce
Award amount: $294,295
Principal investigator: Jesse Rothstein
Awardee:
National Bureau of Economic Research (NBER)
Year: 2008
Award period: 3 years 6 months (07/01/2008 - 12/31/2011)
Project type:
Measurement
Award number: R305A080560

Purpose

Teacher quality is increasingly recognized as a crucial component of the educational process. If schools are to hire, promote, or compensate on the basis of quality, they need accurate measures of teacher quality. A natural definition of a high-quality teacher is one who improves his or her students' achievement. That definition, however, presumes that we know how to measure a teacher's effect on student achievement. What is needed is a measure of the teacher's causal effect, distinct from the potentially confounding influences of students' individual abilities, families, friends, and neighborhoods. Typical value-added models (VAMs) measure teacher effects using student gain scores (the change in a student's achievement test score from one academic year to the next). The purpose of this project is to validate the use of value-added models for assessing teachers' impacts on student achievement, with a particular focus on the bias in value-added measures that can be induced by non-random assignment of students to teachers, such as ability tracking. If high- and low-ability students are tracked into different classes, the teacher of the high-ability class may appear to have higher value-added than the teacher of the low-ability class even when the two teachers are of equal quality. In addition, the project will attempt to identify specific value-added models that are plausibly free of this bias.

Project Activities

Recent value-added models attempt to deal with the bias introduced by non-random student assignment by including "student fixed effects," but such models avoid bias only if tracking is based solely on students' permanent characteristics. If classroom assignments are instead updated based on each year's performance, student fixed effects value-added models cannot eliminate bias in value-added measures. This study will implement a formal statistical test that can distinguish the two forms of tracking. The first component of the project will be to carry out this test using various samples and various measures of annual academic achievement, through secondary analyses of longitudinal records on all students attending North Carolina public schools, linked to teacher data. Because preliminary evidence suggests that the permanent ability tracking restriction is unlikely to hold, a second component of the project will develop alternative models that can identify teachers' causal effects in the presence of plausible real-world tracking.

Structured Abstract

Setting

The study will analyze North Carolina state public school data.

Sample

The sample includes all available fifth grade students in 2000-01 who could be matched to teachers and for whom reading test gain scores since fourth grade could be computed. After excluding students for whom a gain score could not be computed, students whose teacher could not be identified reliably, students in self-contained special education or honors classes, grade repeaters, students in classes with twelve or fewer students, and students in schools with only one fifth grade teacher, the final sample comprises scores of 59,104 students (about 60 percent of all fifth grade students) and data on their 3,013 teachers.

Research design and methods

The data analyses will test several value-added statistical models under varying assumptions and with corresponding variables. To demonstrate that ability tracking can confound typical value-added analyses, the study will regress fourth grade student gain scores on fifth and sixth grade teacher characteristics, a non-causal sequence of factors that should show no relationship if the model estimates true teacher effects, since future teachers cannot have causal effects on past student achievement. Additional analyses will test whether tracking is based on permanent student characteristics ("ability tracking") or on transient year-to-year changes ("performance tracking"). The study will then test value-added models built on alternative assumptions that more correctly specify the causal structure. Finally, the study will attempt to show how these alternative models can be used for the various kinds of policy decisions about teacher quality to which value-added models are being applied.

Key measures

Key measures include end-of-year tests in reading, student demographic characteristics (e.g., race, free lunch), and teacher demographic and professional background characteristics (e.g., gender, race, majors, degrees, certifications).

Data analytic strategy

The analyses will involve basic and alternative value-added modeling (linear regression).
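A basic gain-score value-added model of the kind referenced above can be illustrated as a linear regression of gains on teacher indicators plus a student covariate, with the teacher coefficients read off as value-added estimates. This is a hypothetical sketch under simulated random assignment, not the project's model; under tracking, these same estimates would be biased, which is the project's central concern.

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 1500, 15  # students, teachers (illustrative sizes)

teacher = rng.integers(0, k, size=n)           # random assignment in this sketch
true_effect = rng.normal(0, 0.2, size=k)       # latent teacher value-added
free_lunch = rng.integers(0, 2, size=n)        # one student covariate
gain = true_effect[teacher] - 0.1 * free_lunch + rng.normal(0, 0.5, n)

# OLS of gain scores on teacher indicators plus the covariate.
X = np.zeros((n, k + 1))
X[np.arange(n), teacher] = 1.0                 # teacher indicator columns
X[:, k] = free_lunch
beta, *_ = np.linalg.lstsq(X, gain, rcond=None)
vam = beta[:k]                                 # estimated teacher effects
corr = np.corrcoef(vam, true_effect)[0, 1]     # agreement with the truth
```

Because assignment is random here, the estimated effects track the true ones closely; replacing the random assignment with ability-sorted classes (as in the falsification sketch) would break that agreement.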

People and institutions involved

IES program contact(s)

Elizabeth Albro

Commissioner of Education Research
NCER

Products and publications

Products: The product of this project will be a fully tested and validated approach to value-added analysis that reduces the bias in existing models due to non-random assignment of students.

Journal article, monograph, or newsletter

Rothstein, J. (2009). Student Sorting and Bias in Value-Added Estimation: Selection on Observables and Unobservables. Education Finance and Policy, 4(4): 537-571.

Rothstein, J. (2010). Teacher Quality in Educational Production: Tracking, Decay, and Student Achievement. The Quarterly Journal of Economics, 125(1): 175-214.

** This project was submitted to and funded under Teacher Quality: Reading and Writing in FY 2008.

Supplemental information

Dataset: The state data set for these analyses is assembled and distributed (in anonymous form) by the North Carolina Education Research Data Center (NCERDC).

Questions about this project?

To answer additional questions about this project or provide feedback, please contact the program officer.

 

Tags

Data and Assessments, Mathematics, Policies and Standards
