Project Activities
Recent value-added models attempt to address the bias introduced by non-random student assignment with "student fixed effects," but these models avoid bias only if tracking is based solely on students' permanent characteristics. If classroom assignments are instead updated based on each year's performance, student fixed effects cannot eliminate the bias in value-added measures. This study will implement a formal statistical test that can distinguish the two forms of tracking. The first component of the proposed project will apply this test to various samples and various measures of annual academic achievement, using secondary analyses of longitudinal records on all students attending North Carolina public schools, linked to teacher data. Because preliminary evidence suggests that the permanent-ability-tracking restriction is unlikely to hold, a second component of the project will develop alternative models that can identify teachers' causal effects in the presence of plausible real-world tracking.
Structured Abstract
Setting
The study will analyze North Carolina state public school data.
Sample
The sample includes all available 5th grade students in 2000-01 who could be matched to teachers and for whom reading test gain scores since fourth grade could be computed. Students were excluded if a gain score could not be computed, if their teacher could not be identified reliably, if their class was a self-contained special education or honors class, if they were grade repeaters, if their class had twelve or fewer students, or if they attended a school with only one fifth grade teacher. The final sample is based on scores of 59,104 students (about 60 percent of all fifth grade students) and data on their 3,013 teachers.
Research design and methods
The data analyses will test several value-added statistical models under varying assumptions and with corresponding variables. To demonstrate that ability tracking may be confounding typical value-added analyses, the study will regress 4th grade student gain scores on 5th grade and 6th grade teacher characteristics, a deliberately non-causal sequence: future teachers cannot have causal effects on past student achievement, so these regressions should show no relationship if true teacher effects are being estimated. Additional analyses will test whether tracking is based on more permanent student characteristics ("ability tracking") or on more transient changes ("performance tracking"). The study will then test value-added models with alternative assumptions that more correctly specify the causal structure. Finally, the study will show how these alternative models can be used for the various kinds of policy decisions about teacher quality to which value-added models are being applied.
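The falsification logic above can be sketched in a small simulation (a purely illustrative Python example with synthetic data; the variable names and simulation setup are assumptions, not the project's actual data or code). When next year's teacher assignment depends on this year's performance, regressing past gains on the future teacher produces a large spurious "effect"; under random assignment it does not.

```python
# Illustrative falsification test: regress PRIOR-year gain scores on a
# FUTURE teacher assignment indicator. Future teachers cannot cause past
# achievement, so a nonzero coefficient signals non-random sorting.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# 4th-grade gain: permanent ability plus transitory noise
ability = rng.normal(0, 1, n)
gain4 = ability + rng.normal(0, 1, n)

def future_teacher_coef(assign_by_performance: bool) -> float:
    """OLS slope from regressing 4th-grade gains on a dummy for being
    assigned to the 'high' 5th-grade classroom."""
    if assign_by_performance:
        # performance tracking: top half of 4th-grade gains -> 'high' class
        high_class = (gain4 > np.median(gain4)).astype(float)
    else:
        # random assignment
        high_class = rng.integers(0, 2, n).astype(float)
    X = np.column_stack([np.ones(n), high_class])
    beta, *_ = np.linalg.lstsq(X, gain4, rcond=None)
    return beta[1]

coef_tracked = future_teacher_coef(True)   # large spurious "effect"
coef_random = future_teacher_coef(False)   # approximately zero
print(f"tracked: {coef_tracked:.2f}, random: {coef_random:.2f}")
```

Under performance tracking the future-teacher coefficient is large even though the teacher had no causal role, which is exactly the pattern the proposed test looks for.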
Key measures
Key measures include end-of-year tests in reading, student demographic characteristics (e.g., race, free lunch), and teacher demographic and professional background characteristics (e.g., gender, race, majors, degrees, certifications).
Data analytic strategy
The analyses will involve basic and alternative value-added modeling (linear regression).
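As a rough illustration of the general form of a basic gain-score value-added regression (an assumed specification with simulated data, not the project's actual model), teacher effects can be estimated as coefficients on teacher indicator variables alongside student covariates:

```python
# Minimal sketch of a gain-score value-added model estimated by OLS:
# gain_i = (teacher effect) + (student covariates) + noise.
# All names and parameter values here are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
n_students, n_teachers = 6_000, 30
teacher = rng.integers(0, n_teachers, n_students)
true_effect = rng.normal(0, 0.2, n_teachers)
free_lunch = rng.integers(0, 2, n_students).astype(float)

gain = true_effect[teacher] - 0.3 * free_lunch + rng.normal(0, 1, n_students)

# Design matrix: one dummy per teacher (no intercept) plus the covariate.
D = np.zeros((n_students, n_teachers))
D[np.arange(n_students), teacher] = 1.0
X = np.column_stack([D, free_lunch])

beta, *_ = np.linalg.lstsq(X, gain, rcond=None)
est_effects = beta[:n_teachers]  # estimated value-added per teacher

corr = np.corrcoef(est_effects, true_effect)[0, 1]
print(f"correlation with true effects: {corr:.2f}")
```

With random assignment, as simulated here, the estimated teacher coefficients track the true effects closely; the bias the project investigates arises when the assignment of students to teachers instead depends on prior performance.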
People and institutions involved
IES program contact(s)
Products and publications
Products: A fully tested and validated approach to value-added analysis that reduces the bias in existing models due to non-random assignment of students.
Journal article, monograph, or newsletter
Rothstein, J. (2009). Student Sorting and Bias in Value-Added Estimation: Selection on Observables and Unobservables. Education Finance and Policy, 4(4): 537-571.
Rothstein, J. (2010). Teacher Quality in Educational Production: Tracking, Decay, and Student Achievement. The Quarterly Journal of Economics, 125(1): 175-214.
** This project was submitted to and funded under Teacher Quality: Reading and Writing in FY 2008.
Supplemental information
Dataset: The state data set for these analyses is assembled and distributed (in anonymous form) by the North Carolina Education Research Data Center (NCERDC).
Questions about this project?
To answer additional questions about this project or provide feedback, please contact the program officer.