Grant Closed

Addressing Practical Problems in Achievement Gap Estimation: Nonparametric Methods for Censored Data

NCER
Program: Statistical and Research Methodology in Education
Program topic(s): Core
Award amount: $697,878
Principal investigator: Sean Reardon
Awardee: Stanford University
Year: 2011
Project type: Methodological Innovation
Award number: R305D110018

Purpose

The project developed and assessed a set of methods for estimating achievement gaps and their standard errors in order to address two practical problems affecting the use of achievement gap statistics: (1) the dependence of traditional gap statistics, such as mean differences and effect sizes, on test scaling decisions; and (2) the reporting of achievement data in categorical terms (e.g., below basic, basic, proficient, and advanced) rather than as test scores and their distributional statistics (e.g., means and standard deviations).

Project Activities

The project built on a recently developed nonparametric approach (sometimes called a "metric-free" approach) to measuring achievement gaps. This approach exploits the transformation-invariance properties of nonparametric statistics and offers a range of theoretical benefits over traditional approaches, but it requires information about the shape of the test score distributions in both groups (not merely their means and standard deviations), and it requires methods for estimating the standard errors of the resulting metric-free gap measures.
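
As a concrete illustration, here is a minimal sketch of this kind of metric-free statistic: the V gap of Ho and Reardon (2012), which depends only on the probability that a randomly chosen score from one group exceeds a randomly chosen score from the other, and is therefore unchanged by any monotone rescaling of the test. The function name and toy data below are illustrative, not part of the project's software.

```python
import numpy as np
from scipy.stats import norm

def metric_free_gap(a, b):
    """V gap between score samples a and b (hypothetical helper).

    V = sqrt(2) * probit(P(A > B)), where P(A > B) is the probability
    that a random draw from group a exceeds a random draw from group b
    (ties split evenly). Under normality with equal variances, V equals
    the usual standardized mean difference (Cohen's d).
    """
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    diff = a[:, None] - b[None, :]            # all pairwise comparisons
    p = np.mean(diff > 0) + 0.5 * np.mean(diff == 0)
    return np.sqrt(2.0) * norm.ppf(p)

rng = np.random.default_rng(0)
x = rng.normal(0.5, 1.0, size=2000)           # group a
y = rng.normal(0.0, 1.0, size=2000)           # group b
print(metric_free_gap(x, y))                  # close to 0.5
# A monotone rescaling of the score scale leaves V unchanged:
print(metric_free_gap(np.exp(x), np.exp(y)))  # identical value
```

Because V is built from pairwise orderings alone, exponentiating every score (or applying any other monotone rescaling) leaves it untouched, whereas a conventional mean difference or effect size would change.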

People and institutions involved

IES program contact(s)

Allen Ruby

Associate Commissioner for Policy and Systems
NCER

Products and publications

Journal article, monograph, or newsletter

Fahle, E.M., and Reardon, S.F. (2018). How Much Do Test Scores Vary Among School Districts? New Estimates Using Population Data, 2009-2015. Educational Researcher, 0013189X18759524.

Ho, A.D., and Reardon, S.F. (2012). Estimating Achievement Gaps From Test Scores Reported in Ordinal "Proficiency" Categories. Journal of Educational and Behavioral Statistics, 37(4): 489-517.

Ho, A.D., and Yu, C.C. (2015). Descriptive Statistics for Modern Test Score Distributions: Skewness, Kurtosis, Discreteness, and Ceiling Effects. Educational and Psychological Measurement, 75(3): 365-388.

Reardon, S.F. (2016). School Segregation and Racial Academic Achievement Gaps. RSF: The Russell Sage Foundation Journal of the Social Sciences, 2(5): 34-57.

Reardon, S.F. (2019). Educational Opportunity in Early and Middle Childhood: Using Full Population Administrative Data to Study Variation by Place and Age. RSF: The Russell Sage Foundation Journal of the Social Sciences, 5(2): 40-68.

Reardon, S.F., Fahle, E.M., Kalogrides, D., Podolsky, A., and Zárate, R.C. (2019). Gender Achievement Gaps in U.S. School Districts. American Educational Research Journal, 56(6): 2474-2508.

Reardon, S.F., and Ho, A.D. (2015). Practical Issues in Estimating Achievement Gaps From Coarsened Data. Journal of Educational and Behavioral Statistics, 40(2): 158-189.

Reardon, S.F., Kalogrides, D., Fahle, E.M., Podolsky, A., and Zárate, R.C. (2018). The Relationship Between Test Item Format and Gender Achievement Gaps on Math and ELA Tests in Fourth and Eighth Grades. Educational Researcher, 0013189X18762105.

Reardon, S.F., Kalogrides, D., and Ho, A.D. (2021). Validation Methods for Aggregate-Level Test Scale Linking: A Case Study Mapping School District Test Score Distributions to a Common Scale. Journal of Educational and Behavioral Statistics, 46(2): 138-167.

Reardon, S.F., Kalogrides, D., and Shores, K. (2019). The Geography of Racial/Ethnic Test Score Gaps. American Journal of Sociology, 124(4): 1164-1221.

Reardon, S.F., Shear, B.R., Castellano, K.E., and Ho, A.D. (2017). Using Heteroskedastic Ordered Probit Models to Recover Moments of Continuous Test Score Distributions From Coarsened Data. Journal of Educational and Behavioral Statistics, 42(1): 3-45.

Yee, D., and Ho, A. (2015). Discreteness Causes Bias in Percentage-Based Comparisons: A Case Study From Educational Testing. The American Statistician, 69(3): 174-181.

Related projects

The Effects of Racial School Segregation on the Black-White Achievement Gap (R305A070377)

Supplemental information

Co-principal investigator: Andrew Ho

Achievement gaps are used to measure differences in group achievement as well as inequities in educational advantages and opportunities to learn. However, estimated achievement gaps may be sensitive both to the choice of test metric and to the choice of gap statistic. In addition, estimating achievement gaps may not be possible in states and districts that report only proficiency levels. Without access to basic distributional statistics for test scores, research linking changes in state, district, or school policies and practices to levels of and changes in achievement gaps may be compromised.

The project developed and tested a set of methods for estimating these metric-free achievement gap measures from widely available categorical proficiency data, including a strategy for computing standard errors for the resulting nonparametric gap estimates. Simulated data and data from national and state assessments were used to identify the methods that minimize bias and maximize efficiency across a range of idealized and real-world scenarios, and to verify that the methods produce unbiased gap estimates and accurate standard errors. The overall result was a set of practical guidelines for measuring achievement gaps from categorical proficiency data, along with free software that enables researchers and stakeholders to estimate these gaps.
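
To make the coarsened-data problem concrete, the sketch below shows one simple estimator of this kind, in the spirit of Reardon and Ho (2015): if scores are assumed normal within each group ("respective normality"), the probit-transformed cumulative proportions of the two groups below the shared cut scores are linearly related, and a line fit to those points recovers a standardized gap, which under this assumption coincides with the V statistic above. The function name, the least-squares line fit, and the toy counts are illustrative assumptions, not the project's released software.

```python
import numpy as np
from scipy.stats import norm

def gap_from_proficiency(counts_a, counts_b):
    """Estimate a standardized gap from ordered proficiency counts
    (hypothetical helper; a sketch, not the project's software).

    Assumes scores are normal within each group. Then the probit of
    each group's cumulative proportion below cut score c_k equals
    (c_k - mu_g) / sigma_g, so the groups' probits are linearly
    related across the shared cuts.
    """
    counts_a = np.asarray(counts_a, dtype=float)
    counts_b = np.asarray(counts_b, dtype=float)
    za = norm.ppf(np.cumsum(counts_a)[:-1] / counts_a.sum())
    zb = norm.ppf(np.cumsum(counts_b)[:-1] / counts_b.sum())
    # za = r * zb + t, with r = sigma_b / sigma_a and
    # t = (mu_b - mu_a) / sigma_a; fit the line by least squares.
    r, t = np.polyfit(zb, za, 1)
    # Gap in pooled-SD units: (mu_a - mu_b) / sqrt((s_a^2 + s_b^2) / 2).
    return -t / np.sqrt((1.0 + r**2) / 2.0)

# Toy data: group a sits 0.5 SD above group b; cuts at -1, 0, and 1
# define four categories (below basic, basic, proficient, advanced).
cuts = np.array([-1.0, 0.0, 1.0])
pa = np.diff(np.concatenate(([0.0], norm.cdf(cuts - 0.5), [1.0])))
pb = np.diff(np.concatenate(([0.0], norm.cdf(cuts), [1.0])))
print(gap_from_proficiency(10000 * pa, 10000 * pb))  # recovers 0.5
```

The toy example recovers the 0.5 SD gap exactly because the simulated counts satisfy the normality assumption; with real proficiency counts the line fit is only approximate, which is why the project compared estimators for bias and efficiency across realistic scenarios.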

Questions about this project?

For additional questions about this project, or to provide feedback, please contact the program officer.

Tags

Mathematics, Data and Assessments, Policies and Standards


