
Grant Closed

Addressing Practical Problems in Achievement Gap Estimation: Nonparametric Methods for Censored Data

NCER
Program: Statistical and Research Methodology in Education
Program topic(s): Core
Award amount: $697,878
Principal investigator: Sean Reardon
Awardee:
Stanford University
Year: 2011
Award period: 2 years (04/01/2012 - 03/31/2014)
Project type:
Methodological Innovation
Award number: R305D110018

Purpose

The project developed and assessed a set of methods for estimating achievement gaps and their standard errors, addressing two practical problems that affect the use of achievement gap statistics: (1) the dependence of traditional gap statistics, such as mean differences and effect sizes, on test scaling decisions, and (2) the reporting of achievement data in categorical terms (e.g., below basic, basic, proficient, and advanced) rather than as test scores and their distributional statistics (e.g., means and standard deviations).

Achievement gaps are used to measure differences in group achievement as well as inequities in educational advantages and opportunities to learn. However, estimated achievement gaps may be sensitive to the choice of test metric and the choice of an achievement gap statistic. In addition, estimating achievement gaps may not be possible in those states and districts that only report proficiency levels. Without access to basic distributional statistics for test scores, research linking changes in state, district, or school policies and practices to levels and changes in achievement gaps may be compromised.

Project Activities

The project built on a recently developed nonparametric approach (sometimes called a "metric-free" approach) to measuring achievement gaps. This approach rests on the transformation-invariant properties of nonparametric statistics and offers a range of theoretical benefits over traditional approaches, but it requires information about the shape of the test score distribution in each group (not merely the means and standard deviations). Moreover, methods are required for estimating the standard errors of these metric-free gap measures.
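As an illustrative sketch (not the project's own code), transformation invariance can be seen by comparing a rank-based gap measure, the probability that a randomly chosen score from group A exceeds one from group B, with the standardized mean difference. Under any strictly increasing rescaling of the test metric the rank-based measure is unchanged, while the effect size shifts. The function names below are hypothetical:

```python
import numpy as np

def cohens_d(x, y):
    """Standardized mean difference (pooled-SD effect size)."""
    return (x.mean() - y.mean()) / np.sqrt((x.var() + y.var()) / 2)

def p_above(x, y):
    """Pr(a random score from x exceeds a random score from y): rank-based."""
    y_sorted = np.sort(y)
    below = np.searchsorted(y_sorted, x, side="left")  # count of y below each x
    return below.sum() / (len(x) * len(y))

rng = np.random.default_rng(0)
a = rng.normal(0.5, 1.0, 20_000)   # group A scores on the original scale
b = rng.normal(0.0, 1.0, 20_000)   # group B scores

# Apply a strictly increasing (monotone) rescaling of the test metric
a2, b2 = np.exp(a), np.exp(b)

d_change = abs(cohens_d(a2, b2) - cohens_d(a, b))   # effect size changes
p_change = abs(p_above(a2, b2) - p_above(a, b))     # rank-based gap does not
```

Because a monotone rescaling preserves the ordering of every pair of scores, `p_above` depends only on ranks and is identical on both scales, whereas the effect size depends on the (arbitrary) metric.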

The project developed and tested a set of methods for estimating these metric-free achievement gap measures from widely available categorical proficiency data. Both simulated and real datasets were used to identify the methods that minimize bias and maximize efficiency across a range of idealized and real-world scenarios. This work included a strategy for computing standard errors for these nonparametric achievement gap estimates. Simulated data and data from national and state assessments were then used to assess whether the different nonparametric methods produce unbiased estimates of achievement gaps and accurate standard errors. The overall result was a set of practical guidelines for measuring achievement gaps from categorical proficiency data, along with free software that enables researchers and stakeholders to estimate these gaps.
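A minimal sketch of one such method, in the spirit of Ho and Reardon (2012): if each group's latent score distribution is assumed normal ("respective normality"), the cumulative proportions of each group at or below the shared proficiency cutscores can be probit-transformed and fit with a line, whose intercept and slope recover the gap V in pooled standard deviation units. The function below is an illustration under that assumption, not the project's released software:

```python
import numpy as np
from scipy.stats import norm

def v_gap_from_proportions(p_a, p_b):
    """Estimate the metric-free gap V from the cumulative proportions of two
    groups at or below each of K shared cutscores, assuming each group's
    latent score distribution is normal ("respective normality").
    Returns (v, intercept, slope)."""
    z_a = norm.ppf(np.asarray(p_a, dtype=float))  # probit-transformed proportions
    z_b = norm.ppf(np.asarray(p_b, dtype=float))
    slope, intercept = np.polyfit(z_a, z_b, 1)    # fit z_b = intercept + slope*z_a
    # Under respective normality, intercept = (mu_a - mu_b)/sigma_b and
    # slope = sigma_a/sigma_b, so the gap in pooled-SD units is:
    v = intercept / np.sqrt((1.0 + slope**2) / 2.0)
    return v, intercept, slope

# Sanity check on exact proportions from two known normal distributions
mu_a, s_a, mu_b, s_b = 0.5, 1.1, 0.0, 0.9
cuts = np.array([-1.0, 0.0, 1.0])                 # three hypothetical cutscores
v_hat, _, _ = v_gap_from_proportions(norm.cdf((cuts - mu_a) / s_a),
                                     norm.cdf((cuts - mu_b) / s_b))
v_true = (mu_a - mu_b) / np.sqrt((s_a**2 + s_b**2) / 2)
```

With exact proportions the probit-transformed points are exactly collinear, so the fit recovers the true gap; with real coarsened data, sampling error in the proportions propagates into V, which is why the project's work on standard errors matters.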

People and institutions involved

IES program contact(s)

Allen Ruby

Project contributors

Andrew Ho

Co-principal investigator

Products and publications

Journal article, monograph, or newsletter

Fahle, E.M., and Reardon, S.F. (2018). How Much Do Test Scores Vary Among School Districts? New Estimates Using Population Data, 2009-2015. Educational Researcher, 0013189X18759524.

Ho, A.D., and Reardon, S.F. (2012). Estimating Achievement Gaps From Test Scores Reported in Ordinal "Proficiency" Categories. Journal of Educational and Behavioral Statistics, 37(4): 489-517.

Ho, A.D., and Yu, C.C. (2015). Descriptive Statistics for Modern Test Score Distributions: Skewness, Kurtosis, Discreteness, and Ceiling Effects. Educational and Psychological Measurement, 75(3), 365-388.

Reardon, S. F. (2019). Educational opportunity in early and middle childhood: Using full population administrative data to study variation by place and age. RSF: The Russell Sage Foundation Journal of the Social Sciences, 5(2), 40-68.

Reardon, S.F. (2016). School Segregation and Racial Academic Achievement Gaps. RSF: The Russell Sage Foundation Journal of the Social Sciences, 2(5): 34-57.

Reardon, S.F., and Ho, A.D. (2015). Practical Issues in Estimating Achievement Gaps From Coarsened Data. Journal of Educational and Behavioral Statistics, 40(2), 158-189.

Reardon, S. F., Fahle, E. M., Kalogrides, D., Podolsky, A., & Zárate, R. C. (2019). Gender achievement gaps in US school districts. American Educational Research Journal, 56(6), 2474-2508.

Reardon, S. F., Kalogrides, D., & Ho, A. D. (2021). Validation methods for aggregate-level test scale linking: A case study mapping school district test score distributions to a common scale. Journal of Educational and Behavioral Statistics, 46(2), 138-167.

Reardon, S. F., Kalogrides, D., & Shores, K. (2019). The geography of racial/ethnic test score gaps. American Journal of Sociology, 124(4), 1164-1221.

Reardon, S.F., Kalogrides, D., Fahle, E. M., Podolsky, A., and Zárate, R.C. (2018). The Relationship Between Test Item Format and Gender Achievement Gaps on Math and ELA Tests in Fourth and Eighth Grades. Educational Researcher, 0013189X18762105.

Reardon, S.F., Shear, B.R., Castellano, K.E., and Ho, A.D. (2017). Using Heteroskedastic Ordered Probit Models to Recover Moments of Continuous Test Score Distributions From Coarsened Data. Journal of Educational and Behavioral Statistics, 42(1), 3-45.

Yee, D., and Ho, A. (2015). Discreteness Causes Bias in Percentage-Based Comparisons: A Case Study From Educational Testing. The American Statistician, 69(3), 174-181.

Related projects

The Effects of Racial School Segregation on the Black-White Achievement Gap

R305A070377

Questions about this project?

To answer additional questions about this project or provide feedback, please contact the program officer.

 

Tags

Data and Assessments, Mathematics, Policies and Standards


