Information on IES-Funded Research
Grant Closed

Developing Reading Comprehension Assessments Targeting Struggling Readers

NCER
Program: Education Research Grants
Program topic(s): Literacy
Award amount: $1,572,635
Principal investigator: John Sabatini
Awardee: Educational Testing Service (ETS)
Year: 2004
Project type: Measurement
Award number: R305G040065

Purpose

In the early 2000s, an estimated 26 percent of the nation's 4.5 million 8th graders read below the basic level, and roughly 40-44 million American adults had minimal or no proficiency in prose literacy. The project team addressed the pressing need for comprehension assessments for struggling adolescent and adult readers by developing a series of diagnostic assessments designed first to identify struggling readers and then to pinpoint the specific skills, or combinations of skills, those readers lack. Such a tool enables teachers to target instruction to the right skills for each learner. The specific aims of the project were to (a) identify and quantify sources of difficulty for readers in texts, (b) develop assessments that can be used in education delivery settings to identify sources of reading comprehension difficulty, (c) identify and empirically examine which subsets of comprehension skills are linked to difficulties in understanding printed materials for individual struggling readers, and (d) determine how best to use this information to guide instruction.

Project Activities

The researchers applied natural language processing techniques to identify sources of text and task difficulty and used evidence-centered design modeling of multiple dimensions of student proficiencies. Together these techniques formed the foundation for developing and testing automated or assisted text and task generation and scoring tools and statistical profile and growth modeling. The researchers then applied and investigated technology-enhanced tools for administration, collection, and scoring of evidence.
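Identifying sources of text difficulty with NLP typically begins with surface features such as sentence length and vocabulary diversity. As a minimal, purely illustrative sketch (the function below is hypothetical and is not one of the project's or ETS's actual NLP tools, which modeled difficulty far more richly):

```python
import re

def text_difficulty_features(text):
    """Compute simple surface-level difficulty features for a passage.

    Illustrative only: real text-difficulty models also account for word
    frequency, syntax, cohesion, and discourse structure.
    """
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    if not words or not sentences:
        return {}
    # Longer sentences and longer words tend to signal harder text.
    avg_sentence_len = len(words) / len(sentences)
    avg_word_len = sum(len(w) for w in words) / len(words)
    # Higher type-token ratio = more varied vocabulary.
    type_token_ratio = len({w.lower() for w in words}) / len(words)
    return {
        "avg_sentence_length": round(avg_sentence_len, 2),
        "avg_word_length": round(avg_word_len, 2),
        "type_token_ratio": round(type_token_ratio, 2),
    }

print(text_difficulty_features(
    "The cat sat. The photosynthetic apparatus converts electromagnetic radiation."
))
```

Features like these can then feed a statistical model that relates text properties to observed reader performance, in the spirit of the evidence-centered design approach described above.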

Structured Abstract

Sample

The project study participants were readers spanning the developmental levels of high school and adult literacy students and included a range of skilled readers for comparison purposes. The participating schools had diverse, urban student populations with high percentages of struggling readers. The researchers regularly engaged and convened teachers to review, examine, and provide guidance concerning all phases of the assessment delivery system.

Research design and methods

The research plan proceeded in three phases. In Phase I, the researchers applied natural language processing (NLP) techniques to identify sources of difficulty in a wide range of texts from existing corpora. They modeled the interactions of struggling-reader profiles and developed task models using evidence-centered design (ECD) analysis techniques. Finally, they developed assessment tools and piloted them to assess their viability. These activities continued iteratively throughout the project. In Phase II, the researchers combined the tested task models and administered them to a larger sample of learners in a cross-sectional design to examine relationships between the components and profiles of learners. They used these findings to test and adjust the ECD model. In Phase III, they conducted a longitudinal, repeated-measures study to examine how well the NLP tools predict text and task difficulty and how well the assessments accumulate individual profile information over time.

People and institutions involved

IES program contact(s)

Elizabeth Albro

Commissioner of Education Research
NCER

Products and publications

ERIC Citations: Available citations for this award can be found in ERIC.

Select Publications:

Book chapters

Mislevy, R.J., and Sabatini, J. (2012). How Research on Reading and Research on Assessment Are Transforming Reading Assessment (or if They Aren't, How They Ought To). In J.P. Sabatini, E.R. Albro, and T. O'Reilly (Eds.), Measuring Up: Advances in How We Assess Reading Ability (pp. 119-134). Lanham, MD: Rowman and Littlefield.

Sabatini, J.P. (2009). From Health/Medical Analogies to Helping Struggling Middle School Readers: Issues in Applying Research to Practice. In S. Rosenfield, and V. Berninger (Eds.), Translating Science-Supported Instruction into Evidence-Based Practices: Understanding and Applying the Implementation Process (pp. 285-316). New York: Oxford University Press.

Journal articles

Deane, P., Sheehan, K.M., Sabatini, J., Futagi, Y., and Kostin, I. (2006). Differences in Text Structure and Its Implications for Assessment of Struggling Readers. Scientific Studies of Reading, 10(3): 257-275.

O'Reilly, T., Sabatini, J., Bruce, K., Pillarisetti, S., and McCormick, C. (2012). Middle School Reading Assessment: Measuring What Matters Under an RTI Framework. Reading Psychology Special Issue: Response to Intervention, 33(1): 162-189.

Nongovernment peer-reviewed report

Sabatini, J.P., Bruce, K., and Steinberg, J. (2013). SARA Reading Components Tests, RISE Form: Test Design and Technical Adequacy. Princeton, NJ: Educational Testing Service.

Related projects

Assessing Reading for Understanding: A Theory-based, Developmental Approach

R305F100005

What Types of Knowledge Matters for What Types of Comprehension? Exploring the Role of Background Knowledge on Students' Ability to Learn from Multiple Texts

R305A150176

Exploring the onPAR Model in Developmental Literacy Education

R305A150193

Developing and Implementing a Technology-Based Reading Comprehension Instruction System for Adult Literacy Students

R305A200413

Questions about this project?

To answer additional questions about this project or provide feedback, please contact the program officer.


Tags

Postsecondary Education, Education Technology, Data and Assessments


