Information on IES-Funded Research
Grant Closed

Developing a Formative Assessment of Academic Reading Comprehension for English Language Learners: A Tool to Improve Teaching and Learning

NCER
Program: Education Research Grants
Program topic(s): English Learners Policies, Programs, and Practices
Award amount: $1,349,291
Principal investigator: Mikyung Wolf
Awardee: Educational Testing Service (ETS)
Year: 2010
Award period: 2 years (07/01/2013 - 06/30/2015)
Project type: Measurement
Award number: R305A100724

Purpose

While states annually administer English language proficiency assessments to measure English language learners' (ELLs') progress in English language development, there is a paucity of appropriate classroom-based assessments available to inform teaching and learning on an ongoing basis. ELL students face the dual challenge of acquiring the English proficiency needed to handle academic materials while also learning curriculum content. As children progress through the grade levels, course materials become increasingly complex. This project will develop and validate classroom-based formative assessments of academic reading for ELL students in middle school.

Project Activities

Assessment development will begin by convening experts to define academic reading comprehension for ELLs, incorporating reading skills, cognitive strategies, and academic vocabulary knowledge. The experts will review relevant curricular materials, state reading standards, English language proficiency standards, and the common core standards in English language arts to define a conceptual framework for the assessments. Items for nine forms of the assessment will then be developed and field-tested based on this framework. Web-based reports of assessment results will be designed, and a manual for teachers on how to use the assessments to guide instruction will be prepared. Scoring rubrics for open-ended items will be designed and field-tested as well.

Structured Abstract

Setting

The study will be carried out in 20 eighth-grade English as a Second Language (ESL) classrooms, composed entirely of ELL students, in urban districts in Nevada and New Jersey.

Sample

Study participants will include approximately 400 eighth-grade students in intervention classes for ELL students. The majority of students will be Spanish-speaking.

Intervention

The assessment will be developed drawing on strategies that have emerged from research on adolescent students' reading development, including that of ELLs. These strategies include: (1) the use of formative assessment to allow teachers to adapt their instruction to students' needs; (2) the use of an academic language framework to help students understand complex secondary-level texts; (3) the use of explicit, direct instruction in reading comprehension strategies; and (4) the use of cooperative learning to raise students' motivation and engagement for reading. The assessment will consist of three sets, each containing three mini-assessments. In each set, the first two mini-assessments will be completed collaboratively, with students working in pairs; the third will be completed individually so that each student's progress can be measured. A web-based program to assist teachers in organizing assessment results and keeping track of student work will also be developed.
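To make the three-set structure concrete, the sketch below lays it out as a small data structure. It is purely illustrative: the class name, field names, and labels are assumptions made for this example, not part of the project's materials.

```python
# Hypothetical outline of the assessment structure described above:
# three sets, each containing three mini-assessments; the first two in each
# set are completed in pairs, the third individually to measure each
# student's own progress.
from dataclasses import dataclass

@dataclass
class MiniAssessment:
    set_number: int   # which of the three sets (1-3)
    position: int     # position within the set (1-3)
    mode: str         # "paired" for the first two, "individual" for the third

mini_assessments = [
    MiniAssessment(set_number=s, position=p,
                   mode="individual" if p == 3 else "paired")
    for s in range(1, 4)
    for p in range(1, 4)
]

for m in mini_assessments:
    print(m)
```

Laid out this way, the structure yields nine mini-assessments in total, which is consistent with the nine forms mentioned elsewhere in this abstract, though that mapping is an inference rather than something stated in the project description.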

Research design and methods

An assessment framework will be developed through literature review, expert input, and review of promising existing frameworks, and will guide development of assessments of five subskills: general understanding, making connections, extracting content, comprehension strategies, and academic vocabulary. The framework will inform item specifications detailing the item format and text selection used to measure each subskill. Scoring rubrics will be developed and validated for open-ended items. Items will be tested in multiple cycles of field-testing and revision. The nine forms will then be finalized and pilot tested. A teacher working group will inform development of a professional development system and teacher manual to accompany the assessments. The validity of the assessments for improving instruction will be evaluated by comparing results with student performance on standardized reading achievement tests.
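One way to picture item specifications derived from such a framework is as a table mapping each subskill to an item format and a text selection. The sketch below is hypothetical; every format and text type shown is a placeholder for illustration, not the project's actual specification.

```python
# Hypothetical item-specification table for the five subskills named above.
# All formats and text selections are placeholders for illustration only.
item_specifications = {
    "general understanding":    {"item_format": "multiple choice", "text_selection": "full passage"},
    "making connections":       {"item_format": "multiple choice", "text_selection": "paired paragraphs"},
    "extracting content":       {"item_format": "short answer",    "text_selection": "expository excerpt"},
    "comprehension strategies": {"item_format": "open-ended",      "text_selection": "full passage"},
    "academic vocabulary":      {"item_format": "multiple choice", "text_selection": "target sentences"},
}

for subskill, spec in item_specifications.items():
    print(f"{subskill}: {spec['item_format']} items based on {spec['text_selection']}")
```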

Control condition

There is no control condition.

Key measures

Nine forms of the assessment will be created for use as a formative tool in the classroom. District benchmark and state reading assessments will also be used to validate the utility of the formative assessments.

Data analytic strategy

Classical item statistics (including item difficulty and discrimination) as well as Item Response Theory (IRT)-based item statistics will be used to evaluate and improve items. Internal reliability will be calculated using Cronbach's alpha. For open-ended items, inter-rater reliability will be estimated through percent agreement, Cohen's kappa, and correlation coefficients. To develop a common scale across all forms of the assessment, item parameters will be calibrated and compared using an IRT model for items scored right/wrong and a generalized partial credit model for items in which students write a response. The reliability of each sub-score within each assessment will be calculated. Confirmatory factor analysis will be used to examine whether items relate to each other in the expected manner. Regression analysis will be conducted to examine the extent to which students' performance on the developed assessments is associated with their performance on external measures of reading.
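Two of the reliability statistics named above, Cronbach's alpha for internal consistency and percent agreement with Cohen's kappa for inter-rater reliability on open-ended items, can be illustrated with a brief sketch. The data below are randomly generated stand-ins for field-test scores; the sample sizes, the 20-item form, and the 0-3 rubric are all assumptions made for the example.

```python
# Illustrative reliability checks on hypothetical data (not project data).
import numpy as np
from sklearn.metrics import cohen_kappa_score

def cronbach_alpha(item_scores: np.ndarray) -> float:
    """Internal consistency for one form; rows are students, columns are items."""
    k = item_scores.shape[1]
    item_vars = item_scores.var(axis=0, ddof=1)
    total_var = item_scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

rng = np.random.default_rng(0)

# Hypothetical right/wrong scores for 400 students on a 20-item form.
# With purely random data alpha will be near zero; real field-test
# responses would be correlated across items.
scores = (rng.random((400, 20)) > 0.4).astype(int)
print(f"Cronbach's alpha: {cronbach_alpha(scores):.2f}")

# Hypothetical ratings from two raters scoring 100 open-ended responses on a 0-3 rubric.
rater_a = rng.integers(0, 4, size=100)
rater_b = np.clip(rater_a + rng.integers(-1, 2, size=100), 0, 3)
print(f"Percent agreement: {(rater_a == rater_b).mean():.2f}")
print(f"Cohen's kappa: {cohen_kappa_score(rater_a, rater_b):.2f}")
```

The IRT calibration and confirmatory factor analysis steps would require dedicated psychometric software and are not sketched here.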

People and institutions involved

IES program contact(s)

Elizabeth Albro
Commissioner of Education Research, NCER

Products and publications

Products: A reliable and valid formative assessment tool consisting of nine forms and designed specifically for use with ELLs will be developed. Use of the assessments will be supported by a web-based score reporting tool and applications that support teachers in using the assessments to guide instruction.

Book chapter

Shore, J. R., Wolf, M.K., O'Reilly, T., and Sabatini, J. (2017). Measuring 21st Century Reading Comprehension Through Scenario-Based Assessments. In M. K. Wolf and Y.G. Butler (Eds.), English Language Proficiency Assessments for Young Learners (pp. 234-252). New York, NY: Routledge.

Journal article, monograph, or newsletter

Heritage, M., and Chang, S. (2012). Teacher Use of Formative Assessment Data for English Language Learners. UCLA/National Center for Research on Evaluation, Standards, and Student Testing (CRESST): 1-12.

Shore, J. R., Wolf, M. K., and Heritage, M. (2016). A Case Study of Formative Assessment to Support Teaching of Reading Comprehension for English Learners. Journal of Educational Research & Innovation, 5(2): 1-19.

Nongovernment report, issue brief, or practice guide

Shore, J. R., Wolf, M. K., and Blood, I. (2013). English Learner Formative Assessment (ELFA): ELFA Teacher's Guide. Princeton, NJ: Educational Testing Service.

Wolf, M. K., Shore, J. R., and Blood, I. (2014). English Learner Formative Assessment (ELFA): A Design Framework. Princeton, NJ: Educational Testing Service.

Project website:

https://www.ets.org/research/topics/ella/elfa

Questions about this project?

If you have additional questions about this project or would like to provide feedback, please contact the program officer.

Tags

Cognition, Data and Assessments, Language, Policies and Standards
