
IES Grant

Title: Learning About Open Response Science Test Items and English Learners
Center: NCER Year: 2015
Principal Investigator: Noble, Tracy Awardee: Technical Education Research Centers, Inc. (TERC)
Program: Policies, Practices, and Programs to Support English Learners
Award Period: 4 years (8/1/2015–7/31/2019) Award Amount: $1,596,743
Type: Exploration Award Number: R305A150218
Description:

Co-Principal Investigator: Ann Rosebery

Purpose: This project will investigate sources of difficulty for English Language Learners (ELLs) that inhibit their capacity to demonstrate their knowledge of STEM content when assessed with items that require an extended written open response (in contrast to multiple-choice items). Open-response items may unduly challenge ELLs because of factors such as the language demands of writing and the extra time ELLs may need to compose a response. ELLs frequently perform particularly poorly on open-response items, and this lower performance contributes substantially to the achievement gap observed between ELLs and non-ELLs.

Project Activities: Researchers will conduct a series of studies to develop a theoretical framework that explains the relationships between the reading, writing, and STEM demands of open-response items for fifth-grade ELLs and non-ELLs. Using data collected through the Massachusetts Comprehensive Assessment System (MCAS), researchers will also explore how students' skills in reading, writing, and English language proficiency, as well as their performance on multiple-choice STEM items, are related to their performance on open-response items. Results will inform both instruction and assessment practices for ELLs. Researchers on this project previously conducted a similar NCER-funded study investigating sources of difficulty for ELLs on multiple-choice science assessment items.

Products: The products of this project include preliminary evidence of potentially promising practices for assessing STEM knowledge of ELLs and peer-reviewed publications.

Structured Abstract

Setting: The project will take place in 15 elementary schools in 4 urban school districts in Massachusetts.

Sample: Quantitative studies will include responses from approximately 71,000 non-ELLs and 4,000 ELLs per year, over 14 years of the fifth-grade MCAS assessments. Qualitative studies will include 120 ELLs and 20 non-ELLs selected to reflect the diverse ethnic, linguistic, and socioeconomic communities in urban districts in Massachusetts.

Intervention: This project will examine characteristics of both open-response items and students that affect the valid measurement of STEM content knowledge for ELLs. The theoretical framework developed to describe these factors will inform both instruction and assessment practices for ELLs.

Research Design and Methods: Researchers will conduct a series of five linked studies in this grant. Study 1 is an exploratory study of the relationships between ELL and non-ELL students' test scores in English Language Arts and on both the multiple-choice and open-response items of STEM assessments; analyses will predict student performance on open-response items from English language proficiency, reading, writing, and STEM multiple-choice scores. Study 2 will develop hypotheses about the relationships among the reading, writing, and STEM demands of assessment items, based on findings from Study 1 and a literature review; researchers will develop a rubric for coding the reading, writing, and STEM demands of open-response items. Study 3 will test the hypotheses developed in Study 2, first by conducting bilingual cognitive labs to elicit students' thinking as they respond to open-response items; quantitative data will then be analyzed to identify items that contribute to differential performance among students with similar STEM knowledge. Study 4 will explore how student-level and item-level factors interact to produce differential performance of ELLs and non-ELLs on open-response items; researchers will also reanalyze the cognitive lab data to expand these findings. Study 5 will extend these findings by exploring the prevalence of features that affect ELL performance in open-response items on other states' assessments, providing support for the generalizability of the theoretical framework.
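The Study 1 analysis described above can be illustrated with a minimal regression sketch. The data below are entirely synthetic and the variable names (reading, writing, proficiency, mc_stem) are illustrative assumptions, not the project's actual MCAS variables; the point is only the form of the model, which predicts open-response scores from the other measures.

```python
# Sketch of a Study-1-style analysis: predict open-response item scores
# from reading, writing, English proficiency, and multiple-choice STEM
# scores via ordinary least squares. All data are synthetic; variable
# names are hypothetical placeholders for the project's real measures.
import numpy as np

rng = np.random.default_rng(1)
n = 500

# Hypothetical standardized predictor scores
reading = rng.normal(size=n)
writing = rng.normal(size=n)
proficiency = rng.normal(size=n)
mc_stem = rng.normal(size=n)

# Synthetic outcome: driven mostly by STEM knowledge, with an added
# writing-demand component (the kind of effect the project hypothesizes)
open_response = (0.5 * mc_stem + 0.3 * writing + 0.1 * reading
                 + rng.normal(scale=0.5, size=n))

X = np.column_stack([np.ones(n), reading, writing, proficiency, mc_stem])
coefs, *_ = np.linalg.lstsq(X, open_response, rcond=None)
for name, b in zip(["intercept", "reading", "writing", "proficiency",
                    "mc_stem"], coefs):
    print(f"{name}: {b:+.2f}")
```

In a real analysis the residual variance in open-response scores, after controlling for multiple-choice STEM performance, is where item-level language demands would show up.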

Control Condition: Due to the nature of the research design, there is no control condition.

Key Measures: Student demographic data; Massachusetts Comprehensive Assessment System assessments in English language arts and STEM; English language proficiency.

Data Analytic Strategy: Quantitative studies will use multiple regression, multi-level modeling, and several approaches to estimating differential item functioning. Qualitative studies will use thematic and discourse analysis, interaction analysis, and grounded theory to identify and describe patterns across items and students.
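One of the DIF approaches named above can be sketched with the Mantel-Haenszel common odds ratio, a standard method for flagging items that behave differently for two groups matched on ability. Everything below is synthetic and schematic; the group labels and stratification choice are illustrative assumptions, not the project's actual procedure.

```python
# Minimal Mantel-Haenszel DIF sketch on synthetic data. Students are
# stratified by an ability proxy; within each stratum we tabulate a 2x2
# table of group membership by item correctness and pool the odds ratios.
import numpy as np

rng = np.random.default_rng(0)

def mantel_haenszel_odds_ratio(correct, group, strata):
    """Pooled odds ratio across score strata.

    correct: 1 if the student answered the studied item correctly
    group:   1 for the reference group, 0 for the focal group (e.g. ELLs)
    strata:  matching variable, e.g. total score on the remaining items
    """
    num = den = 0.0
    for s in np.unique(strata):
        m = strata == s
        a = np.sum((group[m] == 1) & (correct[m] == 1))  # ref correct
        b = np.sum((group[m] == 1) & (correct[m] == 0))  # ref incorrect
        c = np.sum((group[m] == 0) & (correct[m] == 1))  # focal correct
        d = np.sum((group[m] == 0) & (correct[m] == 0))  # focal incorrect
        t = a + b + c + d
        if t == 0:
            continue
        num += a * d / t
        den += b * c / t
    return num / den

# Synthetic example: the item is harder for the focal group at every
# ability level, so the pooled odds ratio should exceed 1.
n = 2000
group = rng.integers(0, 2, n)
ability = rng.normal(0, 1, n)
strata = np.digitize(ability, [-1.0, 0.0, 1.0])      # four ability strata
p = 1 / (1 + np.exp(-(ability + 0.8 * group)))       # DIF favoring reference
correct = (rng.random(n) < p).astype(int)

mh = mantel_haenszel_odds_ratio(correct, group, strata)
print(round(mh, 2))
```

An odds ratio near 1 suggests no DIF after matching on ability; values well above or below 1 flag the item for the kind of qualitative follow-up the cognitive labs provide.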
