
IES Grant

Title: Feedback-and-Revision on Alternate Assessment based on Modified Achievement Standards in Mathematics
Center: NCSER
Year: 2010
Principal Investigator: Laitusis, Cara Cahalan
Awardee: Educational Testing Service (ETS)
Program: Systems, Policy, and Finance
Award Period: 06/01/2010 – 05/31/2014
Award Amount: $1,284,995
Type: Measurement
Award Number: R324A100065
Description:

Purpose: Alternate assessments based on modified academic achievement standards are designed to be challenging for eligible students and to measure mastery of grade-level content, while being less difficult than the regular grade-level assessments. Many alternate assessments have been created by modifying existing grade-level assessments, with modifications that include simplifying the language, decomposing subskills, and eliminating a distractor from multiple-choice items. However, empirical research supporting these test changes is scarce. This research team will develop a new test administration format for alternate assessments based on modified achievement standards in mathematics that allows students to receive immediate feedback on their answers to test items and to revise their answers for partial credit. The research team will examine whether providing students with disabilities immediate feedback and the opportunity to revise their responses to test items will improve the psychometric quality of the assessment.

Project Activities: Through a series of cognitive interview studies, the research team will first examine how students respond to different feedback-and-revision formats using a variety of item types and item formats. Think-aloud protocols will be used to assess the frustration and motivation levels of students with and without disabilities under different feedback-and-revision, item type, and item format conditions. The feedback-and-revision format found to be most helpful to students with disabilities will then be included in a large experimental study examining the relationship between the provision of feedback-and-revision and the psychometric comparability and accuracy of the assessment scores.

Products: The products of this research project include technical information on the reliability and validity of the feedback-and-revision method, presentations, and published reports.

Structured Abstract

Setting: The research will take place in school districts in Maryland and Georgia.

Population: The target population consists of eighth-grade special education students who have persistent learning problems in mathematics. During the development stage of the feedback-and-revision methodology, the researchers will sample 70 students with disabilities (target sample) and 45 students without disabilities (general education sample). In the experimental study, 200 students with disabilities who are eligible for the alternate assessment (target sample) and 200 students without disabilities (general education sample) will participate.

Research Design and Methods: Six cognitive interview studies will be conducted to determine how the target sample and the general education sample respond to two feedback-and-revision formats (answer-until-correct and second-chance), two mathematics item types (numeric operations and geometry), and two item formats (multiple-choice and constructed response). An experimental study will follow to examine the impact of using a particular feedback-and-revision format on the psychometric comparability and accuracy of the assessment scores. The research team will examine whether providing this feedback-and-revision format to the target sample will improve their test performance, decrease their test anxiety, and increase their motivation.
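To make the two formats concrete, the sketch below shows one plausible way partial-credit scores could be assigned under answer-until-correct and second-chance conditions. The scoring rules, credit values, and function names are illustrative assumptions, not the rubric used in this project.

```python
def answer_until_correct_score(attempts_to_correct, max_attempts=3):
    """Illustrative rule: the student keeps responding until correct,
    and credit decreases with each additional attempt needed.
    `attempts_to_correct` is None if the item is never answered correctly."""
    if attempts_to_correct is None or attempts_to_correct > max_attempts:
        return 0.0
    return (max_attempts - attempts_to_correct + 1) / max_attempts


def second_chance_score(correct_first_try, correct_on_revision):
    """Illustrative rule: full credit for a correct first response,
    partial credit if the revised (second) response is correct."""
    if correct_first_try:
        return 1.0
    return 0.5 if correct_on_revision else 0.0


# Example: two attempts needed under answer-until-correct (2/3 credit),
# and a correct revision under second-chance (half credit).
print(answer_until_correct_score(2))     # 0.666...
print(second_chance_score(False, True))  # 0.5
```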

Key Measures: This project will use the following instruments: (1) Grade 8 National Assessment of Educational Progress (NAEP) released mathematics items, (2) state assessment items, (3) students' performance on state assessments, (4) researcher-developed cognitive interview protocols, (5) the State-Trait Anxiety Inventory, and (6) student and teacher surveys.

Data Analytic Strategy: The researchers will summarize the student responses collected in the cognitive interviews. They will use classical test theory and item response theory to analyze student response data and to evaluate the reliability and validity of the assessment scores under the feedback-and-revision condition. They will use multivariate analysis of variance to examine whether students' test performance, test anxiety levels, and motivation levels differ between the no-feedback-and-revision and feedback-and-revision conditions for the target and general education samples.
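As a rough illustration of the multivariate comparison described above, the sketch below runs a condition-by-sample MANOVA on simulated outcome data. The variable names, simulated values, and model specification are assumptions for illustration only, not the project's data or analysis code.

```python
# Minimal sketch of a condition-by-group MANOVA on simulated data
# (illustrative only; not the project's data or analysis scripts).
import numpy as np
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

rng = np.random.default_rng(0)
n = 400
df = pd.DataFrame({
    "condition": rng.choice(["no_feedback", "feedback_revision"], size=n),
    "group": rng.choice(["target", "general_ed"], size=n),
    # Simulated outcome measures: test score, anxiety, motivation
    "score": rng.normal(50, 10, size=n),
    "anxiety": rng.normal(40, 8, size=n),
    "motivation": rng.normal(3.5, 0.6, size=n),
})

# Multivariate test of condition, group, and their interaction
mv = MANOVA.from_formula(
    "score + anxiety + motivation ~ condition * group", data=df
)
print(mv.mv_test())
```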

Intervention: Due to the nature of the research design, there is no intervention.

Control Condition: Due to the nature of the research design, there is no control condition.

Products and Publications

Journal article, monograph, or newsletter

Attali, Y., Laitusis, C., and Stone, E. (2015). Differences in Reaction to Immediate Feedback and Opportunity to Revise Answers for Multiple-Choice and Open-Ended Questions. Educational and Psychological Measurement, 76(5): 787–802. doi:10.1177/0013164415612548

Nongovernment report, issue brief, or practice guide

Johnstone, C., Figueroa, C., Attali, Y., Stone, E., and Laitusis, C. (2013). Results of a Cognitive Interview Study of Immediate Feedback and Revision Opportunities for Students With Disabilities in Large-Scale Assessments (Synthesis Report 92). Minneapolis, MN: University of Minnesota, National Center on Educational Outcomes.

