Information on IES-Funded Research
Grant Closed

Developing Enhanced Assessment Tools for Capturing Students' Procedural Skills and Conceptual Understanding in Math

NCSER
Program: Special Education Research Grants
Program topic(s): Cognition and Student Learning in Special Education
Award amount: $1,599,999
Principal investigator: Brian Bottge
Awardee: University of Kentucky
Year: 2015
Award period: 4 years (07/01/2015 - 06/30/2019)
Project type: Measurement
Award number: R324A150035

Purpose

The goals of this grant were to develop, test, and refine a set of assessment tools for measuring the conceptual understanding and procedural math skills of middle school students with math difficulties (MD). Traditional measurement methods for assessing problem solving do not effectively capture what students with MD know and are able to do in mathematics. Despite their best efforts on word problem-solving items, many students with MD remain unsure of what the test items are asking. Traditional test formats also do little to improve teachers' understanding of the kinds of errors students make on fractions computation problems. This project developed mathematics tests that help students with MD gain a deeper understanding of the problem contexts and the questions posed and that uncover the conceptual misunderstandings these students reveal when attempting to compute with fractions.

Project Activities

During the first year, the research team developed an Interactive Computer-Based Test, a Formative Fractions Assessment, and a Diagnostic Fractions Computation Test, complete with test administration materials and scoring protocols for teacher use. During the next 2 years, researchers conducted multiple studies to determine the extent to which the new measures assessed student understanding relative to paper-and-pencil measures. Additional activities over the course of the project included developing a scoring guide for use with the oral test and refining and extending psychometric models to aid in interpreting test scores and students' misconceptions about math concepts.

Structured Abstract

Setting

The research took place in middle schools in Kentucky.

Sample

A total of 73 students with disabilities in grades 6 to 11 participated in the research. Eight teachers of resource special education classes also participated.

Research design and methods

In the first year, the research team collected usability and feasibility data on the measures through classroom observations, online teacher logs, and informal interviews with teachers and students. Subsequent studies used quasi-experimental and randomized experimental designs to compare the performance of students taking the computerized, interactive tests with the performance of students taking paper-and-pencil versions. Using these data, the research team created navigation maps to trace how students moved through the assessment items in the software while taking the tests. They also conducted simulation studies to assess the quality of the new psychometric methods for interpreting test scores.
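One common way to build navigation maps like those described above (a minimal sketch assuming hypothetical visit logs, not the project's actual data format) is to count item-to-item transitions across students and row-normalize the counts into an empirical transition matrix:

```python
import numpy as np

# Hypothetical navigation logs: each list is the sequence of item screens
# one student visited while working through the computer-based test.
logs = [
    [1, 2, 3, 2, 4, 5],
    [1, 2, 4, 3, 4, 5],
    [1, 3, 2, 3, 4, 5],
]

n_items = 5
counts = np.zeros((n_items, n_items))

# Tally transitions between consecutively visited items.
for seq in logs:
    for src, dst in zip(seq, seq[1:]):
        counts[src - 1, dst - 1] += 1

# Row-normalize to empirical transition probabilities; items that were
# never a starting point keep an all-zero row.
row_sums = counts.sum(axis=1, keepdims=True)
probs = np.divide(counts, row_sums, out=np.zeros_like(counts), where=row_sums > 0)

# Entry (i, j) estimates P(next item = j | current item = i).
print(np.round(probs, 2))
```

Drawing this matrix as a directed graph, with items as nodes and the heaviest transitions as edges, yields a navigation map of typical test-taking paths.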

Control condition

Due to the nature of the research design, there was no control condition.

Key measures

In addition to the tests developed as part of this project, the research team administered the Iowa Test of Basic Skills to assess fractions computation skills of students with disabilities.

Data analytic strategy

The project relied predominantly on item response theory (IRT) modeling, with Markov chain Monte Carlo (MCMC) methods used for estimation. Differential item functioning (DIF) analyses were conducted to help detect item bias.
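To give a concrete sense of what a DIF screen looks like (a self-contained sketch with made-up parameters, not the project's analysis code), the example below simulates responses under a two-parameter logistic (2PL) IRT model, makes one item artificially harder for a focal group, and flags it with the Mantel-Haenszel statistic, a standard DIF technique:

```python
import numpy as np

rng = np.random.default_rng(0)

def p_correct(theta, a, b):
    """2PL item response function: P(X = 1 | theta) = 1 / (1 + exp(-a(theta - b)))."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

# Hypothetical item parameters: a = discrimination, b = difficulty.
a = np.array([1.2, 0.8, 1.5, 1.0])
b = np.array([-0.5, 0.0, 0.5, 1.0])

n_examinees = 1000
theta = rng.normal(size=n_examinees)          # latent ability
group = rng.integers(0, 2, size=n_examinees)  # 0 = reference, 1 = focal

# Plant DIF in item 2: harder for the focal group at the same ability level.
b_eff = np.tile(b, (n_examinees, 1))
b_eff[group == 1, 2] += 0.6

probs = p_correct(theta[:, None], a, b_eff)
responses = (rng.random(probs.shape) < probs).astype(int)

def mantel_haenszel_or(responses, group, item):
    """Mantel-Haenszel common odds ratio for one item, stratified on the
    rest score (total score excluding the studied item)."""
    rest = responses.sum(axis=1) - responses[:, item]
    num = den = 0.0
    for s in np.unique(rest):
        stratum = rest == s
        ref, foc = stratum & (group == 0), stratum & (group == 1)
        A = responses[ref, item].sum()  # reference correct
        B = ref.sum() - A               # reference incorrect
        C = responses[foc, item].sum()  # focal correct
        D = foc.sum() - C               # focal incorrect
        N = stratum.sum()
        num += A * D / N
        den += B * C / N
    return num / den

for item in range(responses.shape[1]):
    alpha = mantel_haenszel_or(responses, group, item)
    delta = -2.35 * np.log(alpha)  # ETS delta scale; large |delta| flags DIF
    print(f"item {item}: MH odds ratio = {alpha:.2f}, ETS delta = {delta:.2f}")
```

On this simulated data, the planted item should stand out with a clearly negative ETS delta while the clean items stay near zero; in the project itself, such DIF screens accompanied IRT models estimated with MCMC.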

Key outcomes

The main findings of this project, as reported by the principal investigator, are as follows:

  • Researchers designed oral assessment methods and found that the oral assessments tapped more sophisticated problem-solving skills of students with math disabilities than the paper-and-pencil tests did.
  • In one condition, teachers used technology-assisted prompts to assess student performance and remediate errors (Fractions at Work, Technology-Assisted Prompts [FAW-R]). In the comparison condition (Fractions at Work, Basic Intervention [FAW-B]), teachers gave students the same items for assessing progress but used their own methods of reteaching. Most student performance scores increased from pretest to posttest, which suggests that the FAW methods were effective.
  • Although there was wide variation in performance between students who profited from instruction and those who did not, there were no statistically significant differences between the two experimental conditions.
  • Additional analyses suggest that scores from all students combined (regardless of intervention method) showed significant gains on all three mathematics outcome measures, with effect sizes ranging from small to large. This supports the use of the basic instructional program, a curriculum the researchers had used in prior studies.
  • There were no significant differences between the FAW-B and FAW-R groups. Although teachers in the FAW-B group did not have access to the technology-based enhancements in FAW-R, they used their own informal ways of monitoring each student's progress. The small class sizes in the resource rooms gave teachers opportunities to assess their students' thinking and make appropriate adjustments to their instruction.
  • Formative and summative assessments can help teachers provide more effective instruction for low-performing students in math. Teachers demonstrated the ability to use the item misconceptions identified in the developed assessments to remediate their students' computation skills.

People and institutions involved

IES program contact(s)

Sarah Brasiel

Education Research Analyst
NCSER

Project contributors

Allan Cohen

Co-principal investigator

Products and publications

ERIC Citations: Available citations for this award can be found in ERIC.

Publicly available data: For sharing of data and the measures, contact Brian Bottge at bbott2@uky.edu.

Additional online resources and information: http://websedge.com/videos/cec_tv/#/

Journal articles

Suh, Y., Cho, S.-J., & Bottge, B. A. (2018). A multilevel longitudinal nested logit model for measuring changes in correct response and error types. Applied Psychological Measurement, 42, 73-88.

Lin, Q., Xing, K., & Park, Y. S. (2020). Measuring skill growth and evaluating change: Unconditional and conditional approaches to latent growth cognitive diagnostic models. Frontiers in Psychology, 11, Article 2205.

Bottge, B. A., Ma, X., Gassaway, L. J., Jones, M., & Gravil, M. (2021). Effects of formative assessment strategies on the fractions computation skills of students with disabilities. Remedial and Special Education, 42(5), 279-289.

Related projects

Evaluating the Efficacy of Enhanced Anchored Instruction for Middle School Students with Learning Disabilities in Math

R324A090179

Supplemental information

Measures: The research team produced fully developed versions of the Interactive Computer-Based Test (ICBT), a Formative Fractions Assessment (FFA), and a Diagnostic Fractions Computation Test (DFCT). The ICBT assesses students' problem-solving skills and includes interactive item and information clusters, tracking features, and information for teachers. Both the FFA and DFCT are computer-administered measures of students' fraction understanding with interactive features and detailed feedback provided to teachers on student understanding and student errors.

Questions about this project?

For additional questions about this project or to provide feedback, please contact the program officer.

Tags

Mathematics
