
Project DIMES: Diagnostic Instrument for Morphology of Elementary Students

NCER
Program: Education Research Grants
Program topic(s): Literacy
Award amount: $1,290,952
Principal investigator: Anne Corinne Huggins-Manley
Awardee: University of Florida
Year: 2019
Award period: 2 years 11 months (07/01/2019 - 06/30/2022)
Project type: Measurement
Award number: R305A190079

Purpose

The purpose of this project is to develop a computer adaptive, diagnostic assessment of teachable morphological skills for students in grades 3 to 5. The development of morphological skills is essential to students' literacy growth because knowledge of morphemes (for example, root words like nation, prefixes like inter-, and suffixes like -al) supports students' reading achievement by influencing their ability to decode and access the meaning of multisyllabic words, which in turn supports their reading comprehension.

Project Activities

Researchers will develop the Diagnostic Instrument for Morphology of Elementary Students (DIMES), which can produce reliable diagnostic feedback to teachers about their students' morphological knowledge. The research team will rely on an evidence-centered design.

Structured Abstract

Setting

The study takes place in suburban elementary schools in Florida.

Sample

The study will include two samples, each of approximately 600 students in grades 3 to 5, that are nationally representative in terms of ethnicity, disability status, English learner status, and free/reduced-price lunch status.

Assessment

Researchers will develop an assessment called DIMES, for use with upper elementary students, that can produce reliable diagnostic feedback to teachers about their students' morphological knowledge. Morphological knowledge consists of various teachable problem-solving skills, such as analyzing word form, inferring the meaning of word parts, and drawing analogies to similar words. The project will focus on three kinds of morphemes: prefixes, suffixes, and roots (sometimes referred to as bases or stems). The tool will assess students' strengths and weaknesses in these areas so that teachers can identify underlying challenges to reading success and design instruction accordingly. Students will take the assessment through a technology-based application.

Research design and methods

The team will develop the diagnostic assessment using evidence-centered design, along with multiple iterations of validation testing to address content validity, response process validity, internal structure validity, external criterion validity, fairness, and validity related to test use. In addition, they will use multiple methods to evaluate fairness with respect to content, statistical properties, and test use.

Control condition

There is no control condition for this study.

Key measures

To establish convergent validity, key measures include the PAL-II subtest measures of morphological decoding and phonological coding, the Goodwin assessment of derivational morphological awareness, the Gates-MacGinitie reading vocabulary and reading comprehension tests, the Wide Range Achievement Test of Spelling, and the PAL-II subtest measure of phonological decoding.

Data analytic strategy

The research team will transcribe and analyze the interview data using an iterative constant comparison grounded theory approach to uncover themes related to teacher and student perspectives. They will also complete two types of psychometric analyses. First, they will assess diagnostic classification model (DCM) fit to the person and item data and examine differential item functioning across groups of students. For this, the researchers will use Bayesian posterior predictive model checking, which allows fit to be assessed even though not all individuals are exposed to the same items and supports differential item functioning analysis through matching based on equivalence classes. Second, they will assess the efficiency of the adaptive algorithms, including, but not limited to, examinations of test length, content coverage, content balancing, and item usage rates.
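As an illustration of the adaptive-efficiency criteria named above, the following is a minimal sketch in Python of one way test length, content coverage, and item usage rates might be summarized from administration records. The item identifiers, administration logs, and the coverage helper are hypothetical and are not drawn from the project's materials.

    from collections import Counter
    from statistics import mean

    # Hypothetical administration log: for each simulated examinee,
    # the list of item IDs the adaptive algorithm delivered.
    administrations = [
        ["prefix_03", "root_12", "suffix_07", "root_01"],
        ["prefix_03", "prefix_11", "root_12", "suffix_02", "suffix_07"],
        ["root_01", "root_12", "suffix_02"],
    ]

    # Hypothetical item bank with 12 items per morpheme type.
    item_bank = (
        {f"prefix_{i:02d}" for i in range(1, 13)}
        | {f"root_{i:02d}" for i in range(1, 13)}
        | {f"suffix_{i:02d}" for i in range(1, 13)}
    )

    # Test length: average number of items delivered per examinee.
    avg_test_length = mean(len(items) for items in administrations)

    # Item usage (exposure) rates: share of examinees who saw each item.
    usage = Counter(item for items in administrations for item in items)
    usage_rates = {item: usage.get(item, 0) / len(administrations) for item in item_bank}

    # Content coverage: fraction of the bank delivered to at least one
    # examinee, broken out by morpheme type (prefix, root, suffix).
    def coverage(morpheme_type: str) -> float:
        bank = [i for i in item_bank if i.startswith(morpheme_type)]
        used = [i for i in bank if usage.get(i, 0) > 0]
        return len(used) / len(bank)

    print(f"Average test length: {avg_test_length:.1f} items")
    for morpheme_type in ("prefix", "root", "suffix"):
        print(f"{morpheme_type} coverage: {coverage(morpheme_type):.0%}")
    print("Highest usage rates:",
          sorted(usage_rates.items(), key=lambda kv: -kv[1])[:3])

Exposure summaries of this kind are one common way to flag items that an adaptive algorithm selects far more or far less often than the rest of the bank.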

People and institutions involved

IES program contact(s)

Christina Chhin

Education Research Analyst
NCER

Products and publications

Products: The project will produce a computer adaptive, diagnostic assessment tool (DIMES). Researchers will also produce peer-reviewed publications, and present at national conferences and to elementary school teachers and leaders.

Related projects

Project Coordinate: Increasing Coordinated Use of Evidence-Based Practices for Improving Word Study in an RTI Framework for Teams of 4th Grade Teachers

R324A170135

Morphological Awareness Computer Adaptive Testing Project

R305A150199

Supplemental information

Co-Principal Investigators: Benedict, Amber; Goodwin, Amanda; Templin, Jonathan

Researchers will complete pilot, field test, and validity studies during the assessment development and testing process. In Year 1, the researchers will conduct a response process validity study by sampling a representative portion of items from the item bank and presenting them individually to 33 students (11 each in grades 3, 4, and 5) during think-aloud interviews. In Year 2, the researchers will pilot DIMES with approximately 600 students. The objective of the pilot study is to gather internal structure validity evidence, use it to develop and refine the final assessment, and determine item parameters for the adaptive DIMES. In Year 3, the team will conduct a field test of DIMES with approximately 600 students. Students will take DIMES and then take one to two external criterion assessments. The focus of the field test is to examine external criterion validity by administering previously developed assessments of various reading constructs and comparing DIMES scores with them.

Questions about this project?

To answer additional questions about this project or provide feedback, please contact the program officer.

Tags

Reading
