IES Grant

Title: Diagnostic Inventories of Cognition in Education (DICE)
Center: NCER
Year: 2017
Principal Investigator: Harrison, Ashley J.
Awardee: University of Georgia
Program: Cognition and Student Learning
Award Period: 4 years (7/1/2017–6/30/2021)
Award Amount: $1,399,746
Type: Measurement
Award Number: R305A170441
Description:

Co-Principal Investigators: Hollylynne Lee and Roger Azevedo (North Carolina State University); Jessica Masters and Lisa Famularo (Research Matters)

Purpose: The Diagnostic Inventories of Cognition in Education (DICE) project team will develop a freely available, web-based formative assessment system that provides teachers with valid, timely, and actionable feedback about middle school students' understanding of probability and chance ("probabilistic reasoning"). The research team will design the DICE assessment as a concept inventory, that is, a test that identifies examinees who exhibit specific, persistent misconceptions in their reasoning. In developing the DICE system, the research team will address three needs in mathematics instruction and assessment. First, they will develop a sound assessment of probabilistic reasoning, an important concept in many states' updated mathematics content standards that is difficult to assess via traditional methods. Second, they will advance educational assessment methodology, which has traditionally employed psychometric models that are not well suited to measuring misconceptions grounded in learning theories. Third, they will develop a truly formative assessment system that supports teachers by allowing them to reliably and accurately diagnose student misconceptions and plan targeted instruction accordingly.

Project Activities: In collaboration with advisors, the research team will use an iterative development and evaluation process to create the DICE system and examine the validity of inferences drawn from its results. The project will comprise multiple related studies to collect feedback and validity evidence about the assessment system and feedback reports, using techniques such as cognitive labs, eye-tracking studies, teacher focus groups, large-scale administrations, and small-scale experimental intervention studies.

Products: Researchers will produce a fully developed and validated DICE assessment system with two critical components: (1) a web-based assessment system that adaptively administers diagnostic items and uses innovative psychometric methodology to provide feedback that is immediate, reliable, and multidimensional, and (2) feedback reports with interpretive guides to support classroom use of the results. The team will also produce peer-reviewed publications.
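
The abstract does not specify the adaptive administration algorithm. As a rough, hypothetical sketch of one common approach, the fragment below greedily selects the next item so as to most reduce uncertainty about which misconception profile a student holds; the function names, the toy numbers, and the response-model callback p_choice are illustrative assumptions, not the project's design.

```python
import math

def pick_next_item(items, posterior, p_choice):
    """Greedy adaptive selection: administer the item that minimizes the
    expected posterior entropy over candidate misconception profiles.

    items: {item_id: [option, ...]} for items not yet administered
    posterior: {profile: probability} over misconception profiles
    p_choice(item_id, profile, option): assumed response model, a stand-in
    for whatever calibrated psychometric model an operational system uses
    """
    def expected_entropy(item_id, options):
        h = 0.0
        for opt in options:
            # Marginal probability of this option under the current posterior.
            p_opt = sum(posterior[z] * p_choice(item_id, z, opt) for z in posterior)
            if p_opt <= 0.0:
                continue
            # Posterior over profiles after observing this option (Bayes rule).
            updated = {z: posterior[z] * p_choice(item_id, z, opt) / p_opt
                       for z in posterior}
            h += p_opt * -sum(p * math.log(p) for p in updated.values() if p > 0)
        return h

    return min(items, key=lambda i: expected_entropy(i, items[i]))

# Toy example: two profiles, a uniform prior, and made-up response numbers.
posterior = {"holds_equiprobability": 0.5, "no_misconception": 0.5}
items = {"q1": ["A", "B"], "q2": ["A", "B"]}

def p_choice(item, profile, option):
    # q1 separates the two profiles; q2 is uninformative.
    if item == "q1":
        return 0.9 if (profile == "holds_equiprobability") == (option == "B") else 0.1
    return 0.5

print(pick_next_item(items, posterior, p_choice))  # -> "q1"
```

In an operational system the response model would come from the calibrated psychometric model described under Data Analytic Strategy rather than a hand-written callback.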

Structured Abstract

Setting: The DICE system will be developed and validated in U.S. middle grades classrooms across a variety of geographic areas and urbanicities.

Sample: Across five studies and data collection activities, the DICE project will engage more than 5,000 students in grades 6–8 (the majority of whom will participate in large-scale administrations of the DICE to pilot and calibrate its items) and more than 60 educators who teach those grades.

Assessment: The DICE will assess five student misconceptions related to understanding of probability and chance. These are: (1) the conjunction fallacy (the misconception that the joint occurrence of two events can be more probable than either event alone), (2) the outcome approach (the misconception that a probability predicts whether a single outcome will occur, rather than describing a long-run proportion), (3) availability bias (the misconception that an event's probability corresponds to how easily examples of it come to mind), (4) representativeness bias (the misconception that every sample of a population must be representative of that population), and (5) equiprobability bias (the misconception that all outcomes are equally likely). To assess these misconceptions, the DICE will use selected-response test questions whose response options specifically target or reflect particular misconceptions. Teacher feedback reports will present student profiles reflecting the probability that an individual student reasons with each of the five misconceptions.
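
As a concrete (and deliberately naive) illustration of this distractor-keyed design, the sketch below assumes a hypothetical item bank in which every response option is keyed either to the correct answer or to one of the five misconceptions; the simple tally stands in for the project's actual model-based diagnosis.

```python
from collections import Counter

# Hypothetical misconception keys; the real DICE item bank and scoring differ.
MISCONCEPTIONS = [
    "conjunction_fallacy",
    "outcome_approach",
    "availability_bias",
    "representativeness_bias",
    "equiprobability_bias",
]

def naive_profile(responses, item_keys):
    """Tally misconception-keyed choices into a simple per-student profile.

    responses: {item_id: chosen_option}
    item_keys: {item_id: {option: "correct" or misconception name}}
    Returns each misconception's share of the student's answers.
    """
    counts = Counter(item_keys[item][choice] for item, choice in responses.items())
    total = sum(counts.values())
    return {m: counts[m] / total for m in MISCONCEPTIONS}

# Example: a student who twice picks options keyed to the equiprobability bias.
keys = {
    "q1": {"A": "correct", "B": "equiprobability_bias", "C": "outcome_approach"},
    "q2": {"A": "equiprobability_bias", "B": "correct", "C": "availability_bias"},
}
print(naive_profile({"q1": "B", "q2": "A"}, keys))
```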

Research Design and Methods: Developing and validating the DICE will involve several studies and data collection activities. During the first year, the research team will develop approximately 75 assessment items, which will vary in their exact format and in which specific misconceptions they target. Once the items are developed, the team will collect expert reviews from advisors and expert teachers, both to improve the items and to provide test content validity evidence. The team will also conduct in-depth cognitive labs (48 students) and innovative laboratory studies involving eye tracking and affective-state detection (50 students), both to improve the items and to collect response process validity evidence. Once the items have been reviewed and revised, the team will conduct large-scale administrations of the DICE items in years 2 and 3 (2,500 students each) to assess the items' psychometric properties and provide internal validity evidence for the overall assessment. In years 2 and 3, the team will also conduct focus groups with approximately 50 practitioners, who will assist in creating, critiquing, and revising the DICE feedback reports and interpretive guides. Finally, in years 3 and 4, the researchers will conduct two experimental studies (involving 12 teachers and 60 students) to collect external validity evidence supporting claims that misconception diagnoses from the DICE are meaningful, coincide with expert diagnoses, and are sensitive to development of and changes in probabilistic reasoning.

Control Condition: Due to the nature of this project, there is no control group.

Key Measures: The DICE itself is the primary measure used in this study. Other measures include researcher-developed methods for analyzing eye-tracking data, for identifying students' affective states during cognitive labs, and for diagnosing misconceptions during the experimental studies in the final years of the project.

Data Analytic Strategy: Across their various studies and activities, the researchers will analyze results using a variety of methods, including qualitative and quantitative analyses, psychometric modeling with the Scaling Individuals and Classifying Misconceptions (SICM) model, and machine learning.
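
For orientation, the SICM model couples a continuous item response theory ability scale with latent class classification of misconceptions. A simplified sketch of that general idea (not the project's operational specification) models the log-odds that an examinee chooses a misconception-keyed distractor over the correct option:

```latex
% Simplified sketch only; the operational SICM specification may differ.
\[
  \log \frac{P(X_{ei} = d \mid \theta_e, \boldsymbol{\alpha}_e)}
            {P(X_{ei} = \text{correct} \mid \theta_e, \boldsymbol{\alpha}_e)}
  = \lambda_{0id} + \lambda_{1id}\,\alpha_{e,m(d)} - \lambda_{2i}\,\theta_e
\]
```

Here X_ei is examinee e's response to item i, d is a distractor keyed to misconception m(d), theta_e is a continuous probabilistic-reasoning ability, alpha_{e,m(d)} is a 0/1 indicator of whether the examinee holds that misconception, and the lambda terms are item parameters. Under a model of this flavor, holding a misconception raises the odds of choosing its keyed distractor, while higher overall ability lowers them, which is what lets the feedback reports express misconception diagnoses as probabilities.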

Related IES Projects: 
The Diagnostic Geometry Assessment Project (R305A080231)
Bridging the Gap: Applying Algebra Cognition Research to Develop and Validate Diagnostic Classroom Algebra Testlets (R305H040099)

