
IES Grant

Title: Diagnostic Inventories of Cognition in Education (DICE)
Center: NCER
Year: 2017
Principal Investigator: Harrison, Ashley J.
Awardee: University of Georgia
Program: Cognition and Student Learning
Award Period: 4 years (7/1/2017 – 6/30/2021)
Award Amount: $1,399,746
Type: Measurement
Award Number: R305A170441
Description:

Co-Principal Investigators: Lee, Hollylynne; Azevedo, Roger; Masters, Jessica; Famularo, Lisa

Purpose: The Diagnostic Inventories of Cognition in Education (DICE) project team developed a freely available, web-based assessment system to provide teachers with feedback about student performance in probabilistic reasoning.

Project Activities: The DICE project developed (1) an assessment for the construct of probabilistic reasoning, (2) an assessment system that diagnoses student misconceptions and identifies key concepts for which students require targeted instruction, and (3) methods for designing and scoring concept inventories.

Key Outcomes:

  • The project team developed two diagnostic assessments to measure students' understanding of probabilistic reasoning and to identify their misconceptions (available at https://coe.uga.edu/research/labs/dice/).
  • To support teachers' use of the assessments, the project team developed a webpage where teachers can obtain the assessments and a webpage where they can upload student responses and receive student- and class-level feedback reports that identify areas for additional instruction.
  • The project team published findings on measuring students' probabilistic reasoning (Lee et al., 2023; Sanei & Lee, 2021).

Structured Abstract

Setting: The DICE system was designed for use in U.S. middle grades classrooms.

Sample: Sixty-six students took part in the cognitive labs, and approximately 5,000 students took part in the larger-scale administrations of the assessments.

Measure: The research team developed two assessments that target reasoning covered in the statistics and probability domains of most states' middle grades math standards as well as in the Common Core State Standards. The assessments were designed to be formative so that they can be used by teachers and students during instruction to inform ongoing teaching and learning.

Research Design and Methods: The research team iteratively drafted and revised items. Evidence of validity was gathered in three phases: review by expert advisers; four rounds of cognitive labs in which 66 students provided responses and explained the reasoning behind them; and two larger-scale administrations of the items, the first to 999 students and the second to 2,156 students for the first assessment and 2,061 students for the second assessment.

Key Measures: The DICE itself was the primary measure used in this study. Other measures included researcher-developed methods for analyzing eye-tracking data and identifying students' affective states during the cognitive labs, and for diagnosing misconceptions during the experimental studies in the final years of the project.

Data Analytic Strategy: The expert review examined whether each item aligned with a target construct and whether each response option mapped to a target misconception. Through qualitative coding of the student responses obtained from the cognitive labs, the research team checked whether students responded to items with aligned reasoning (either for the correct answer or for an answer based on a misconception) or nonaligned reasoning (the stated reasoning did not support the answer, whether correct or based on a misconception). These data were used to revise the items to better ensure that students who understood the underlying probability concept answered the item correctly and that students who chose either the correct answer or a misconception-based distractor did so using aligned reasoning. Results from the larger-scale administrations were analyzed using the Scaling Individuals and Classifying Misconceptions model and machine learning.
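
As a rough illustration of the kind of misconception-keyed scoring that underlies student- and class-level feedback reports, the sketch below tallies, for each student, correct answers and misconception-aligned distractor choices. The item key, misconception labels, and function names (ITEM_KEY, score_student, class_report) are illustrative assumptions rather than the project's actual scoring code, and the sketch does not reproduce the Scaling Individuals and Classifying Misconceptions model or the machine learning analyses.

```python
from collections import Counter
from typing import Dict

# Illustrative (assumed) item key: for each item, the correct option and the
# misconception each distractor was written to capture. Not the actual DICE key.
ITEM_KEY: Dict[str, Dict[str, str]] = {
    "item1": {"correct": "B", "A": "equiprobability", "C": "representativeness", "D": "outcome_approach"},
    "item2": {"correct": "D", "A": "equiprobability", "B": "outcome_approach", "C": "representativeness"},
}

def score_student(responses: Dict[str, str]) -> Dict[str, object]:
    """Tally correct answers and misconception-aligned distractor choices for one student."""
    correct = 0
    misconceptions = Counter()
    for item, choice in responses.items():
        key = ITEM_KEY[item]
        if choice == key["correct"]:
            correct += 1
        else:
            # Map the chosen distractor to the misconception it was keyed to.
            misconceptions[key.get(choice, "unclassified")] += 1
    return {"n_correct": correct, "misconceptions": misconceptions}

def class_report(all_responses: Dict[str, Dict[str, str]]) -> Counter:
    """Aggregate misconception counts across a class to flag areas needing additional instruction."""
    totals = Counter()
    for responses in all_responses.values():
        totals.update(score_student(responses)["misconceptions"])
    return totals

if __name__ == "__main__":
    demo = {"s1": {"item1": "A", "item2": "D"}, "s2": {"item1": "B", "item2": "B"}}
    print(class_report(demo))  # Counter({'equiprobability': 1, 'outcome_approach': 1})
```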

Related IES Projects: The Diagnostic Geometry Assessment Project (R305A080231), Bridging the Gap: Applying Algebra Cognition Research to Develop and Validate Diagnostic Classroom Algebra Testlets (R305H040099)

Publications and Products

Project website: https://coe.uga.edu/research/labs/dice/

Additional online resources and information:

  • https://dice.coe.uga.edu/ – website for teachers to upload student responses to the assessments and receive student and class feedback reports to identify areas for additional instruction

ERIC Citations: Available citations in ERIC for this award can be found in the ERIC database.

Select Publications:

Lee, H. S., Sanei, H., Famularo, L., Masters, J., Bradshaw, L., & Schellman, M. (2023). Validating a concept inventory for measuring students' probabilistic reasoning: The case of reasoning within the context of a raffle. Journal of Mathematical Behavior, 71, 101081. https://doi.org/10.1016/j.jmathb.2023.101081

Sanei, H. S., & Lee, H. S. (2021). Attending to students' reasoning about probability concepts for building statistical literacy [Paper]. In R. Helenius & E. Falck (Eds.), Proceedings of the Satellite Conference of the International Association for Statistical Education (IASE): Statistics Education in the Era of Data Science (pp. 1–6). International Association for Statistical Education.

