Title: Exploring Computational Thinking: Applying Psychometric Analysis to Assess Relationships with Primary Cognitive Abilities and Malleability
Principal Investigator: Feldon, David
Awardee: Utah State University
Program: Science, Technology, Engineering, and Mathematics (STEM) Education
Award Period: 3 years (07/01/2022 – 06/30/2025)
Award Amount: $1,699,756
Co-Principal Investigators: Tofel-Grehl, Colby; Grover, Shuchi; Chung, Gregory; Cai, Li
Purpose: Computational thinking (CT) has emerged as a major topic of interest in K-12 education, with widespread efforts to support students in developing this competency in multiple content areas (e.g., Next Generation Science Standards) without specifically tying it to learning coding or computer science. However, many facets of computational thinking are not clearly defined or understood, including (1) the extent to which it can be assessed independently of coding and (2) its relationship to other modes of reasoning (e.g., mathematical reasoning, spatial abilities). Therefore, it is not clear to what extent CT may represent a unique construct or may reflect an aggregation of other knowledge and mental abilities. Similarly, it is not known if facets of CT differ in their malleability and responsiveness to instruction. We will explore the correlative and psychometric relationships among established and novel assessments of CT, other modes of reasoning and basic mental abilities, and the relationships between instructional activities and CT. This effort will identify the extent to which CT assessments, singly and collectively, may capture unique variance in student performance that is not attributable to knowledge of coding, mathematics, or basic capacities such as working memory span, fluid ability, or spatial reasoning. Through this study, we can better understand the underlying dimensions of CT as a construct, the malleability of each dimension, and the basic cognitive skills that may underlie or support them. Gaining insight into these relationships will inform a number of next steps relevant for educational practice, including the development of more effective and valid CT assessments and the targeting of curriculum and instruction to maximize impact on malleable facets of the construct.
Project Activities: Implemented across the 7th and 8th grades in four diverse Utah school districts, this project consists of three major components of research activity. First, the researchers will implement a cognitive lab study to gauge the cognitive strategies invoked by CT assessment items. Second, the team will collect individual testing data on CT, spatial and fluid reasoning, working memory span, executive function, algebraic aptitude, and scientific reasoning. Third, the researchers will collect data from middle school classrooms that are intentionally integrating CT into the curriculum to assess opportunity-to-learn and malleability related to CT during classroom instruction. Data analyses will proceed in several corresponding phases.
Products: The research team will use findings to develop pedagogical strategies and tasks that can target malleable CT facets across the curriculum and inform the development of assessments that are designed to measure malleable facets of CT independently of the courses in which they are taught. The researchers will also produce peer-reviewed publications and a final publicly accessible dataset.
Setting: The project will take place in four school districts in Utah that reflect a range of contexts from urban to rural. All districts have a substantial percentage of students who qualify for free and reduced-price lunch (23%-59%), notable proportions of English Language Learners (4%-33%), and racial/ethnic diversity reflecting students of multiple backgrounds, including high proportions of Latinx and Native American students (17%-62%).
Sample: The sample will consist primarily of 7th and 8th grade students. The racial/ethnic composition of the sample will reflect the school populations at each location.
Factors: The team will assess a range of factors in relation to students' CT abilities and gains, including individual characteristics (i.e., spatial and fluid reasoning, working memory span, executive function, algebraic aptitude, scientific reasoning) and features of instruction (i.e., opportunity to learn).
Research Design and Methods: Researchers will collect data via 1) a cognitive lab study to gauge the cognitive strategies invoked by CT assessment items; 2) a descriptive study gathering individual testing data on CT to assess relevant cognitive factors; and 3) an observational study of middle school classrooms that integrate CT into the curriculum to assess opportunities to learn and malleability related to classroom instruction.
Control Condition: Due to the study design, there is no control condition.
Key Measures: The research team will examine computational thinking using the Computational Thinking test and items from the Bebras International Challenge on Informatics and Computational Thinking. Two components of executive function, cognitive flexibility and inhibitory control, will be assessed using the National Institutes of Health Toolbox Cognition Battery (NIHTB-CB): the Dimensional Change Card Sort Test and the Flanker Inhibitory Control and Attention Test. Working memory capacity will be assessed using the List Sorting Working Memory Test (LSWMT) of the NIHTB-CB. Spatial ability will be assessed using the Spatial Reasoning Instrument. Two measures from the NIHTB-CB, the Oral Reading Recognition Test (ORR) and the Picture Vocabulary Test (PV), will be used to assess participants' language ability. Fluid ability will be assessed with two tasks, the Maze task and the BackSpan. Scientific reasoning will be assessed with two measures, Lawson's Test for Scientific Reasoning and the Scientific Reasoning Scale. Algebraic reasoning will be measured with the Iowa Algebra Aptitude Test, specifically its Pre-Algebraic Number Skills and Concepts (PANSC) and Representing Relationships (RR) subtests.
Data Analytic Strategy: First, researchers will analyze all individual testing and situated CT proficiency data at the instrument level within a multitrait-multimethod (MTMM) framework to identify and eliminate sources of overlapping variance or measurement error variance that could hinder subsequent analyses. Second, the team will analyze all CT test data at the item level using multidimensional item response theory (MIRT) and diagnostic classification modeling (DCM) to evaluate the extent to which potential facets underlying CT item performance operate independently or in conjunction to influence the probability of correct responses. Third, they will assess students' responsiveness to instruction through analysis of pre-post gains on CT assessments, based on the premise that increased opportunity to learn (OTL) will lead to greater performance gains.
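The compensatory multidimensional 2PL model commonly used in MIRT analyses of this kind can be illustrated with a minimal sketch. The two latent facets, the loading values, and the intercept below are hypothetical placeholders for illustration, not parameters from this project:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def mirt_prob(theta, a, d):
    """Compensatory multidimensional 2PL: the probability of a correct
    response depends on a weighted sum of several latent traits, so
    strength on one facet can offset weakness on another.

    theta : (K,) latent trait vector for one student
    a     : (K,) item discriminations (loadings) on each trait
    d     : scalar item intercept (easiness)
    """
    return sigmoid(np.dot(a, theta) + d)

# Hypothetical CT item loading on two assumed facets
# (e.g., an algorithmic facet and a pattern-recognition facet).
a = np.array([1.2, 0.8])   # assumed discriminations
d = -0.3                   # assumed intercept

strong_both = mirt_prob(np.array([1.0, 1.0]), a, d)
weak_both = mirt_prob(np.array([-1.0, -1.0]), a, d)
```

In a real MIRT analysis the item parameters and trait estimates would be fit jointly from response data; the sketch only shows how, once estimated, multiple CT facets combine to drive the probability of a correct response.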