IES Grant
Title: Using Process Data to Characterize Response Profiles and Test-Taking Behaviors of Low-Skilled Adult Responders on PIAAC Literacy and Numeracy Items
Center: NCER
Year: 2021
Principal Investigator: Tighe, Elizabeth
Awardee: Georgia State University
Program: Postsecondary and Adult Education
Award Period: 3 years (09/01/2021 – 08/31/2024)
Award Amount: $1,003,729
Type: Exploration
Award Number: R305A210344
Description:

Co-Principal Investigator: He, Qiwei

Purpose: The purpose of this project is to understand how adults with low basic skills (literacy and numeracy) interact with digital assessments, in order to tease apart the roles of low basic skills, fluency with digital tools, and assessment design. What the research team learns will inform the development and use of digital assessments with these adults. According to a recent NCES report, roughly 19 percent of U.S. adults perform at or below a basic level in literacy, and 29 percent perform at or below a basic level in numeracy. When education or training programs assess these adults, whether to determine how best to assist them or for accountability or instructional purposes, they often use digital assessments. As digital assessments become more common, the field must be able to determine whether adults' performance is influenced by the digital format itself. The researchers will address this question by leveraging an existing assessment that measures not only literacy and numeracy but also problem solving in digital environments (e.g., navigating websites).

Project Activities: The researchers will use data from a large-scale educational assessment, the 2012 U.S. Program for the International Assessment of Adult Competencies (PIAAC), to explore how low-skilled adults interact with different types of literacy and numeracy items on a digital assessment. More specifically, they will leverage the supplementary PIAAC process data (log files) to identify response profiles, engagement levels, and test-taking behaviors of low-skilled adults. The research team will also examine whether low-skilled adults have different response profiles based on item features (item type, item content, item response format), item accuracy, overall literacy and numeracy proficiency, demographic factors, and malleable factors (for example, reading and numeracy behaviors).

Pre-registration Site: https://osf.io/zgdbn/

Products: With this information, the researchers will refine theoretical models of how low-skilled adults perform on and interact with literacy and numeracy items on digital assessments. They will also conduct workshops and training for researchers interested in analyzing process data from large-scale digital assessments, produce peer-reviewed publications, share programming code with other researchers, and create short reports, infographics, and blog posts for adult education practitioners.

Structured Abstract

Setting: This study will use a nationally representative U.S. sample of low-skilled adults from the supplementary 2012 PIAAC process data (log files).

Sample: The U.S. 2012 PIAAC process data include a nationally representative sample (collected in 2012) of adults aged 16 to 65 (total N = 4,061 across three cognitive domains: literacy, numeracy, and problem solving in technology-rich environments). The research team will focus on the subset of adults who score at or below Level 2 (the lowest levels) on the PIAAC literacy (N = 1,133; 42 percent of the available literacy sample) and numeracy (N = 1,527; 56 percent of the available numeracy sample) scales.

Malleable Factors: The research team will explore how test takers interact with digital assessments. For example, they will consider patterns of engagement, response times, and other test-taking behaviors, and whether these vary across item characteristics (e.g., item content, difficulty, response type required). They will also explore interactions between these behaviors and malleable, self-reported background characteristics (such as at-home and at-work use of literacy, numeracy, and computer skills), as well as whether variation arises from non- or less-malleable factors, such as demographics (age, race/ethnicity, native English proficiency), educational attainment, and employment.

Research Design and Methods: The research team will spend the first year of the project extracting process data from the PIAAC platform, cleaning the process data, preparing a codebook of item features, recoding the process data files into meaningful process data sequences, and merging the PIAAC process data with the PIAAC main data file (which contains item accuracy and background factors). Once this preparation is complete, the researchers will conduct three lines of study. In Study 1, they will explore general response and engagement patterns (such as low-skilled respondents' skipping behaviors, total time, and number of actions) across literacy and numeracy items. In Study 2, they will explore profiles of test-taking behaviors (sequences of actions, changing responses, revisiting specific aspects, using help functions) on subsets of item types pre-defined by PIAAC, such as item response formats (clicking, highlighting, numeric entry), item content (community, educational, personal, work topics), and item format (print-like passages, graphs, tables, e-mails). In Study 3, they will explore in greater depth the response, engagement, and test-taking behaviors on a single complex literacy item and a single complex numeracy item. Across all three studies, the researchers will consider how interactions with items (general response, engagement, and test-taking behaviors) relate to accuracy (or inaccuracy) on items, overall literacy and numeracy performance, and background factors (demographics, education, employment, and malleable literacy, numeracy, and computer behaviors at home and at work).
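A minimal sketch of the recode-and-merge step described above is shown below. It is an illustration only, not the project's actual pipeline: the flat log-file layout and the column names (SEQID as respondent ID, item_id, event, and a numeric timestamp) are assumptions made for the example.

```python
# Sketch of the data-preparation step, under assumed (hypothetical) column names.
import pandas as pd

def build_analysis_file(log_path, main_path):
    logs = pd.read_csv(log_path)   # one row per logged event (assumed layout)
    main = pd.read_csv(main_path)  # item accuracy and background variables

    # Recode raw log events into per-respondent, per-item action sequences,
    # plus simple engagement indicators (number of actions, total time).
    logs = logs.sort_values(["SEQID", "item_id", "timestamp"])
    sequences = (
        logs.groupby(["SEQID", "item_id"])
            .agg(action_sequence=("event", list),
                 n_actions=("event", "size"),
                 total_time=("timestamp", lambda t: t.max() - t.min()))
            .reset_index()
    )

    # Merge the process-data features with the main data file on respondent ID.
    return sequences.merge(main, on="SEQID", how="left")
```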
Control Condition: Because the study uses extant data, there is no control condition.

Key Measures: The researchers will use up to 72 literacy items and 72 numeracy items from the supplementary 2012 U.S. PIAAC process data. They will also include total performance scores (plausible values) on the literacy and numeracy domains and background characteristics from the main 2012 U.S. PIAAC dataset.

Data Analytic Strategy: For Study 1, the research team will cluster low-skilled adults into groups with similar behavioral patterns using a sequence distance derived from dynamic time warping, and will use analysis of variance (ANOVA) models to explore the association between the behavioral clusters and literacy/numeracy ability as well as background variables. For Study 2, they will use time-weighted longest subsequence techniques to identify respondents' test-taking behaviors and engagement levels, which will help pinpoint the underlying reasons for incorrect responses. For Study 3, they will use feature generation and predictive feature selection techniques to elaborate on how low-skilled adults interact with complex items.
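The Study 1 strategy (a dynamic-time-warping sequence distance, behavioral clusters, and ANOVA) can be illustrated with a brief sketch. The inputs (per-respondent response-time sequences), the average-linkage hierarchical clustering, and the four-cluster solution are assumptions made for the example, not choices reported by the project.

```python
# Sketch: DTW distances between respondents' sequences, hierarchical clustering,
# and a one-way ANOVA relating cluster membership to proficiency.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform
from scipy.stats import f_oneway

def dtw_distance(a, b):
    """Classic dynamic time warping distance between two 1-D sequences."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    return cost[n, m]

def cluster_and_test(sequences, proficiency, n_clusters=4):
    """Cluster respondents by DTW distance, then test proficiency differences."""
    proficiency = np.asarray(proficiency)
    n = len(sequences)
    dist = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            dist[i, j] = dist[j, i] = dtw_distance(sequences[i], sequences[j])
    # Average-linkage hierarchical clustering on the condensed distance matrix.
    labels = fcluster(linkage(squareform(dist), method="average"),
                      n_clusters, criterion="maxclust")
    # One-way ANOVA: does proficiency differ across the behavioral clusters?
    groups = [proficiency[labels == k] for k in np.unique(labels)]
    return labels, f_oneway(*groups)
```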
Related IES Projects: Identifying Risk Factors and Predictors of Literacy Skills for Adults Performing at the Lowest Levels of PIAAC in the US (R305A180299)

Products and Publications

ERIC Citations: Find available citations in ERIC for this award.

Select Publications:

He, Q., Shi, Q., & Tighe, E. L. (2023). Predicting problem-solving proficiency with multiclass hierarchical classification on process data: A machine learning approach. Psychological Test and Assessment Modeling, 65(1), 145–177.

Kaldes, G., Tighe, E. L., & He, Q. (in press). It's about time! Exploring time allocation patterns of adults with lower literacy skills on a digital assessment. Frontiers in Psychology: Educational Psychology, 15.

Zhang, S., Tang, X., He, Q., Liu, J., & Ying, Z. (2024). External correlates of adult digital problem-solving process: An empirical analysis of PIAAC PSTRE action sequences. Zeitschrift für Psychologie, 232(2), 120–136. https://doi.org/10.1027/2151-2604/a000554

Ulitzsch, E., He, Q., & Pohl, S. (2024). Innovations in exploring sequential process data. Zeitschrift für Psychologie, 232(2), 71–73. https://doi.org/10.1027/2151-2604/a000560