
IES Grant

Title: Scenario-Based Assessment in the Age of Generative AI: Making Space in the Education Market for Alternative Assessment Paradigm
Center: NCER
Year: 2024
Principal Investigator: Sabatini, John
Awardee: University of Memphis
Program: Transformative Research in the Education Sciences Grants Program
Award Period: 3 years (07/01/2024 – 06/30/2027)
Award Amount: $3,657,923
Type: Measurement and Development
Award Number: R305T240021
Description:

Co-Principal Investigators: Deane, Paul; Mitros, Piotr; O'Reilly, Tenaha; Magliano, Joseph

Partner Institutions: Georgia State University; Educational Testing Service; Middle Tennessee State University; Capti; Mindtrust; Caimber; Workbay

Purpose: The purpose of this project is to create a generative artificial intelligence (gen-AI) enhanced authoring tool for scenario-based assessments (SBAs). This project builds on a body of IES-funded investments in developing and validating SBAs, which place knowledge and skills in a scenario or practical context so that test-takers can be both observers of and active participants in their own performance. This creates the opportunity for a reflective, metacognitive, and self-regulatory loop that enables instructors to use SBAs for formative assessment purposes while maintaining the psychometric properties necessary to evaluate student success and monitor the quality of instruction. Because SBAs are difficult to develop, college instructors struggle to create and deploy them in their courses. Recent advances in gen-AI make it possible to scale up and democratize SBA development, enabling postsecondary instructors to design and administer localized, personalized, and discipline-appropriate performance assessments that provide better feedback, higher levels of adaptivity, and richer diagnostic information. The project will support the widespread use of gen-AI enabled SBAs in college courses by putting SBA development into the hands of instructors rather than a team of psychometricians and assessment developers.

Project Activities: The initial 3-year phase of the project will adopt a design-based research methodology carried out in three phases. In phase 1, researcher-instructor led development teams will build SBA exemplars (without using the authoring tool) to support development of the authoring and support system modules, while a first version of the framework is drafted. In phase 2, the team will iteratively refine the framework, the SBAs, and the authoring system design; develop a prototype of the instructor user interface; recruit new instructors to develop SBAs with the prototype; and conduct design-based research training workshops to refine the system. In phase 3, the project team will build functionalities that enhance the SBA authoring experience for novice instructors; integrate SBA exemplars as design templates; integrate secondary functionalities for expert users; and refine the professional support systems.

Products: The project team will contribute to publications and professional conferences as data and evidence accrue from SBA studies, instructor experiences, and implementations. Starting in phase 2, the team will address higher-education users and clients by conducting training workshops at conferences and meetings or in synchronous online settings, where they will (a) discuss and share the multimodal literacy framework, (b) demo the SBA prototype, (c) conduct hands-on tutorials using the gen-AI SBA authoring system, and (d) provide access to the other professional support system resources.

Structured Abstract

Setting: The research activities will take place in universities in Tennessee and Georgia.

Sample: The sample will be instructors and students involved in the courses for which the SBAs will be developed.

Project Focus: Technology Product: The project team will develop an SBA authoring and professional support system to enable college instructors to create their own gen-AI enabled SBAs. Other deliverables include (a) a multimodal literacy framework that defines the constructs the SBAs measure, supports task design through an evidence-centered design process, and increases the transparency of the work for all stakeholders; (b) a minimum of 12 fully operable SBAs for instructor and student use; and (c) a professional support system that demonstrates how to design, tailor, and deliver SBAs in college courses.

Research Design and Methods: A design-based research paradigm will support the iterative development and refinement of the products and associated support services for users. Evaluation of SBA tasks will be guided by evidence-centered design principles and psychometric methods to ensure that items and tasks meet appropriate standards for reliability and validity. The project team will also conduct analyses of potential market fit and research process integrity, as well as cost analyses.

Key Measures: Baseline studies will collect data on student academic literacy skills using ETS-developed SBAs and reading tests (Sabatini et al., 2019), along with measures of test anxiety and engagement used previously with college students (e.g., Magliano et al., 2020, 2023).

Data Analytic Strategy: The project team will take a mixed-methods approach: most design-based research studies of usability, feasibility, and appeal will collect and analyze qualitative data sources, while student SBA task data will be analyzed using quantitative psychometric and statistical models.

Cost Analysis: The cost analysis will follow SEER standards and be guided by the IES Cost Analysis Starter Kit (IES, 2020).

Related IES Projects:
Validating Automated Measures of Student Writing and the Student Writing Process to Help Classroom Teachers Implement Formative Assessment Practices (R305A210297)
Developing and Implementing a Technology-Based Reading Comprehension Instruction System for Adult Literacy Students (R305A200413)
Exploring and Assessing the Development of Students' Argumentation Skills (R305A190242)
Developing and Validating Web-administered, Reading for Understanding Assessments for Adult Education (R305A190522)
What Types of Knowledge Matters for What Types of Comprehension? Exploring the Role of Background Knowledge on Students' Ability to Learn from Multiple Texts (R305A150176)
Exploring the onPAR Model in Developmental Literacy Education (R305A150193)
Linguistically-Informed Activity Generation Technology to Support English Learner Content Learning (R305A140472)
A Technology-Rich Teacher Professional Development Intervention that Supports Content-Based Curriculum Development for English Language Learners (R305A100105)
Assessing Reading for Understanding: A Theory-based, Developmental Approach (R305F100005)
Measuring the Development of Vocabulary and Word Learning to Support Content Area Reading and Learning (R305A080647)
Assessing Reading in the 21st Century Conference: Aligning and Applying Advances in the Reading and Measurement Sciences (R305U070002)
Acquiring Research Investigative and Evaluative Skills (ARIES) for Scientific Inquiry (R305B070349)
Developing Reading Comprehension Assessments Targeting Struggling Readers (R305G040065)
Assessing Reading Comprehension with Verbal Protocols and Latent Semantic Analysis (R305G040055)

