Information on IES-Funded Research
Grant Closed

SimScientists Assessment System

NCER
Program: Education Research Grants
Program topic(s): Science, Technology, Engineering, and Mathematics (STEM) Education
Award amount: $1,599,764
Principal investigator: Edys Quellmalz
Awardee: WestEd
Year: 2012
Project type: Measurement
Award number: R305A120390

Purpose

To date, technology has been used mainly to support the logistics of assessment administration and scoring. However, new technology has also shown promise as a tool for developing measures of complex learning that can be useful for instruction. In particular, computer simulations are becoming a component of large-scale science assessments such as the Programme for International Student Assessment and the National Assessment of Educational Progress, and the assessments aligned with the Next Generation Science Standards are likely to include simulations as well. There is a need, then, to develop and validate simulation-based science assessments designed for classroom, district, and state level use.

Project Activities

Following a three-phase process of development, testing, and refinement, the researchers will develop and validate a set of simulation-based assessments for the Life Science strand of the SimScientists Assessment System. This strand will encompass three units taught in middle school: Cells, Human Body Systems, and Ecosystems.

Structured Abstract

Setting

The study will be conducted in urban, suburban, and rural middle schools in California, Massachusetts, Nevada, North Carolina, and Utah.

Sample

The study sample will consist of approximately 40 middle school teachers and 1,000 students.

Assessment

The researchers will develop and validate a vertically aligned set of simulation-based assessments for the Life Science strand encompassing three units taught in middle school: Cells, Human Body Systems, and Ecosystems. Both curriculum-embedded formative assessments intended to monitor progress and summative end-of-unit benchmark assessments will be developed. These simulation-based assessments will be 45-minute, problem-based, multi-task inquiries. In addition, for each of the three Life Science units, the researchers will develop signature tasks (i.e., interactive computer simulations that represent significant, recurring problems addressed by scientists and that students are expected to learn). The signature tasks will be shorter, requiring about 5–10 minutes to complete a set of items. The six signature tasks, two for each of the three Life Science units, will be combined to create the Life Science year-end test, which can be used as a proxy for an interactive computer task component of a district or state test.

Research design and methods

The assessments will undergo three phases of development, testing, and refinement. In Phase 1, the researchers will work with middle school teachers to co-develop simulation assessment components (unit-embedded and benchmark simulation assessments) for the Cells and Human Body Systems units. In addition, the researchers will develop pretest and posttest forms for Cells and Human Body Systems that will serve as external measures. Six signature tasks, two from each of the three units, will be developed. The signature tasks will be short, 5–10 minute simulation-based tasks designed to test key ideas and inquiry practices. A year-end Life Science test will be developed, composed of signature tasks from each of the three Life Science units and a posttest of conventional items covering the content and inquiry targets in the three units.

Once the newly programmed assessments have passed quality assurance screening, cognitive laboratory studies with two teachers and five middle school students will be conducted. The teachers and students will work through the assessments, thinking aloud as they respond to the tasks and questions in the benchmark assessments. Following the first round of cognitive labs, the teachers will try out all of the new assessment components in their classes to examine implementation logistics and technology usability. The two teachers will each have four science classes of approximately 25 students per class. Four students, one from each of the four classes, will participate in a second round of cognitive labs on the embedded and benchmark assessments to more closely examine the feasibility, usability, and construct validity of the tasks and items.

Control condition

For cross-validation purposes, classrooms using the full SimScientists assessment suite will be compared to classrooms using only the year-end Life Science test.

Key measures

The key measures for the study include students' responses on the developed assessments, student and teacher interviews, observations, and questionnaire responses.

Data analytic strategy

Analyses of reliability and validity will be carried out using confirmatory factor analysis, stratified coefficient alpha, uni- and multi-dimensional item response theory, and difficulty estimates. In addition, differential item functioning analyses will be conducted to examine the effects of gender, ethnicity, and English learner (EL) status, and subgroup analyses will examine results for low-performing English learners and students with disabilities.
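
As a concrete illustration of one of these reliability analyses, the sketch below computes a stratified coefficient alpha across the three Life Science units from item-level scores. This is not the project's analysis code: the unit names come from the abstract, but the item counts, simulated data, and function names are assumptions made purely for illustration.

    # Minimal sketch of stratified coefficient alpha (illustrative only).
    import numpy as np

    def cronbach_alpha(items):
        """Cronbach's alpha for an (examinees x items) score matrix."""
        n_items = items.shape[1]
        item_vars = items.var(axis=0, ddof=1)
        total_var = items.sum(axis=1).var(ddof=1)
        return (n_items / (n_items - 1)) * (1 - item_vars.sum() / total_var)

    def stratified_alpha(strata):
        """Stratified alpha: 1 - sum_i var_i * (1 - alpha_i) / var_total,
        with one (examinees x items) score matrix per stratum (unit)."""
        subtest_totals = [m.sum(axis=1) for m in strata.values()]
        total_var = np.sum(subtest_totals, axis=0).var(ddof=1)
        penalty = sum(t.var(ddof=1) * (1 - cronbach_alpha(m))
                      for t, m in zip(subtest_totals, strata.values()))
        return 1 - penalty / total_var

    rng = np.random.default_rng(0)
    ability = rng.normal(size=1000)  # ~1,000 students, as in the study sample

    def simulate_unit(n_items):
        """Simulate dichotomous item scores under a simple 1PL-style model."""
        difficulty = rng.normal(size=n_items)
        p_correct = 1.0 / (1.0 + np.exp(-(ability[:, None] - difficulty[None, :])))
        return (rng.random((ability.size, n_items)) < p_correct).astype(float)

    # Item counts per unit are invented for illustration.
    strata = {
        "Cells": simulate_unit(12),
        "Human Body Systems": simulate_unit(10),
        "Ecosystems": simulate_unit(14),
    }

    for name, matrix in strata.items():
        print(f"{name}: alpha = {cronbach_alpha(matrix):.3f}")
    print(f"Stratified alpha across units: {stratified_alpha(strata):.3f}")

Stratified alpha credits the within-unit homogeneity of each strand rather than treating the year-end test as a single undifferentiated item pool, which is why it appears alongside unidimensional and multidimensional item response theory models in the analytic plan.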

People and institutions involved

IES program contact(s)

Christina Chhin

Education Research Analyst
NCER

Products and publications

Products: The outcomes of the project include a fully developed and validated computerized, simulation-based assessment system focused on middle school Life Science. Peer-reviewed publications will also be produced.

Book chapters

Davenport, J.L., and Quellmalz, E.S. (2017). Assessing Science Inquiry and Reasoning Using Dynamic Visualizations and Interactive Simulations. In R. Lowe, and R. Ploetzner (Eds.), Learning From Dynamic Visualizations: Innovations in Research and Practice (pp. 203-232). New York: Springer.

Quellmalz, E.S., and Silberglitt, M.D. (2017). SimScientists: Affordances of Science Simulations for Formative and Summative Assessment. In H. Jiao and R.W. Lissitz (Eds.), Technology Enhanced Innovative Assessment: Development, Modeling, and Scoring From an Interdisciplinary Perspective (pp. 71-94). Information Age Publishing.

Quellmalz, E.S., Silberglitt, M.D., Buckley, B.C., Loveland, M.T., and Brenner, D.G. (2016). Simulations for Supporting and Assessing Science Literacy. In Y. Rosen, S. Ferrara, and M. Mosharraf (Eds.), Handbook of Research on Technology Tools for Real-World Skill Development (pp. 191-229). Hershey, PA: IGI Global.

Related projects

SimScientists: Interactive Simulation-Based Science Learning Environments

R305A080614

Supplemental information

Co-Principal Investigators: Barbara Buckley, Mark Loveland, Matt Silberglitt, and Daniel Brenner

In Phase 2, the researchers will pilot test and collect evidence in multiple states (Massachusetts, North Carolina, Nevada, and Utah) for initial validation of the unit assessments and the year-end Life Science test to establish their technical quality and feasibility. Pilot tests will take place in the classrooms of 10 middle school Life Science teachers. For Sample 1, it is estimated that each of the 5 teachers will have four classes of roughly 25 students, for a sample of approximately 500 students piloting the full suite of assessments and the year-end Life Science test. For Sample 2, a different set of 5 teachers with approximately 500 students will use only the year-end Life Science test of signature tasks and the posttest.

In Phase 3, the researchers will cross-validate the simulation assessments and study the predictive validity of the classroom-embedded and benchmark assessments on the Life Science year-end test. For each validity study, the researchers will randomly select one class of approximately 25 students from each of 40 teachers, for a total of 1,000 students. The classes will be randomly assigned to use the full SimScientists suite (Sample 3) or to use only the year-end Life Science test (Sample 4).
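
To make the predictive-validity analysis concrete, the short sketch below correlates a benchmark-assessment composite with year-end Life Science test scores and fits a simple linear regression. It is illustrative only, not the project's analysis code: the score scales, simulated relationship, and variable names are assumptions.

    # Minimal sketch of a predictive-validity check (illustrative only).
    import numpy as np

    rng = np.random.default_rng(1)
    n_students = 1000  # roughly 40 classes x 25 students, as described above

    # Hypothetical score scales; the relationship is simulated for illustration.
    benchmark_composite = rng.normal(50, 10, n_students)
    year_end_score = 0.8 * benchmark_composite + rng.normal(0, 6, n_students)

    # Pearson correlation between benchmark and year-end scores.
    r = np.corrcoef(benchmark_composite, year_end_score)[0, 1]

    # Simple linear regression of year-end scores on the benchmark composite.
    slope, intercept = np.polyfit(benchmark_composite, year_end_score, 1)
    predicted = slope * benchmark_composite + intercept
    ss_res = np.sum((year_end_score - predicted) ** 2)
    ss_tot = np.sum((year_end_score - year_end_score.mean()) ** 2)
    r_squared = 1 - ss_res / ss_tot

    print(f"r = {r:.2f}, slope = {slope:.2f}, R^2 = {r_squared:.2f}")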

Questions about this project?

To answer additional questions about this project or provide feedback, please contact the program officer.

 

Tags

Science; Data and Assessments

