IES Grant

Title: Learning Progression-based and NGSS-aligned Formative Assessment for Using Mathematical Thinking in Science
Center: NCER
Year: 2016
Principal Investigator: Jin, Hui
Awardee: Educational Testing Service (ETS)
Program: Science, Technology, Engineering, and Mathematics (STEM) Education
Award Period: 4 years (7/1/2016 – 6/30/2020)
Award Amount: $1,396,496
Type: Measurement
Award Number: R305A160219
Description:

Purpose: The goal of this four-year project was to develop and validate an NGSS-aligned and Learning Progression-based assessment tool for Mathematical Thinking in Science (MTS) at the high school level. MTS refers to abstracting relevant measurable variables from 'messy' phenomena, identifying mathematical relationships among the variables, and using scientific ideas to explain the mathematical relationships. Learning progressions (LPs) are "descriptions of the successively more sophisticated ways of thinking about a topic that can follow one another as children learn about and investigate a topic over a broad span of time" (National Research Council [NRC], 2007, p. 219). The assessment tool contains 1) a mathematics component, where mathematics items and associated rubrics are used to measure students' baseline understanding in 'pure' mathematics; 2) a science component, where an MTS LP, associated items, and associated rubrics are used to measure students' MTS in two NGSS core ideas (energy in physical sciences and ecosystems in life sciences); and 3) a score reporting component that uses an automated scoring engine to generate real-time reports on students' performance in mathematics and MTS.

  • Project Activities: The assessment tool has a mathematics component, a science component, and a score reporting component. To develop the mathematics component, the researchers selected high-quality items and associated rubrics from a prior IES-funded project (R305A100518). To develop the science component, the researchers used an iterative process that included 1) a historical analysis of how mathematics was used in scientific development and revolutions; 2) an interview study with 44 high school students; and 3) a field test with 5,353 high school students. To develop the score reporting component, the researchers used the field test data to build the automated scoring models and collaborated with science teachers in designing the score reports. The fully developed tool was then piloted in a classroom study, in which 19 teachers used the tool to inform their teaching. Student and teacher data were collected to identify the affordances and limitations of the tool.

Key Outcomes: The main findings of this measurement project are as follows:

  • The researchers developed an MTS LP that contains four achievement levels, with each level describing a reasoning pattern that students use to solve science problems and explain real-world phenomena. Together, the four levels present a developmental trend, where students progress towards proficiency in quantification or mathematization (Jin, Delgado, et al., 2019).
  • The researchers developed a validation framework for science learning progressions. The framework was used to guide the validation activities in the project (Jin, van Rijn, et al., 2019).

Structured Abstract

Setting: The assessment tool was developed and evaluated in urban, suburban, and rural high schools located in 14 states in the United States.

Sample: In the project, 44 high school students participated in an interview study; 5,353 high school students participated in a field test; and 19 teachers and their students participated in a classroom pilot study. The participating students were diverse in socioeconomic status and cultural background.

Assessment: The assessment tool contains a mathematics component, an MTS component, and a score reporting component. The mathematics component contains 24 items and associated scoring rubrics. It assesses students' mathematics understanding, with a focus on two middle school concepts that are crucial for many high school science concepts (linear functions and proportional reasoning). The science component contains an MTS LP, associated items, and associated scoring rubrics. It evaluates students' understanding of MTS in two science topics (energy in physical sciences and ecosystems in life sciences). The score reporting component provides real-time diagnostic information about students' MTS understanding and corresponding instructional suggestions.
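The abstract does not specify how a score report is structured; the following is a purely illustrative Python sketch of how one report entry might bundle the three components described above. Every field name and the sample suggestion text are assumptions made for illustration, not the project's actual report schema.

    # Hypothetical sketch of a single student's score-report entry. The real
    # reporting component is not documented here; all names are illustrative.
    from dataclasses import dataclass

    @dataclass
    class ScoreReportEntry:
        student_id: str
        math_baseline_score: float     # proportion correct on the 24 mathematics items
        mts_level: int                 # 1-4, per the four-level MTS learning progression
        science_topic: str             # "energy" or "ecosystems"
        instructional_suggestion: str  # teacher-facing guidance tied to the level

    report = ScoreReportEntry(
        student_id="S001",
        math_baseline_score=0.72,
        mts_level=2,
        science_topic="energy",
        instructional_suggestion=(
            "Ask the student to identify measurable variables in the phenomenon "
            "before describing mathematical relationships among them."
        ),
    )
    print(report)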

Research Design and Methods: The assessment tool was developed and evaluated in an iterative process that included three phases. In Phase 1, the mathematics component was developed using the items and scoring rubrics from a prior IES-funded project (R305A100518). A hypothetical MTS LP was developed based on a historical analysis in which the researchers examined how measurement and quantification enabled the generation of fundamental ideas in five events that contributed significantly to the development and evolution of scientific knowledge. Based on this hypothetical LP, MTS tasks were designed and used in interviews with 44 high school students. The interview data were analyzed and used to revise the LP.

In Phase 2, a pool of 110 MTS items was generated based on the MTS LP and the interview results. The items were reviewed, revised, refined, or dropped based on usability interviews and feedback from the project advisors, science teachers, and assessment experts. This process resulted in 68 MTS items, which were used in a field test with 5,353 high school students. Students' responses in the field test were scored by human raters. Next, quantitative analyses were applied to the scores, and the results were used to validate the LP and associated items. The score reporting component was designed to provide teachers with diagnostic information about student learning and instructional suggestions. The human scores and responses were also used to develop the automated scoring models, which were embedded in the tool to generate real-time score reports.
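The abstract does not describe the architecture of the automated scoring engine. As a hedged illustration only, automated scoring of human-scored constructed responses is often approached as supervised text classification; the sketch below uses scikit-learn, and the function name and data fields are assumptions, not ETS's actual implementation.

    # Illustrative only: a generic supervised approach to scoring constructed
    # responses against human-assigned rubric scores. The project's actual
    # scoring engine is not described in this abstract.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import Pipeline

    def train_scoring_model(responses, human_scores):
        """Fit a classifier that predicts rubric score levels from response text."""
        model = Pipeline([
            ("tfidf", TfidfVectorizer(ngram_range=(1, 2), min_df=2)),
            ("clf", LogisticRegression(max_iter=1000)),
        ])
        model.fit(responses, human_scores)
        return model

    # Hypothetical usage; agreement with human raters (e.g., quadratic weighted
    # kappa on held-out responses) would normally be checked before deployment.
    # model = train_scoring_model(field_test_responses, human_rater_scores)
    # predicted_levels = model.predict(new_student_responses)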

In Phase 3, 19 teachers piloted the tool with their students. Student assessment data (pre- and post-assessments), teaching data (observations of four teachers' lessons), and teacher feedback (interview and survey data) were collected and analyzed to identify the affordances and limitations of the tool.

Control Condition: There is no control condition for this project.

Key Measures: Key measures included a student interview protocol, student assessments, a lesson observation protocol, a teacher interview protocol, and a teacher survey.

Data Analytic Strategy: Advanced item response theory models were used to validate the MTS LP and to analyze the relationship between students' pure mathematics understanding and their MTS understanding. Thematic analysis was used to analyze teacher interview data, survey data, and classroom observation data. The analysis results provide information about the affordances and limitations of the assessment tool.
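The abstract does not name the specific item response theory models used. As a minimal sketch under that caveat, the code below fits a simple Rasch (one-parameter logistic) model to dichotomous item scores by joint maximum likelihood; the project's "advanced" models were presumably more elaborate (for example, polytomous models suited to rubric-scored items), so this is illustrative only.

    # Minimal Rasch model sketch: estimate person abilities and item difficulties
    # from a 0/1 score matrix by joint maximum likelihood. Illustrative only.
    import numpy as np
    from scipy.optimize import minimize

    def fit_rasch(scores):
        """scores: (n_students, n_items) array of 0/1 item scores."""
        n_students, n_items = scores.shape

        def neg_log_likelihood(params):
            theta = params[:n_students]        # person abilities
            beta = params[n_students:]         # item difficulties
            beta = beta - beta.mean()          # location constraint for identifiability
            logits = theta[:, None] - beta[None, :]
            p = 1.0 / (1.0 + np.exp(-logits))
            eps = 1e-9
            return -np.sum(scores * np.log(p + eps) + (1 - scores) * np.log(1 - p + eps))

        result = minimize(neg_log_likelihood, np.zeros(n_students + n_items), method="L-BFGS-B")
        theta_hat = result.x[:n_students]
        beta_hat = result.x[n_students:] - result.x[n_students:].mean()
        return theta_hat, beta_hat

    # Example with simulated data (not project data):
    rng = np.random.default_rng(0)
    true_theta, true_beta = rng.normal(size=200), rng.normal(size=10)
    p = 1 / (1 + np.exp(-(true_theta[:, None] - true_beta[None, :])))
    sim_scores = (rng.random((200, 10)) < p).astype(int)
    abilities, difficulties = fit_rasch(sim_scores)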

Project website: http://ets-cls.org/mts/

Products and Publications

Journal article, monograph, or newsletter

Jin, H., Delgado, C., Bauer, M. I., Wylie, E. C., Cisterna, D., & Llort, K. F. (2019). A hypothetical learning progression for quantifying phenomena in science. Science & Education, 28(9), 1181–1208.

Jin, H., van Rijn, P., Moore, J. C., Bauer, M. I., Pressler, Y., & Yestness, N. (2019). A validation framework for science learning progression research. International Journal of Science Education, 41(10), 1324–1346. doi:10.1080/09500693.2019.1606471

Publicly available data

The project data are stored at ETS' Research Data Repository. Researchers can request project data by sending an email to rdrmailbox@ets.org.
