Grant status: Closed

Learning Progressions in Middle School Science Instruction and Assessment

NCER
Program: Education Research Grants
Program topic(s): Science, Technology, Engineering, and Mathematics (STEM) Education
Award amount: $1,599,931
Principal investigator: Mark Wilson
Awardee: University of California, Berkeley
Year: 2010
Project type: Measurement
Award number: R305A100692

Purpose

State-adopted science standards are designed to describe the level of science proficiency expected of students from kindergarten through Grade 12. However, the standards are often organized into discrete grade levels without attention to developmental continuity across grades. Learning progressions, a new approach to structuring science education, outline the potential cognitive paths students might take as they develop a more sophisticated understanding of a core scientific concept. The purpose of this project is to develop assessments for learning progressions in physical science, together with assessments of students’ scientific reasoning.

Project Activities

The researchers will develop and validate assessments for learning progressions in physical science, focusing on the Structure of Matter, along with assessments of students’ scientific reasoning at Grade 8. The assessments will be developed using the Berkeley Evaluation and Assessment Research (BEAR) System, which is based on four principles: (1) a developmental perspective on student learning; (2) a match between instruction and assessment; (3) management by teachers; and (4) assessments that uphold standards of reliability and validity.

Structured Abstract

Setting

This study will be conducted in middle schools in California.

Sample

The study sample will consist of six to eight Grade 8 classrooms of approximately 35 students per class. The schools in the study are ethnically, culturally, and linguistically diverse.

Assessment

The assessments will focus on two learning progressions – Structure of Matter and Scientific Reasoning. The measures of progress variables to be developed, tested, and refined for the Structure of Matter include: (1) Properties of Objects; (2) Measurement and Data Handling; (3) Density, Mass and Volume; (4) Changes of State; (5) Macro Evidence for Particulate Structure; and (6) Atomic-molecular Theory of Macro Properties. The assessments will measure both students’ content knowledge related to Structure of Matter and students’ ability to reason scientifically.

Research design and methods

The assessments will undergo an iterative process of development and refinement using the BEAR Assessment System. A key feature of the system is that it is centered on progress variables—the “big ideas” around which a curriculum is structured. It uses multiple independent measures to chronicle students’ learning progressions over time through four Building Blocks: the construct map, the items design, the outcome space, and the measurement model. The two learning progressions, Structure of Matter and Scientific Reasoning, are each composed of several progress variables. For each progress variable, the researchers will cycle through the four Building Blocks multiple times to gather empirical evidence, improve the assessments, and establish evidence of reliability and validity.
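To make the Building Blocks concrete, the sketch below shows one way a construct map and an outcome space for a single progress variable could be represented in code. This is a minimal illustration only: the level descriptions and response codes are invented, not taken from the project’s instruments.

```python
# Hypothetical construct map for one progress variable: an ordered set of
# levels of increasingly sophisticated understanding (labels invented).
construct_map = {
    0: "Describes matter by surface features only",
    1: "Distinguishes materials from the objects made of them",
    2: "Uses particle ideas, but applies them inconsistently",
    3: "Explains macroscopic properties with atomic-molecular theory",
}

# Hypothetical outcome space for one item: maps coded responses to
# construct-map levels, tying scoring directly to the progression.
outcome_space = {
    "no_particles": 0,
    "materials_only": 1,
    "particles_partial": 2,
    "particles_full": 3,
}

def score_response(code):
    """Locate a coded item response on the construct map."""
    level = outcome_space[code]
    return level, construct_map[level]

print(score_response("particles_partial"))
# -> (2, 'Uses particle ideas, but applies them inconsistently')
```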

Control condition

Due to the nature of the research design, there is no control condition.

Key measures

The key measures for the study include students’ scores on the embedded assessments, students’ end-of-year mathematics scores on the California Standards Test, and teacher-developed in-class tests.

Data analytic strategy

Validity of the assessments will be evaluated through unidimensional analyses for each progress variable (construct), using the Partial Credit Model to analyze the test items. Differential item functioning (DIF) analyses will also be conducted to investigate whether test items function differently for particular subgroups of students. Reliability of the assessments will be evaluated by examining standard errors of measurement in relation to respondent location, by estimating internal consistency with Cronbach’s alpha, and by computing an alternate-forms reliability coefficient.
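As a concrete illustration of two of the quantities named above, the sketch below computes Partial Credit Model category probabilities and Cronbach’s alpha on simulated data. The step difficulties and score matrix are invented for illustration; operational analyses of this kind are typically run in dedicated IRT software, and the DIF and alternate-forms analyses are not shown.

```python
import numpy as np

def pcm_probabilities(theta, deltas):
    """Category probabilities for one polytomous item under the Partial
    Credit Model: P(X = x | theta) is proportional to
    exp(sum of (theta - delta_j) for j = 1..x), with an empty sum for x = 0.
    """
    steps = np.concatenate(([0.0], np.cumsum(theta - np.asarray(deltas))))
    numerators = np.exp(steps)
    return numerators / numerators.sum()

def cronbach_alpha(scores):
    """Cronbach's alpha for a persons-by-items matrix of item scores."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                          # number of items
    item_vars = scores.var(axis=0, ddof=1)       # per-item variances
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of total scores
    return k / (k - 1) * (1.0 - item_vars.sum() / total_var)

# A 4-category item with three invented step difficulties, scored for a
# student of middling ability (theta = 0.5 logits).
print(pcm_probabilities(theta=0.5, deltas=[-1.0, 0.2, 1.4]))

# Simulated 200-person x 6-item score matrix: a shared "ability" signal
# plus item-level noise, clipped to the 0-3 score range, so the items
# correlate and alpha comes out well above zero.
rng = np.random.default_rng(0)
ability = rng.integers(0, 4, size=(200, 1))
noise = rng.integers(-1, 2, size=(200, 6))
print(cronbach_alpha(np.clip(ability + noise, 0, 3)))
```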

People and institutions involved

IES program contact(s)

Christina Chhin

Education Research Analyst
NCER

Products and publications

Products: The products of this project include embedded assessments that teachers can use to identify where students are in their understanding of particular science concepts (i.e., to locate them on a learning progression) so that appropriate differentiated instruction can be provided. Peer-reviewed publications will also be produced.

Book chapter

Black, P. (2015). The Role of Assessment in Pedagogy - and Why Validity Matters. In D. Wyse, L. Hayward, and J. Pandya (Eds.), The SAGE Handbook of Curriculum, Pedagogy and Assessment (pp. 725-739). London: Sage Publications.

Black, P., and Atkin, M. (2014). The Central Role of Assessment in Pedagogy. In N.G. Lederman, and S.K. Abell (Eds.), Handbook of Research on Science Education, Volume II (pp. 775-790). Abingdon, UK: Routledge.

Szu, E., and Osborne, J.F. (2011). Scientific Reasoning and Argumentation From a Bayesian Perspective. In M.S. Khine (Ed.), Perspectives on Scientific Argumentation (pp. 55-71). Dordrecht, Netherlands: Springer.

Journal article, monograph, or newsletter

Black, P. (2014). Assessment and the Aims of the Curriculum: An Explorer's Journey. Prospects: Quarterly Review of Comparative Education, 44 (4): 487-501.

Black, P. (2015). Formative Assessment - An Optimistic but Incomplete Vision. Assessment in Education: Principles, Policy and Practice, 22 (1): 161-177.

Black, P., and Wiliam, D. (2014). Assessment and the Design of Educational Materials. Educational Designer, 2 (7). Retrieved from http://www.educationaldesigner.org/.

Henderson, J., MacPherson, A., Osborne, J., and Wild, A. (2015). Beyond Construction: Five Arguments for the Role and Value of Critique in Learning Science. International Journal of Science Education, 37 (10): 1668-1697.

Osborne, J. (2013). The 21st Century Challenge for Science Education: Assessing Scientific Reasoning. Thinking Skills and Creativity, 10: 265-279.

Osborne, J. (2014). Teaching Critical Thinking? New Directions in Science Education. School Science Review, 95 (352): 53-62.

Osborne, J.F., Henderson, J.B., MacPherson, A., Szu, E., Wild, A., and Yao, S. (2016). The Development and Validation of a Learning Progression for Argumentation in Science. Journal of Research in Science Teaching, 53 (6): 821-846.

Yao, S.Y., Wilson, M., Henderson, J.B., and Osborne, J. (2015). Investigating the Function of Content and Argumentation Items in a Science Test: A Multidimensional Approach. Journal of Applied Measurement, 16 (2): 171-192.

Project website: https://bearcenter.berkeley.edu/projects/LPS/

Supplemental information

Co-Principal Investigator: Jonathan Osborne

Questions about this project?

For answers to additional questions about this project, or to provide feedback, please contact the program officer.

Tags

Science, Cognition, Data and Assessments
