Grant Closed

Navigating Scientific and Statistical Reasoning in Adolescence

NCER
Program: Education Research Grants
Program topic(s): Cognition and Student Learning
Award amount: $1,399,758
Principal investigator: Priti Shah
Awardee:
University of Michigan
Year: 2017
Award period: 5 years 11 months (07/01/2017 - 06/30/2023)
Project type:
Exploration
Award number: R305A170489

Purpose

The purpose of this project is to identify promising strategies for teaching middle and high school students to critically evaluate scientific evidence presented in everyday contexts, and to generate new knowledge about the factors that affect the development of robust scientific evaluation skills. Individuals must learn to distinguish between good science, bad science, and pseudoscience so that they can make well-informed decisions about health, behavior, and policy. A major goal of K-12 science education is to teach students the inquiry skills necessary to critically evaluate scientific evidence; however, the inquiry skills learned in the science classroom do not transfer well to everyday scientific reasoning contexts. The outcome of this project will be the identification of promising instructional strategies for improving everyday scientific reasoning for transfer to a variety of contexts.

Project Activities

Six studies will systematically identify effective instructional methods for learning to evaluate scientific evidence. Each study compares one or more instructional factors to a control condition that is similar in all other respects. In Years 1 and 2, the research team will conduct three studies focusing on causality bias as a key threat to validity, and will seek to identify effective training examples for critically assessing claims about correlational data. In Years 3 and 4, the research team will conduct three studies that combine the most effective factors from the first three studies. In addition, two of the studies conducted in Years 3 and 4 will incrementally incorporate the other two target threats to validity (selection bias and overgeneralization error).

Structured Abstract

Setting

Participating schools are located in urban and suburban areas of Michigan.

Sample

Approximately 480 seventh- through ninth-grade students will participate in the first four studies (120 in each study), 90 will participate in the fifth study, and 60 will participate in the sixth study.

Intervention

Due to the exploratory nature of this project, there is no intervention. The research team will identify instructional methods for learning evaluation skills about scientific evidence. This work will inform the development of interventions designed to improve students' ability to evaluate scientific evidence.

Research design and methods

Six studies will systematically identify effective instructional methods for learning to evaluate scientific evidence. Each study compares one or more instructional factors to a control condition that is similar in all other respects. In Years 1 and 2, the research team will conduct three studies focusing on causality bias as a key threat to validity, and will seek to identify effective training examples for critically assessing claims about correlational data. In Years 3 and 4, the research team will conduct three studies that combine the most effective factors from the first three studies. In addition, two of the studies conducted in Years 3 and 4 will incrementally incorporate the other two target threats to validity (selection bias and overgeneralization error). In all studies, participants will complete a pre-test, then learn abstract rules for reasoning about threats to validity and evaluate tutorial "media reports" containing scientific findings, and finally complete a post-test. In the final two studies, students will complete a delayed post-test three months after receiving instruction.

Control condition

Each study compares one or more instructional factors with a control condition that is similar in all other respects. The specific control condition differs based on the research question being asked.

Key measures

The key measure is the change in students' performance from pre-test to post-test on a researcher-developed task in which students must evaluate scientific evidence in fictional media articles.

Data analytic strategy

Researchers will use analysis of variance and general linear model methods to compare pre- to post-test changes in performance on evidence evaluation tasks for varied instructional and control conditions.
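The planned comparison of pre- to post-test gains across conditions can be illustrated with a minimal sketch. This is not the project's actual analysis code; the data are simulated, and the group sizes, score scale, and effect sizes are illustrative assumptions only.

```python
# Hypothetical sketch: one-way ANOVA on pre- to post-test gain scores,
# comparing an instructional condition to a control condition.
# All data below are simulated for illustration; nothing here comes
# from the project's actual dataset or analysis plan.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 60  # assumed students per condition (illustrative)

# Simulated pre- and post-test scores (0-100 scale, assumed)
pre_treat = rng.normal(50, 10, n)
post_treat = pre_treat + rng.normal(8, 5, n)   # larger assumed gain
pre_ctrl = rng.normal(50, 10, n)
post_ctrl = pre_ctrl + rng.normal(2, 5, n)     # smaller assumed gain

# Gain scores capture pre- to post-test change for each student
gain_treat = post_treat - pre_treat
gain_ctrl = post_ctrl - pre_ctrl

# One-way ANOVA comparing mean gains across conditions
f, p = stats.f_oneway(gain_treat, gain_ctrl)
print(f"F = {f:.2f}, p = {p:.4g}")
```

With more than two conditions, additional gain-score arrays would simply be passed to `f_oneway`; a general linear model would likewise let the researchers add covariates such as grade level.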

People and institutions involved

IES program contact(s)

Erin Higgins

Education Research Analyst
NCER

Products and publications

Researchers will produce preliminary evidence of potentially promising instructional methods for teaching students to evaluate scientific evidence in a variety of contexts as well as peer-reviewed publications.

Publications:

ERIC Citations: Available citations for this award can be found in ERIC.

Cao, Y., Subramonyam, H., & Adar, E. (2022, March). VideoSticker: A tool for active viewing and visual note-taking from videos. In Proceedings of the 27th International Conference on Intelligent User Interfaces (pp. 672-690).

Fansher, M., Adkins, T. J., & Shah, P. (2022). Graphs do not lead people to infer causation from correlation. Journal of Experimental Psychology: Applied, 28(2), 314.

Franconeri, S. L., Padilla, L. M., Shah, P., Zacks, J. M., & Hullman, J. (2021). The science of visual data communication: What works. Psychological Science in the Public Interest, 22(3), 110-161.

Michal, A. L., & Shah, P. (2024). A practical significance bias in laypeople's evaluation of scientific findings. Psychological Science, 35(4), 315-327.

Michal, A. L., Zhong, Y., & Shah, P. (2021). When and why do people act on flawed science? Effects of anecdotes and prior beliefs on evidence-based decision-making. Cognitive Research: Principles and Implications, 6(1), 28.

Nancekivell, S. E., Sun, X., Gelman, S. A., & Shah, P. (2021). A slippery myth: How learning style beliefs shape reasoning about multimodal instruction and related scientific evidence. Cognitive Science, 45(10), e13047.

Subramonyam, H., Seifert, C., Shah, P., & Adar, E. (2020, April). texSketch: Active diagramming through pen-and-ink annotations. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems (pp. 1-13).

Questions about this project?

To answer additional questions about this project or provide feedback, please contact the program officer.


Tags

Cognition, K-12 Education, Mathematics, Science, Students


