Matthew Easterday, Carnegie Mellon University, Human-Computer Interaction Institute
People have difficulty evaluating information on social policy: previous work shows that when presented with ambiguous causal information, people tend to interpret it as confirming whatever beliefs they already held. A second body of research shows that people can sometimes reason better when using diagrams or diagramming tools. In this experiment, we test the effectiveness of causal diagrams and diagramming tools for teaching students to analyze the effects of different social policies.
In a between-subjects, randomized, controlled experiment, approximately 100 Carnegie Mellon University undergraduates taking introductory social science courses will receive a short on-line training on analyzing social policies with multiple sources of evidence using either: i. diagrammatic representations, ii. diagramming tools, or iii. text only. Data from the on-line course will be logged and analyzed for the effect of diagrams on: i. students' ability to predict the effects of different policies, ii. students' confirmation bias, iii. students' avoidance of the "correlation equals causation" fallacy, and iv. students' credibility bias, i.e., the perception that sources confirming one's beliefs are more credible than sources contradicting them.
Based on the results of the study (to be conducted in late March), we will be able to make claims about whether students should be taught to read diagrams or to use diagramming tools in order to understand social policy, a fundamental skill for an informed citizen.