Postsecondary and Adult Education


The Efficacy of Personal Response Systems (Clickers) as Learning Tools: A Multidisciplinary, Large-Scale, Empirical Evaluation

Year: 2010
Name of Institution:
University of Massachusetts, Dartmouth
Goal: Efficacy and Replication
Principal Investigator:
Shapiro, Amy
Award Amount: $504,246
Award Period: 3 years
Award Number: R305A100625

Description:

Purpose: A Personal Response System (PRS, also known as "Clickers") is a widely used classroom technology that allows instructors to present multiple-choice questions in any classroom equipped with digital projection. Students purchase a remote response device that allows them to "click in" responses, which are recorded by a receiver connected to a computer. PRS is widely used in colleges and universities, but surprisingly few studies have been conducted on PRS-assisted learning outcomes. Previous studies have not systematically investigated the impact of PRS use on student outcomes or the mechanisms through which PRS may increase learning. This project will study the impact of PRS use on students' ability to answer both factual and conceptual exam questions. It will explore the cognition underlying PRS to determine whether its effects are due to real cognitive change (i.e., strengthening or restructuring of memory) or merely to drawing students' attention and study efforts to important material.

Project activities: The performance of students in three introductory college courses (biology, physics, and psychology) over four semesters will be analyzed for final exam questions that were presented in different conditions in class. Instructors will develop course materials that include PRS questions to be presented throughout the course. The effects of PRS use on cognitive processing will be examined by comparing performance on exam questions that were presented during instruction as factual PRS questions, conceptual PRS questions, flagging questions, or simple control questions. Analyses will compare student performance across these conditions and will study the impact of prior knowledge, academic ability, student-reported study behaviors, and other student characteristics on student outcomes.

Products: The products of this project will be published reports describing the impact of personal response system use on student learning in introductory biology, physics, and psychology courses. The reports will provide information on the impact of PRS use on cognitive processing for students with different characteristics.

Structured Abstract

Setting: The study will take place at the University of Massachusetts at Dartmouth.

Population: The study participants include 1,440 students in introductory, college-level courses (280 students in Biology 101; 360 in Physics 114; and 800 in Psychology 202 across four semesters).

Intervention: Each course uses a standard, large lecture format with some multimedia, interactive activities, and student participation integrated into many of the lectures. PowerPoint presentations accompany the lectures. Instructors will use materials developed by the research team, in which PRS questions have been embedded in the instructional PowerPoint presentations. Lecture materials will require students to respond using a personal response system for some material. The information needed to answer each of the 32 final exam questions will be presented in one of four conditions: PRS-Factual, PRS-Conceptual, Attention-Flagging, and Control. For PRS-Factual questions, a PowerPoint slide containing a factual question about a lecture topic will be presented after the instructor has conveyed the information. PRS-Conceptual questions, which require students to apply information to a new situation or problem, will be presented in the same manner. For Attention-Flagging questions, the instructor will highlight the information on the slide but will not ask students to respond using PRS. For Control questions, no additional highlighting of the material will occur.

Research Design and Methods: This study uses a repeated-measures design in which each student participates in all four conditions in the study. In order to control for potential differences in the specific information targeted in the experimental conditions, each question will be rotated to serve in a different condition during the four semesters of the study. Through the repeated-measures design and the counterbalancing of assignment of items to different conditions across students, the study controls for item content and student differences. Fidelity of implementation will be assessed through classroom observation and self-reported records. Student characteristics like motivation, interest, and academic performance that may influence the impact of PRS use will also be studied.
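The rotation described above can be illustrated with a minimal sketch. This is a hypothetical Latin-square assignment (the function name and scheme are illustrative assumptions, not the project's actual materials): each exam question cycles through all four conditions over the four semesters, so every item serves in every condition and each semester's 32 questions split evenly across conditions.

```python
# Illustrative sketch of the counterbalancing described in the abstract.
# The assignment rule is an assumption for demonstration, not the
# project's actual scheme.

CONDITIONS = ["PRS-Factual", "PRS-Conceptual", "Attention-Flagging", "Control"]

def condition_for(question: int, semester: int) -> str:
    """Assign a question (0-31) to a condition in a semester (0-3)
    via a simple Latin-square rotation."""
    return CONDITIONS[(question + semester) % len(CONDITIONS)]

# Across the four semesters, every question appears once in each
# condition, and within any one semester the 32 questions split
# evenly: 8 per condition.
for q in range(4):
    print(q, [condition_for(q, s) for s in range(4)])
```

Rotating items through conditions in this way means that differences between conditions cannot be attributed to the content of particular questions, which is the point of the counterbalancing described above.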

Control Condition: Each student will serve as his or her own control.

Key Measures: The key measures are performance on and memory for in-class material, assessed on a 7-point Likert scale, and performance on the exam questions assigned to each of the four conditions.

Data Analytic Strategy: Analysis of variance will be used to study the impact of factual, conceptual, and flagging questions, and their interactions, on exam performance in each course and across courses. Performance differences on each question across conditions will also be studied using analysis of variance. Regression analyses will be conducted to determine the role of prior knowledge, academic ability, study behavior, and other student variables in the relationship between PRS use and learning outcomes.
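Because each student contributes a score in all four conditions, the condition effect is tested with a repeated-measures analysis of variance, which removes between-student variability before forming the error term. The sketch below (a textbook one-way repeated-measures ANOVA, not the project's actual analysis code; the data are invented) shows how the F statistic is built from the sums of squares.

```python
# Minimal one-way repeated-measures ANOVA sketch. The data and the
# function are illustrative assumptions, not the project's analysis.

def rm_anova(scores):
    """scores[i][j] = score of student i under condition j.
    Returns (F, df_condition, df_error)."""
    n = len(scores)        # students
    k = len(scores[0])     # conditions
    grand = sum(sum(row) for row in scores) / (n * k)
    cond_means = [sum(row[j] for row in scores) / n for j in range(k)]
    subj_means = [sum(row) / k for row in scores]
    # Partition the total sum of squares: condition effect, subject
    # effect (removed from the error term), and residual error.
    ss_cond = n * sum((m - grand) ** 2 for m in cond_means)
    ss_subj = k * sum((m - grand) ** 2 for m in subj_means)
    ss_total = sum((x - grand) ** 2 for row in scores for x in row)
    ss_error = ss_total - ss_cond - ss_subj
    df_cond, df_error = k - 1, (k - 1) * (n - 1)
    f = (ss_cond / df_cond) / (ss_error / df_error)
    return f, df_cond, df_error

# Invented scores for three students under the four conditions:
demo = [[5, 7, 6, 4],
        [6, 8, 6, 5],
        [4, 6, 5, 4]]
f, df1, df2 = rm_anova(demo)
print(f"F({df1}, {df2}) = {f:.2f}")  # ≈ 28.0 for these invented scores
```

The subject sum of squares is subtracted out precisely because each student serves as his or her own control: only within-student variation across conditions counts as error.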

Products and Publications

Journal article, monograph, or newsletter

Shapiro, A. M., Sims-Knight, J., O'Rielly, G. V., Capaldo, P., Pedlow, T., Gordon, L., and Monteiro, K. (2017). Clickers Can Promote Fact Retention but Impede Conceptual Understanding: The Effect of the Interaction Between Clicker Use and Pedagogy on Learning. Computers & Education, 111, 44–59.