WWC review of this study

Classroom response systems facilitate student accountability, readiness, and learning.

Jones, S. J., Crandall, J., Vogler, J. S., & Robinson, D. H. (2013). Journal of Educational Computing Research, 49(2), 155-171.

  • Examining 88 students, grade PS (postsecondary)

Reviewed: July 2020

No statistically significant positive findings

Meets WWC standards without reservations

Academic achievement outcomes: Substantively important positive effect found for the domain
| Outcome measure | Comparison | Period | Sample | Intervention mean | Comparison mean | Significant? | Improvement index | Evidence tier |
|---|---|---|---|---|---|---|---|---|
| Unit Exam: Experiment 3 | Classroom Response System (CRS) vs. business as usual | 0 Days | Full sample: Experiments 2 and 3; 39 students | 24.41 | 21.39 | No | -- | |
| Unit Exam: Experiment 2 | Classroom Response System (CRS) vs. business as usual | 0 Days | Full sample: Experiments 2 and 3; 39 students | 22.16 | 19.61 | No | -- | |
| Unit Exam: Experiment 1 | Classroom Response System (CRS) vs. business as usual | 0 Days | Full sample: Experiment 1 (iClicker CRS); 49 students | 24.42 | 23.96 | No | -- | |


Evidence Tier rating based solely on this study. This intervention may achieve a higher tier when combined with the full body of evidence.

Characteristics of study sample as reported by study author.


  • Female: 83%
    Male: 17%
    South

Setting

The study was conducted at a large, south-central public university; students were undergraduates enrolled in an educational psychology course. Experiment 1 took place during the Fall 2009 semester, while Experiments 2 and 3 took place during the Spring 2010 semester.

Study sample

For Experiment 1, 80 percent of the randomized sample was female. For Experiments 2 and 3, 87 percent of the randomized sample was female. The study does not provide further information on student demographics or characteristics of the analytic samples.

Intervention Group

[Classroom Response Systems] For Experiment 1, the iClicker CRS was used during two 75-minute lectures that included a total of eight multiple-choice questions. For Experiment 2, the iClicker CRS was used along with the Mobile Ongoing Course Assessment (MOCA) response system. For Experiment 3, the MOCA response system was used to allow students to answer questions outside of class time.

Comparison Group

[Business as usual] For Experiment 1, students in the comparison condition saw the same multiple-choice questions during Unit 4 lectures as students in the intervention condition, but they did not have access to the CRS to answer the questions and could not earn bonus points for Unit 4. For Experiment 2, students in the comparison condition were told, prior to Unit 2, that they would not be answering in-class questions until Unit 3; however, when they arrived for the first Unit 2 lecture, they were told they could use the CRS to participate in the in-class questions but would not receive points until the following unit. For Experiment 3, students in the comparison condition could view, using the MOCA CRS, the same 10 pre-lecture questions before each of the two Unit 4 lectures as students in the intervention condition, but only students in the intervention condition received points for correct responses.

Support for implementation

The iClicker CRS and MOCA response systems are the tools and devices identified as being used by the intervention group. The study provides no information about implementation support for course instructors.

 
