The study took place in middle school classrooms in the United States. The teachers were using the ASSISTments software as part of their regular classroom instruction. No other information about the study setting was reported.
A total of 246 students in grades 5, 6, and 7, taught by six teachers, were included in the study. The number of schools included in the study was not reported.
The study consisted of two experiments. In the first experiment, 103 middle school students were taught by one grade 5 teacher and one grade 6 teacher. Students were randomly assigned either to a no-feedback comparison group (58 students) or to a correct-answer feedback intervention group (45 students). In the second experiment, 143 middle school students were taught by two grade 6 teachers and two grade 7 teachers. Students were randomly assigned to a no-feedback comparison group (25 students) or to one of three intervention groups: correct-answer feedback (44 students), explanation feedback (41 students), or try-again feedback (33 students).
The study did not report any sample characteristics for the first experiment. In the second experiment, approximately 47% of students were female. The study did not report information about free or reduced-price lunch eligibility, English learner status, disabilities, race, or ethnicity for the second experiment.
The intervention is the practice of providing feedback to individual students using the ASSISTments software. In each experiment, students took a pretest, completed one algebra homework assignment within three days of the pretest, and then took a posttest on the same day that they completed the homework assignment, all using ASSISTments. The homework assignment included two worked examples followed by problems that students had to solve on their own. Students received immediate, computer-generated feedback for each homework problem they completed.
In the first experiment, students in the intervention group received correct-answer feedback. If the student solved the problem correctly, the ASSISTments software displayed a green checkmark with the word “Correct!” If the student solved the problem incorrectly, the software displayed a red “X” and provided the right answer.
The second experiment had three intervention groups, defined by the type of feedback received: correct-answer, explanation, or try-again. Students in the correct-answer group received the same feedback as in the first experiment. The explanation and try-again feedback groups also received a green checkmark with the word “Correct!” after a correct answer, but the feedback for an incorrect answer differed. After a wrong answer, the ASSISTments software provided students in the explanation feedback group with the correct answer, an explanation for why the answer was correct, and a worked example that showed how to solve the problem. In the try-again feedback group, the ASSISTments software displayed a message after an incorrect answer that stated, “Sorry, try again. [Student’s answer] is not correct.” Students in this group could then submit other responses until they obtained the correct answer, or they could click a button to obtain the correct answer.
In both experiments, students in the no-feedback comparison group completed their algebra homework assignment using the ASSISTments software but were not informed whether their answers were correct or incorrect. After students submitted an answer, the software provided an “answer recorded” message and students clicked a button to progress to the next question.
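The four study conditions described above can be summarized as a simple branching rule on the student's submitted answer. The sketch below is purely illustrative and is not the actual ASSISTments implementation; all names (`FeedbackCondition`, `give_feedback`) and the exact message strings beyond those quoted in the study are assumptions.

```python
from enum import Enum

class FeedbackCondition(Enum):
    # The comparison condition plus the three intervention conditions.
    NO_FEEDBACK = "no_feedback"
    CORRECT_ANSWER = "correct_answer"
    EXPLANATION = "explanation"
    TRY_AGAIN = "try_again"

def give_feedback(condition, is_correct, student_answer, correct_answer,
                  explanation="", worked_example=""):
    """Return the message a student would see after submitting an answer."""
    if condition is FeedbackCondition.NO_FEEDBACK:
        # Comparison group: only acknowledge that the answer was recorded.
        return "Answer recorded."
    if is_correct:
        # All three intervention conditions confirm a correct answer.
        return "Correct!"
    if condition is FeedbackCondition.CORRECT_ANSWER:
        return f"Incorrect. The correct answer is {correct_answer}."
    if condition is FeedbackCondition.EXPLANATION:
        # Correct answer plus an explanation and a worked example.
        return (f"Incorrect. The correct answer is {correct_answer}. "
                f"{explanation} {worked_example}")
    # Try-again group: student may resubmit or request the answer.
    return f"Sorry, try again. {student_answer} is not correct."
```

The sketch makes the contrast between conditions explicit: only the no-feedback branch withholds correctness information, and the three intervention branches differ solely in what follows an incorrect answer.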
Support for implementation
The study did not describe any support for implementation.