
The effects of metacognitive training versus worked-out examples on students’ mathematical reasoning.
Mevarech, Z. R., & Kramarski, B. (2003). British Journal of Educational Psychology, 73(4), 449–471.
Examining 122 students, grade 8
Practice Guide
Review Details
Reviewed: December 2022
- Practice Guide (findings for Metacognitive training, Mevarech & Kramarski (2003))
- Randomized Controlled Trial
- Meets WWC standards without reservations because it is a randomized controlled trial with low attrition.
This review may not reflect the full body of research evidence for this intervention.
Evidence Tier rating based solely on this study. This intervention may achieve a higher tier when combined with the full body of evidence.
Findings
| Outcome measure | Comparison | Period | Sample | Intervention mean | Comparison mean | Significant? | Improvement index | Evidence tier |
|---|---|---|---|---|---|---|---|---|
| Algebraic Representation Post-test | Metacognitive training (Mevarech & Kramarski, 2003) vs. Intervention | 0 Days | Full sample | 8.43 | 5.62 | No | -- | |

Supplemental findings:

| Outcome measure | Comparison | Period | Sample | Intervention mean | Comparison mean | Significant? | Improvement index | Evidence tier |
|---|---|---|---|---|---|---|---|---|
| Algebraic Representation Post-test | Metacognitive training (Mevarech & Kramarski, 2003) vs. Intervention | 1 Year | Full sample | 6.83 | 6.47 | No | -- | |
| Outcome measure | Comparison | Period | Sample | Intervention mean | Comparison mean | Significant? | Improvement index | Evidence tier |
|---|---|---|---|---|---|---|---|---|
| Algebraic Solution Post-test | Metacognitive training (Mevarech & Kramarski, 2003) vs. Intervention | 0 Days | Full sample | 8.81 | 5.72 | No | -- | |
| Procedural Knowledge (Total Score; Author-Created) | Metacognitive training (Mevarech & Kramarski, 2003) vs. Intervention | 0 Days | Full sample | 79.19 | 72.22 | No | -- | |

Supplemental findings:

| Outcome measure | Comparison | Period | Sample | Intervention mean | Comparison mean | Significant? | Improvement index | Evidence tier |
|---|---|---|---|---|---|---|---|---|
| Procedural Knowledge (Total Score; Author-Created) | Metacognitive training (Mevarech & Kramarski, 2003) vs. Intervention | 1 Year | Full sample | 76.83 | 68.51 | No | -- | |
| Algebraic Solution Post-test | Metacognitive training (Mevarech & Kramarski, 2003) vs. Intervention | 1 Year | Full sample | 10.20 | 9.73 | No | -- | |
Sample Characteristics
Characteristics of study sample as reported by study author.
International
Study Details
Setting
The study takes place in five classrooms in Israel. Israel is an acceptable study location according to the High School Algebra Review Protocol since the study is published in English.
Study sample
The study reported limited information on sample characteristics. The students were Israeli, and the mean age of students in the sample was 14.12 years.
Intervention Group
The intervention, metacognitive training (MT), is a replicable instructional practice. Students were guided to activate metacognitive questioning as developed in the IMPROVE method (Mevarech & Kramarski, 1997). Students formulated and answered the metacognitive questions while they worked on problems in small groups. Each new type of problem was followed by a series of four practice problems that required a solution through metacognitive questioning. Each student in turn read a problem aloud and tried to solve it using the metacognitive questioning procedure. When a student failed to solve the problem, or when solvers disagreed, the students discussed the solution until a consensus was reached; they asked the teacher for help only when they failed to reach an agreement.

The IMPROVE method emphasizes reflective discourse by giving each student the opportunity to engage in mathematical reasoning in small groups via four kinds of self-addressed metacognitive questions: comprehension questions, connection questions, strategic questions, and reflection questions.

The comprehension questions were designed to prompt students to reflect on the problem/task before solving it. In addressing a comprehension question, students had to read the problem/task aloud, describe the task in their own words, and try to understand what the task/concepts mean. The comprehension questions included questions such as: ‘What is the problem/task all about?’; ‘What is the question?’; ‘What are the meanings of the mathematical concepts?’

The connection questions were designed to prompt students to focus on similarities and differences between the problem/task they were working on and the problems/tasks they had already solved. For example: ‘How is this problem/task different from/similar to what you have already solved? Explain why.’

The strategic questions were designed to prompt students to consider which strategies are appropriate for solving the given problem/task and for what reasons. In addressing the strategic questions, students had to describe the what (e.g., ‘What strategy/tactic/principle can be used in order to solve the problem/task?’), the why (e.g., ‘Why is this strategy/tactic/principle most appropriate for solving the problem/task?’), and the how (e.g., ‘How can I organize the information to solve the problem/task?’; ‘How can the suggested plan be carried out?’).

The reflection questions were designed to prompt students to reflect on their understanding and feelings during the solution process (e.g., ‘What am I doing?’; ‘Does it make sense?’; ‘What difficulties/feelings do I face in solving the task?’; ‘How can I verify the solution?’; ‘Can I use another approach for solving the task?’).
Comparison Group
The comparison condition involved students using worked examples (WE). A worked example specified the steps in the solution process and provided written explanations where needed. Students studied the example and then solved four practice problems using the same methods as the worked example. Students worked in small groups of four, and each student took a turn solving a problem aloud.
Improving Mathematical Problem Solving in Grades 4 Through 8
Review Details
Reviewed: May 2012
- Randomized controlled trial
- Meets WWC standards without reservations
This review may not reflect the full body of research evidence for this intervention.
Evidence Tier rating based solely on this study. This intervention may achieve a higher tier when combined with the full body of evidence.
Findings
Sample Characteristics
Characteristics of study sample as reported by study author.
- Race: White, 100%
- Ethnicity: Not Hispanic or Latino, 100%
An indicator of the effect of the intervention, the improvement index can be interpreted as the expected change in percentile rank for an average comparison group student if that student had received the intervention.
For more, please see the WWC Glossary entry for improvement index.
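As a rough, non-authoritative illustration of that conversion, the sketch below derives an improvement index from a standardized effect size (Hedges' g), assuming the WWC convention of mapping the effect size to a percentile shift through the standard normal CDF. The effect size in the example is a hypothetical value, not a result reported for this study.

```python
# A minimal sketch, assuming improvement index = percentile rank of the
# effect size under the standard normal distribution, minus the 50th
# percentile. The g value below is hypothetical.
import math

def normal_cdf(x: float) -> float:
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def improvement_index(hedges_g: float) -> float:
    """Expected percentile-rank change for an average comparison student."""
    return 100.0 * normal_cdf(hedges_g) - 50.0

print(round(improvement_index(0.25), 1))  # hypothetical g of 0.25 -> about +9.9 points
```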
An outcome is the knowledge, skills, or attitudes attained as a result of an activity. An outcome measure is an instrument, device, or method that provides data on the outcome.
A finding that is included in the effectiveness rating. Excluded findings may include subgroups and subscales.
The sample on which the analysis was conducted.
The group to which the intervention group is compared, which may include a different intervention, business as usual, or no services.
The timing of the post-intervention outcome measure.
The number of students included in the analysis.
The mean score of students in the intervention group.
The mean score of students in the comparison group.
The WWC considers a finding to be statistically significant if the likelihood that the finding is due to chance alone, rather than a real difference, is less than five percent.
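As a hedged illustration of that five-percent criterion, the sketch below runs a two-sided two-sample t-test and checks whether the resulting p-value falls below .05. The score lists are hypothetical placeholders, not data from the studies reviewed here, and the WWC's own analyses may use different statistical models.

```python
# A minimal sketch of a p < .05 check using a two-sample t-test.
# The scores below are hypothetical, not taken from this review.
from scipy import stats

intervention_scores = [8, 9, 7, 10, 9, 8, 7, 9]   # hypothetical
comparison_scores = [6, 5, 7, 6, 5, 6, 7, 5]      # hypothetical

t_stat, p_value = stats.ttest_ind(intervention_scores, comparison_scores)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
print("Statistically significant at the WWC threshold:", p_value < 0.05)
```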
The WWC reviews studies for WWC products, Department of Education grant competitions, and IES performance measures.
The name and version of the document used to guide the review of the study.
The version of the WWC design standards used to guide the review of the study.
The result of the WWC assessment of the study. The rating is based on the strength of evidence of the effectiveness of the intervention. Studies are given a rating of Meets WWC Design Standards without Reservations, Meets WWC Design Standards with Reservations, or Does Not Meet WWC Design Standards.
A related publication that was reviewed alongside the main study of interest.
Study findings for this report.
Based on the direction, magnitude, and statistical significance of the findings within a domain, the WWC characterizes the findings from a study as one of the following: statistically significant positive effects, substantively important positive effects, indeterminate effects, substantively important negative effects, and statistically significant negative effects. For more, please see the WWC Handbook.
The WWC may review studies for multiple purposes, including different reports and re-reviews using updated standards. Each WWC review of this study is listed in the dropdown. Details on any review may be accessed by making a selection from the dropdown list.
- Tier 1 Strong indicates strong evidence of effectiveness,
- Tier 2 Moderate indicates moderate evidence of effectiveness, and
- Tier 3 Promising indicates promising evidence of effectiveness,
as defined in the non-regulatory guidance for ESSA and the regulations for ED discretionary grants (EDGAR Part 77).