
IES Grant

Title: Exploring the Mediators and Moderators of Metacomprehension Accuracy
Center: NCER
Year: 2016
Principal Investigator: Wiley, Jennifer
Awardee: University of Illinois, Chicago
Program: Postsecondary and Adult Education
Award Period: 4 years (9/1/2016-8/31/2020)
Award Amount: $1,394,684
Type: Exploration
Award Number: R305A160008
Description:

Co-Principal Investigator(s): Griffin, Thomas

Purpose: In this project, the researchers explored postsecondary students' ability to monitor comprehension and learning processes and its connection to student outcomes. This ability, known as metacomprehension, helps learners to estimate how well they understand texts and to determine when they need to restudy them. Many students do not estimate their comprehension accurately and are unable to engage in effective self-regulated learning. Previous research has found that metacomprehension skills can be taught, but it is not clear whether all students benefit equally. This project explored (a) postsecondary students' metacomprehension skills in the context of a gateway course as they first transition to college and (b) the malleability of these skills, paying particular attention to students who fall below college-readiness standards.

Project Activities: Researchers conducted a series of exploratory studies with first-year students enrolled in introductory psychology classes. These studies used experimental designs to vary the type and amount of instruction students received in activities intended to support comprehension monitoring, to test for effects of these manipulations, and to test whether subgroups of students responded differently.

Key Outcomes: The main findings of this project are as follows:

  • Most students showed poor metacomprehension accuracy at the start of the course. Average accuracy was lower in this authentic course context, where the assigned readings covered closely related topics, than has generally been observed in laboratory experiments, which use more diverse sets of readings (Guerrero, Griffin, & Wiley, 2022).
  • Students who engaged in self-explanation activities as part of their reading assignments showed significantly greater gains in metacomprehension accuracy from before to after instruction than students who engaged in example-generation or practice-testing activities. Even less-prepared students benefited from the explanation activities (Guerrero, Griffin, & Wiley, 2020).
  • Students showed no benefits from engaging in practice-testing activities (Guerrero, Griffin, & Wiley, 2020) or example-generation activities.
  • Students who engaged in prediction-generating activities saw relatively weak benefits in metacomprehension (Guerrero, Griffin, & Wiley, 2022).
  • Students showed improved ability to engage in effective self-regulated learning of material later in the course after engaging in explanation-generation activities. However, this benefit appeared only when the explanation activities immediately preceded restudy opportunities. When there was a delay between the explanation activities and the restudy opportunities, students did not show improved learning outcomes from the explanation manipulation (Wiley, Guerrero, Hildenbrand, & Griffin, 2022).
  • Less-prepared students who completed an augmented intervention scored as well on their course exams as other students. The augmented intervention supplemented the basic intervention (explanation activities with practice tests and feedback) with lessons on comprehension monitoring, on the structure of expository texts that convey theory and evidence, on the distinction between test questions that require inferences and those that require only memory for the text, and on how to answer inference questions. The augmented intervention eliminated most of the negative effect of being a less-prepared student on course exam scores (Griffin, Guerrero, Mielicki, & Wiley, 2020).

Structured Abstract

Setting: The research site was a large, urban Illinois university with a high rate of at-risk students (e.g., low-income, non-native English speaking, first-generation, and under-prepared students).

Sample: Approximately 1,000 first-year undergraduate students participated in this study.

Factors: When studying, postsecondary students need to monitor whether they understand what they have read to determine when they need to restudy materials. This ability to monitor and regulate one's study (i.e., self-regulation) depends on metacomprehension skills (i.e., the ability to reflect on one's comprehension and understanding). Various instructional activities may help to support students' metacognition and self-regulation skills. In this project, the researchers explored different activities that may improve students' metacomprehension skills. The first activity was having students generate explanations while studying before retaking practice tests. The second activity was having students generate examples while studying before retaking the practice tests. The third activity was simply giving students a chance to reread and retake the practice tests. In an additional exploratory study, the researchers had students generate predictions to connect theories and evidence from the readings before retaking practice tests.

Research Design and Methods: Students completed baseline measures (reading and metacomprehension assessments) during the first 2 weeks of classes, followed by the manipulations and a post-test. Phase 1 tested the impact of explanation, example generation, and practice testing activities on metacomprehension accuracy. Phase 2 tested the effects of the explanation activity, which was the most beneficial in improving metacomprehension accuracy in Phase 1, on the ability to engage in effective self-regulated learning of content later in the course.

Control Condition: Students in the control conditions completed the same reading assignments and studied the same materials but did not receive additional activities or instruction on how to use such activities to support comprehension monitoring.

Key Measures: The researchers used multiple-choice inference-based questions to determine students' comprehension of the reading assignments. They used students' self-reported judgments of comprehension to compute measures of metacomprehension accuracy. The researchers also collected student characteristic data, including measures of basic reading comprehension (e.g., ACT scores, vocabulary) and demographics (e.g., age, gender, race/ethnicity, socio-economic status, English learner status).
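
For context, metacomprehension accuracy in this literature is typically operationalized as relative accuracy: the within-student association between comprehension judgments and test performance across texts. The abstract does not specify the exact index used here, so the following is a minimal sketch in Python, assuming the commonly used Goodman-Kruskal gamma and purely illustrative data.

```python
from itertools import combinations

def goodman_kruskal_gamma(judgments, scores):
    """Relative metacomprehension accuracy for one student:
    (concordant - discordant) / (concordant + discordant) pairs of texts.
    Values near +1 mean the student's comprehension judgments track
    their actual test performance; values near 0 mean they do not."""
    concordant = discordant = 0
    for (j1, s1), (j2, s2) in combinations(zip(judgments, scores), 2):
        product = (j1 - j2) * (s1 - s2)
        if product > 0:
            concordant += 1
        elif product < 0:
            discordant += 1
    pairs = concordant + discordant
    return float("nan") if pairs == 0 else (concordant - discordant) / pairs

# Hypothetical data for one student: a comprehension judgment (1-7 scale)
# and an inference-test score for each of five assigned readings.
judgments = [6, 3, 5, 2, 4]
test_scores = [4, 2, 5, 1, 3]
print(goodman_kruskal_gamma(judgments, test_scores))  # 0.8
```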

Data Analytic Strategy: Analyses were conducted using general linear model and linear mixed model approaches.
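
The abstract does not report the model specifications, so the sketch below is only an illustration of a linear mixed model of the general kind described, fit in Python with statsmodels; the column names (accuracy, condition, readiness, student_id) are hypothetical.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format data: one row per student per assessment phase
# (e.g., pre- and post-instruction), with the assigned activity condition,
# a college-readiness indicator, and a metacomprehension accuracy score.
df = pd.read_csv("metacomprehension_scores.csv")

# Linear mixed model: fixed effects for activity condition, readiness,
# and their interaction; random intercepts for students to account for
# repeated measurements from the same person.
model = smf.mixedlm(
    "accuracy ~ condition * readiness",
    data=df,
    groups=df["student_id"],
)
result = model.fit()
print(result.summary())
```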

Products and Publications

Publicly Available Data: Datasets are being made publicly available as manuscripts are published. Interested individuals can contact the research team.

Project Website: https://jwiley.people.uic.edu/ies.html

ERIC Citations: Available citations for this award can be found in ERIC.

Select Publications:

Book chapters

Griffin, T. D., Mielicki, M. K., & Wiley, J. (2019). Improving students' metacomprehension accuracy. In J. Dunlosky & K. Rawson (Eds.), Cambridge Handbook of Cognition and Education (pp. 619–646). New York, NY: Cambridge University Press.

Wiley, J., Jaeger, A. J., & Griffin, T. D. (2018). Effects of instructional conditions on comprehension from multiple sources in history and science. In J. L. G. Braasch, I. Bråten, & M. T. McCrudden (Eds.), Handbook of Multiple Source Use (pp. 341–361). Routledge.

Wiley, J., & Guerrero, T. (2018). Prose comprehension beyond the page. In K. Millis, J. Magliano, D. Long, & K. Wiemer (Eds.), Deep Comprehension: Multi-Disciplinary Approaches to Understanding, Enhancing, and Measuring Comprehension (pp. 3–15). New York, NY: Routledge.

Wolfe, M., & Griffin, T. D. (2017). Beliefs and discourse processing. In M. F. Schober, D. N. Rapp, & M. A. Britt (Eds.), Handbook of Discourse Processes (2nd ed., pp. 295–314). Taylor & Francis/Routledge.

Journal articles

Griffin, T. D., Wiley, J., & Thiede, K. W. (2019). The effects of comprehension-test expectancies on metacomprehension accuracy. Journal of Experimental Psychology: Learning, Memory, and Cognition, 45(6), 1066–1092.

Guerrero, T. A., Griffin, T. D., & Wiley, J. (2022). I think I was wrong: The effect of making experimental predictions on learning about theories from psychology textbook excerpts. Metacognition & Learning, 17(2), 337–373.

Guerrero, T. A., & Wiley, J. (2021). Expecting to teach affects learning during study of expository texts. Journal of Educational Psychology, 113(7), 1281–1303.

Wiley, J., Jaeger, A. J., Taylor, A. R., & Griffin, T. D. (2018). When analogies harm: The effects of analogies on metacomprehension. Learning and Instruction, 55, 113–123.

Proceedings

Griffin, T. D., Guerrero, T., Mielicki, M. K., & Wiley, J. (2020). Improving metacomprehension and exam grades of students at-risk for failure via explanation and inference-test instruction. Paper available in the Online Paper Repository for the 2020 American Educational Research Association (AERA) Annual Meeting. San Francisco, CA.

Guerrero, T., Griffin, T. D., & Wiley, J. (2020). Generating explanations is more helpful than practice testing alone for improving comprehension and metacomprehension. Paper available in the Online Paper Repository for the 2020 American Educational Research Association (AERA) Annual Meeting. San Francisco, CA.

Guerrero, T. A., Griffin, T. D., & Wiley, J. (2020). How do predictions change learning from science texts? (No. 3839). EasyChair.

Guerrero, T. A., & Wiley, J. (2018). Effects of text availability and reasoning processes on test performance. In Proceedings of the 40th Annual Conference of the Cognitive Science Society (pp. 1745–1750). Madison, WI: Cognitive Science Society.

Guerrero, T. A., & Wiley, J. (2019). Using "idealized peers" for automated evaluation of student understanding in an introductory psychology course. In Proceedings of the 20th International Conference on Artificial Intelligence in Education (pp. 133–143). Springer, Cham.

Hildenbrand, L., & Wiley, J. (2021). Can closed-ended practice tests promote understanding from text? In Proceedings of the Annual Meeting of the Cognitive Science Society (Vol. 43, No. 43).

Wiley, J., Guerrero, T. A., Hildenbrand, L., & Griffin, T. D. (2022). Exploring the boundaries: When explanation activities do not improve comprehension. Paper available in the Preprint Repository for the 32nd Annual Meeting of the Society for Text & Discourse (No. 9587). EasyChair.

