Information on IES-Funded Research
Grant Closed

Multiple-choice Online Cloze Comprehension Assessment (MOCCA): Refining and Validating a Measure of Individual Differences in Reading Comprehension Processes During Reading

NCER
Program: Education Research Grants
Program topic(s): Literacy
Award amount: $1,599,776
Principal investigator: Gina Biancarosa
Awardee:
University of Oregon
Year: 2014
Project type:
Measurement
Award number: R305A140185

Purpose

Numerous children struggle with reading comprehension, and previous research shows that a subset of these children struggle with higher-level reading skills, such as maintaining causal coherence while reading. Causal coherence is created through inferences that require the reader to synthesize why an event occurs based on relevant goals previously identified in the text and to generate missing information from background knowledge. Students who struggle with causal coherence typically fall into one of two types and tend to respond differently to intervention depending on their type. The first type tends to paraphrase what they have already read. The second type tends to make lateral connections, which are inferences or personal associations that are not causally related to the text. However, without an assessment to distinguish between these types of poor comprehenders, teachers may be unable to provide the appropriate intervention for each student. The Multiple-choice Online Cloze Comprehension Assessment (MOCCA) is a paper-and-pencil assessment designed to diagnose specific types of poor comprehension in third- through fifth-grade students. Each multiple-choice question has a correct answer, an answer that indicates the student has paraphrased, and an answer that indicates a lateral connection. In this project, the researchers refined, expanded, and validated MOCCA, creating multiple forms for each grade level and converting the test to an online platform.

Project Activities

The project team developed an initial pool of 150 items for each grade. Teachers provided content validity feedback on the items. Items were pilot tested in the first year and field tested in two large-scale trials in Years 2 and 3. Additionally, the team began work on moving the administration of MOCCA from paper-and-pencil to a computer-based platform. Third- through fifth-grade students completed think-alouds each year to provide the researchers with information about how the items were functioning.

Structured Abstract

Setting

This study took place in school districts in Oregon and California. Additionally, a nationally representative sample of students from around the U.S. participated in the field trials.

Sample

Participants included 9,259 third- through fifth-grade students across 4 years of the project.

Assessment

The researchers refined and validated the Multiple-choice Online Cloze Comprehension Assessment (MOCCA), an assessment designed to diagnose specific types of poor comprehension in third- through fifth-grade students. MOCCA is computer based and includes three forms per grade. Each item on the assessment is a seven-sentence narrative with the sixth sentence deleted. The missing sixth sentence resolves the goal of the story. Students are asked to choose the best sentence from among four choices to insert as the sixth sentence. One of the answers is correct (namely, it is causally coherent with the narrative), one is a paraphrase, and one is a lateral connection (namely, an inference or personal association that is not causally coherent with the text). Each form includes 40 items.

Research design and methods

In order to create additional forms of the assessment for each grade level, the research team developed an initial pool of 150 items per grade. Teachers provided content validity feedback. In Year 1, the MOCCA items were piloted with students in third through fifth grade. A subsample of students completed think-alouds to provide information regarding the functioning of the items. In Year 2, following revisions to the items, students participated in the first MOCCA field test, with additional think-aloud data collected from a subsample of students. Following further revisions, a second field test with accompanying think-alouds was conducted in Year 3. The researchers also assessed MOCCA's content, diagnostic, criterion, and construct validity and fairness.

Control condition

Due to the nature of this research design, there was no control condition.

Key measures

The researchers established criterion validity by correlating MOCCA with school-administered measures of reading, including easyCBM Vocabulary, the Dynamic Indicators of Basic Early Literacy Skills (DIBELS), and the state's standardized reading test (for example, Smarter Balanced or Standardized Testing and Reporting (STAR) in California). They established construct validity by comparing MOCCA with the Bridge-IT Picture Version.

Data analytic strategy

During the refinement stage of the project, the researchers used data from the think-alouds to inform revisions of the incorrect answer choices and to identify any additional profiles of poor comprehenders. They analyzed data using two item response theory (IRT) analyses. The first was a multidimensional IRT analysis based on a full decision choice model, which produces three scores for each person: a causal coherence score, a paraphrase score, and a lateral connection score. The second was a unidimensional two-parameter logistic (2PL) model. They also used correlations, receiver operating characteristic (ROC) curve analyses, and logistic regression to establish validity.
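The unidimensional 2PL model mentioned above has a standard closed form: the probability of a correct response depends on the student's latent ability and the item's discrimination and difficulty parameters. As a minimal sketch (the item parameter values below are invented for illustration and are not taken from the MOCCA study), the item response function can be computed as:

```python
import math

def p_correct_2pl(theta, a, b):
    """Two-parameter logistic (2PL) IRT item response function:
    probability that a student with ability theta answers an item
    correctly, given the item's discrimination (a) and difficulty (b)."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# Hypothetical item with discrimination a = 1.2 and difficulty b = 0.5:
# when ability equals difficulty, the probability is exactly 0.5,
# and it rises toward 1.0 as ability exceeds difficulty.
print(p_correct_2pl(0.5, 1.2, 0.5))   # ability == difficulty
print(p_correct_2pl(1.5, 1.2, 0.5))   # ability above difficulty
```

In practice, the discrimination and difficulty parameters are estimated from student response data rather than specified in advance; a higher discrimination value makes the probability curve steeper around the item's difficulty.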

Key outcomes

The main outcomes of the validation study are summarized under Supplemental information below.

People and institutions involved

IES program contact(s)

Elizabeth Albro

Commissioner of Education Research
NCER

Products and publications

Book Chapters

Biancarosa, G. (2019). Measurement of reading comprehension processing and growth. In V. Grover, P. Uccelli, M. Rowe, & E. Lieven (Eds.), Learning through Language: Towards an Educationally Informed Theory of Language Learning (pp. 147-158). Cambridge, U.K.: Cambridge University Press.

Carlson, S. E., Seipel, B., Biancarosa, G., Davison, M. L., & Clinton, V. (2019). Demonstration of an innovative reading comprehension diagnostic tool. In M. Scheffel, J. Broisin, V. Pammer-Schindler, A. Ioannou, & J. Schneider (Eds.), Transforming Learning with Meaningful Technologies (pp. 769-772). Springer Nature Switzerland AG. https://doi.org/10.1007/978-3-030-29736-7_85

Seipel, B., Biancarosa, G., Carlson, S.E., and Davison, M.L. (2018). The Need, Use, and Future of Cognitive Diagnostic Assessments in Classroom Practice. In Handbook of Research on Program Development and Assessment Methodologies in K-20 Education (pp. 1-23). IGI Global.

Journal Publications

Biancarosa, G., Kennedy, P.C., Carlson, S.E., Yoon, H., Seipel, B., Liu, B., and Davison, M.L. (2019). Constructing Subscores That Add Validity: A Case Study of Identifying Students at Risk. Educational and Psychological Measurement, 79(1), 65-84. [Full Text]

Davison, M. L., Biancarosa, G., Carlson, S. E., Seipel, B., and Liu, B. (2018). Preliminary Findings on the Computer-Administered Multiple-Choice Online Causal Comprehension Assessment, A Diagnostic Reading Comprehension Test. Assessment for Effective Intervention, 43(3), 169-181. [Full Text]

Liu, B., Kennedy, P.C., Seipel, B., Carlson, S.E., Biancarosa, G., and Davison, M.L. (2019). Can We Learn From Student Mistakes in a Formative, Reading Comprehension Assessment? Journal of Educational Measurement, 56(4), 815-835.

Su, S., and Davison, M.L. (2019). Improving the Predictive Validity of Reading Comprehension Using Response Times of Correct Item Responses. Applied Measurement in Education, 32(2), 166-182.

Technical Report

Davison, M. L., Biancarosa, G., Seipel, B., Carlson, S. E., Liu, B., & Kennedy, P. C. (2018). Technical Manual 2018: Multiple-Choice Online Comprehension Assessment (MOCCA) [MOCCA Technical Report MTR-2018-1].

Additional online resources and information:

  • YouTube Channel Videos: https://www.youtube.com/channel/UCEUuwJq_QnGoNRTdtgfVHcg/featured
  • Project Video: https://youtu.be/bm8ms9QVKUs (6 minutes)

Project website:

https://mocca.uoregon.edu/

Related projects

Multiple-choice Online Causal Comprehension Assessment for Postsecondary Students (MOCCA-College): Measuring Individual Differences in Reading Comprehension Ability of Struggling College Readers by Text Type

R305A180417

Multiple-choice Online Causal Comprehension Assessment Refinement: Achieving Better Discrimination via Experimental Item Types and Adaptive Testing

R305A190393

Supplemental information

Co-Principal Investigators: Carlson, Sarah E.; Seipel, Ben; Davison, Mark L.

  • MOCCA is a meaningful diagnostic assessment of reading comprehension that helps teachers understand not only whether students struggle with comprehension, but also the underlying reasons for their struggles (Biancarosa et al., 2019; Liu et al., 2019).
  • MOCCA has been fully validated as a diagnostic assessment of reading comprehension with excellent reliability (Biancarosa et al., 2019; Davison et al., 2018; Su & Davison, 2019).
  • MOCCA also has preliminary evidence of its validity as a screener for reading comprehension difficulties (Biancarosa et al., 2019).
  • To access MOCCA, please contact mocca@uoregon.edu.

Questions about this project?

For additional questions about this project or to provide feedback, please contact the program officer.

 

Tags

Cognition
