Project Activities
Structured Abstract
Setting
Sample
Research design and methods
Control condition
Key measures
Data analytic strategy
Key outcomes
People and institutions involved
IES program contact(s)
Products and publications
Book Chapters
Biancarosa, G. (2019). Measurement of reading comprehension processing and growth. In V. Grover, P. Uccelli, M. Rowe, & E. Lieven (Eds.), Learning through Language: Towards an Educationally Informed Theory of Language Learning (pp. 147-158). Cambridge, U.K.: Cambridge University Press.
Carlson, S. E., Seipel, B., Biancarosa, G., Davison, M. L., & Clinton, V. (2019). Demonstration of an innovative reading comprehension diagnostic tool. In M. Scheffel, J. Broisin, V. Pammer-Schindler, A. Ioannou, & J. Schneider (Eds.), Transforming Learning with Meaningful Technologies (pp. 769-772). Springer Nature Switzerland AG. https://doi.org/10.1007/978-3-030-29736-7_85
Seipel, B., Biancarosa, G., Carlson, S. E., & Davison, M. L. (2018). The Need, Use, and Future of Cognitive Diagnostic Assessments in Classroom Practice. In Handbook of Research on Program Development and Assessment Methodologies in K-20 Education (pp. 1-23). IGI Global.
Journal Publications
Biancarosa, G., Kennedy, P. C., Carlson, S. E., Yoon, H., Seipel, B., Liu, B., & Davison, M. L. (2019). Constructing Subscores That Add Validity: A Case Study of Identifying Students at Risk. Educational and Psychological Measurement, 79(1), 65-84.
Davison, M. L., Biancarosa, G., Carlson, S. E., Seipel, B., & Liu, B. (2018). Preliminary Findings on the Computer-Administered Multiple-Choice Online Causal Comprehension Assessment, A Diagnostic Reading Comprehension Test. Assessment for Effective Intervention, 43(3), 169-181.
Liu, B., Kennedy, P. C., Seipel, B., Carlson, S. E., Biancarosa, G., & Davison, M. L. (2019). Can We Learn From Student Mistakes in a Formative, Reading Comprehension Assessment? Journal of Educational Measurement, 56(4), 815-835.
Su, S., & Davison, M. L. (2019). Improving the Predictive Validity of Reading Comprehension Using Response Times of Correct Item Responses. Applied Measurement in Education, 32(2), 166-182.
Technical Report
Davison, M. L., Biancarosa, G., Seipel, B., Carlson, S. E., Liu, B., & Kennedy, P. C. (2018). Technical Manual 2018: Multiple-Choice Online Comprehension Assessment (MOCCA) [MOCCA Technical Report MTR-2018-1].
Additional online resources and information:
- YouTube Channel Videos: https://www.youtube.com/channel/UCEUuwJq_QnGoNRTdtgfVHcg/featured
- Project Video: https://youtu.be/bm8ms9QVKUs (6 minutes)
Project website:
Related projects
Supplemental information
Co-Principal Investigators: Carlson, Sarah E.; Seipel, Ben; Davison, Mark L.
- MOCCA is a meaningful diagnostic assessment of reading comprehension that helps teachers understand not only whether students struggle with comprehension, but also the underlying reasons for their struggles (Biancarosa et al., 2019; Liu et al., 2019).
- MOCCA has been validated as a diagnostic assessment of reading comprehension, with strong evidence of reliability (Biancarosa et al., 2019; Davison et al., 2018; Su & Davison, 2019).
- MOCCA also has preliminary evidence of its validity as a screener for reading comprehension difficulties (Biancarosa et al., 2019).
- To access MOCCA, please contact mocca@uoregon.edu.
Questions about this project?
For additional questions about this project or to provide feedback, please contact the program officer.