Title: Assessing Reading for Understanding: A Theory-based, Developmental Approach
Principal Investigator: Sabatini, John
Awardee: Educational Testing Service (ETS)
Program: Reading for Understanding Research Initiative
Award Period: 5 years
Award Amount: $14,824,226
Type: Multiple Goals
Award Number: R305F100005
Co-Principal Investigator: O'Reilly, Tenaha
Purpose: This team developed and evaluated a new system of assessments that: align with current theoretical constructs and empirical findings pertaining to both reading comprehension and performance moderators; are sensitive to changes in the development of reading comprehension; emphasize strategic reading processes empirically supported in the literature; provide greater information for guiding instruction (especially for students struggling to reach proficiency); and comprise texts and tasks that represent a range of purposeful literacy activities in which 21st century students are expected to read texts for understanding. Two types of assessments were developed. The first, called GISA (Global, Integrated Scenario-Based Assessment), is a purpose-driven, scenario-based assessment that requires students to synthesize and evaluate information from multiple sources in order to solve problems, make decisions, or produce products to attain a defined goal. The second is a set of component skill measures designed to provide additional information about non-proficient readers, to help identify or rule out potential bases for comprehension difficulties, and to predict developmental trajectories. The component measures assess proficiency in print skills, vocabulary, and oral language as they relate to growth in reading proficiency. Formative assessments (including progress monitoring and diagnostic measures) were developed and evaluated to determine how well they predict performance on GISA as well as how well they indicate a reader's areas of strength and weakness. Project partners included researchers at Educational Testing Service (ETS), Florida State University/Florida Center for Reading Research (FSU/FCRR), Arizona State University (ASU), and Northern Illinois University (NIU). ETS collaborated with school and district personnel to inform the designs and provide feedback on the feasibility and utility of the assessments.
Collaborators included superintendents, principals, school or district assessment specialists, English learner specialists, and reading teachers and/or curriculum specialists.
Grade Span: Prekindergarten through Grade 12
School Partners: Orange County and Lake County Schools (Florida), Brockton Public Schools (Massachusetts), Trenton Public and Catholic Schools (New Jersey), Baltimore City Public Schools (Maryland), as well as various schools across the country.
Description: The conceptual framework for the assessments is based on principles derived from contemporary models of reading, reading acquisition, cognitive science, and learning. The principles highlight the important roles of print skills, oral language, vocabulary, model building, background knowledge, motivation, purpose-driven comprehension, self-regulation/metacognition, and learning in digital and print environments. A summary of the principles is as follows: (a) print skills and linguistic comprehension are both necessary, but neither is sufficient, for reading for understanding; (b) oral language continues to be a reciprocal predictor of and influence on print comprehension; (c) breadth and depth of vocabulary (and word learning) are critically related to both reading for understanding and growth in world knowledge; (d) readers model/represent understanding of texts at multiple levels, from simple, literal meanings to complex interpretations; (e) reading is typically a purposeful activity that must be regulated via metacognitive and strategic processes; (f) skilled readers integrate information across multiple texts; (g) fluency across an increasing array of information-communication technologies and print genres is characteristic of reading proficiency; and (h) the development of proficiency is marked by a continual increase in sophistication of skills in understanding increasingly complex texts as demanded by school and society.
The research team worked closely with educators to maximize the practical utility of the assessments for classroom teachers and literacy professionals through attention to assessment design, details of implementation, and score reporting formats.
Development of the assessments was based on this conceptual framework and was organized around the following studies:
Framework: An assessment framework based on the conceptual model was developed to outline the purpose of the assessment, the constructs measured, and the types of tasks to guide development.
Developmental issues relating component skills to reading comprehension: A number of existing tests of decoding/word recognition, vocabulary, and oral comprehension/language were investigated to identify the component skills to be targeted in the new component skill assessments. Activities focused on identifying which component skills are best suited for use alongside GISA and how they interrelate as students develop comprehension skills. A two-year longitudinal design was completed to study subsets of students spanning prekindergarten through grade 12.
Iterative studies of GISA prototypes: A series of studies investigated how well GISA operationalizes the comprehension constructs as defined by the assessment framework. These studies investigated how and whether different assessment approaches and item/task designs capture the intended variance related to comprehension constructs, as well as issues of response format, scoring models, feasibility and utility. Special efforts were made to include a range of students from a variety of locales and regions across the country, including English language learners and students with special needs, to address the adequacy of the assessments for diverse populations. Performance moderators, such as motivation/engagement, background knowledge and self-regulation/metacognition, were also studied by isolating their effects in experimental studies.
Large-scale field study to evaluate the GISA and component skills tests: Studies collected evidence of validity in relation to expected structural relationships among the measures, stability and change of scores over time, external validity, and generalizability of the assessment results.
Evaluation of formative assessments as predictors of growth and risk: Studies addressed how prior information on component and comprehension skills can be used to increase the reliability and efficiency in measuring growth in reading comprehension, as well as the predictive utility of using prior information to estimate student risk for reading difficulty.
Scientific and Reading Research Team: John Sabatini (lead, ETS), Tenaha O'Reilly (ETS), Paul Deane (ETS), Barbara Foorman (FSU/FCRR), Richard Wagner (FSU/FCRR), Keith Millis (NIU), Gary Feng (ETS), Laura Halderman (ETS)
Validity Team: Michael Kane (lead, ETS), Bob Mislevy (ETS), Joanna Gorin (ETS)
Assessment Development Team: Barbara Elkins (lead, ETS), Patti Mendoza (ETS)
Measurement, Statistics, and Psychometric Team: Irwin Kirsch (lead, ETS), Kentaro Yamamoto (ETS), Frank Rijmen (ETS), Matthias von Davier (ETS), Chris Schatschneider (FSU/FCRR), Yaacov Petscher (FSU/FCRR), Joanna Gorin (ETS), Jonathan Weeks (ETS), Jonathan Steinberg (ETS)
Education Outreach Team: Pascal (Pat) Forgione (lead, Center for K–12 Assessment and Performance Management)
Related IES Projects: Developing Reading Comprehension Assessments Targeting Struggling Readers (R305G040065)
Project Website: https://www.ets.org/research/topics/reading_for_understanding/
Publications and Products
Book chapter
Foorman, B., and Wanzek, J. (2015). Classroom Reading Instruction for All Students. In S.R. Jimerson, M.K. Burns and A.M. VanDerHeyden (Eds.), The Handbook of Response to Intervention: The Science and Practice of Multi-Tiered Systems of Support (2nd ed) (pp. 235–252). New York, NY: Springer Science, Inc.
Gorin, J., O'Reilly, T., Sabatini, J., Song, Y., and Deane, P. (2014). Measurement: Facilitating the Goal of Literacy. In B. Miller, P. McCardle, and R. Long (Eds.), Teaching Reading and Writing: Improving Instruction and Student Achievement (pp. 119–128). Baltimore: Paul H. Brookes Publishing Co.
Mislevy, R.J., and Sabatini, J. (2012). How Research on Reading and Research on Assessment are Transforming Reading Assessment (or if They Aren't, how They Ought to). In J.P. Sabatini, E.R. Albro, and T. O'Reilly (Eds.), Measuring Up: Advances in How We Assess Reading Ability (pp. 119–134). Lanham, MD: Rowman and Littlefield.
O'Reilly, T., Sabatini, J., & Wang, Z. (2018). Using Scenario-Based Assessments to Measure Deep Learning. In K. Millis, D. Long, J. Magliano, & K. Weimer (Eds.), Deep learning: Multi-disciplinary approaches (pp. 197–208). New York, NY: Routledge.
Petscher, Y., Koon, S., and Kershaw, S. (2016). Using Latent Change Score Analysis to Model Co-Development in Fluency Skills. In K.D. Cummings and Y. Petscher (Eds.), The Fluency Construct (pp. 333–364). New York: Springer.
Petscher, Y., Logan, J.A.R., and Zhou, C. (2013). Extending Conditional Means Modeling: An Introduction to Quantile Regression. In Y. Petscher, C. Schatschneider, and D.L. Compton (Eds.), Applied Quantitative Analysis in Education and Social Sciences (pp. 3–33). New York: Routledge.
Sabatini, J., and O'Reilly, T. (2012). Conclusion to Reaching an Understanding: Innovations in How We View Reading Assessment. In J. Sabatini, T. O'Reilly, and E. Albro (Eds.), Reaching an Understanding: Innovations in How We View Reading Assessment (pp. 185–186). Lanham, MD: Rowman and Littlefield.
Sabatini, J., and O'Reilly, T. (2013). Rationale for a new Generation of Reading Comprehension Assessments. In B. Miller, L.E. Cutting, and P. McCardle (Eds.), Unraveling Reading Comprehension: Behavioral, Neurobiological and Genetic Components (pp. 100–111). Baltimore, MD: Brookes Publishing Company.
Sabatini, J., Petscher, Y., O'Reilly, T., and Truckenmiller, A. (2015). Improving Comprehension Assessment for Middle and High School Students: Challenges and Opportunities. In D. Reed and K. Santi (Eds.), Improving Reading Comprehension of Middle and High School Students (pp. 119–151). New York: Springer.
Sheehan, K., and O'Reilly, T. (2012). The Case for Scenario-Based Assessments of Reading Competency. In J. Sabatini, T. O'Reilly, and E. Albro (Eds.), Reaching an Understanding: Innovations in How We View Reading Assessment (pp. 19–33). Lanham, MD: Rowman and Littlefield.
Shore, J. R., Wolf, M.K., O'Reilly, T., & Sabatini, J. (2017). Measuring 21st century reading comprehension through scenario-based assessments. In M. K. Wolf & Y.G. Butler (Eds). English language proficiency assessments for young learners (pp. 234-252). New York, NY: Routledge.
Journal article, monograph, or newsletter
Calhoon, M.B., and Petscher, Y. (2013). Individual and Group Sensitivity to Remedial Reading Program Design: Examining Reading Gains Across Three Middle School Reading Projects. Reading and Writing: An Interdisciplinary Journal, 26(4): 565–592. doi:10.1007/s11145-013-9426-7
Codding, R.S., Petscher, Y., and Truckenmiller, A. (2014). CBM (Curriculum-Based Measurement) Reading, Mathematics, and Written Expression at the Secondary Level: Examining Latent Composite Relations Among Indices and Unique Predictions With a State Achievement Test. Journal of Educational Psychology. doi:10.1037/a0037520
Feller, D. P., Magliano, J., Sabatini, J., O'Reilly, T., & Kopatich, R. D. (2020). Relations between Component Reading Skills, Inferences, and Comprehension Performance in Community College Readers. Discourse Processes, 57(5-6), 473-490, DOI: 10.1080/0163853X.2020.1759175
Foorman, B., Dombek, J., and Smith, K. (2016). Seven Elements Important to Successful Implementation of Early Literacy Intervention. New Directions for Child and Adolescent Development, (154), 49–65.
Foorman, B.R., Herrera, S., Petscher, Y., Mitchell, A., and Truckenmiller, A. (2015). The Structure of Oral Language and Reading and Their Relation to Comprehension in Kindergarten Through Grade 2. Reading and Writing: An Interdisciplinary Journal, 28(5), 655–681. http://dx.doi.org/10.1007/s11145-015-9544-5
Foorman, B.R., Koon, S., Petscher, Y., Mitchell, A., and Truckenmiller, A. (2015). Examining General and Specific Factors in the Dimensionality of Oral Language and Reading in 4th–10th Grades. Journal of Educational Psychology, 107(3), 884–899. http://dx.doi.org/10.1037/edu0000026
Foorman, B., Petscher, Y., and Bishop, M.D. (2012). The Incremental Variance of Morphological Knowledge to Reading Comprehension in Grades 3–10 Beyond Prior Reading Comprehension, Spelling, and Text Reading Efficiency. Learning and Individual Differences, 22(6): 792–798. doi:10.1016/j.lindif.2012.07.009
Foorman, B. R., Wu, Y. C., Quinn, J. M., & Petscher, Y. (2020). How do latent decoding and language predict latent reading comprehension: across two years in grades 5, 7, and 9? Reading and Writing, 33(9), 2281–2309.
Goodwin, A., Petscher, Y., Carlisle, J., and Mitchell, A. (2017). Exploring the Dimensionality of Morphological Awareness of Adolescent Readers. Journal of Research in Reading, 40(1): 91-117. doi: 10.1111/1467-9817.12064.
Kent, S., Wanzek, J., Petscher, Y., Kim, Y.S., and Al Otaiba, S. (2014). Writing Fluency and Quality in Kindergarten and First Grade: The Role of Attention, Reading, Transcription, and Oral Language. Reading and Writing: An Interdisciplinary Journal, 27(7): 1163–1188. doi:10.1007/s11145-013-9480-1
Kim, Y., Petscher, Y., and Foorman, B. (2015). The Unique Relation of Silent Reading Fluency to End-of-Year Reading Comprehension: Understanding Individual Differences at the Student, Classroom, School, and District Levels. Reading and Writing, 28, 131–150.
McCarthy, K., Guerrero, T., Kent, K., Allen, L., McNamara, D., Chao, S., Steinberg, J., O'Reilly, T., & Sabatini, J. (2018). Comprehension in a scenario-based assessment: Domain and topic-specific background knowledge. Discourse Processes, 55(5-6), 510–524.
Mitchell, A.M., Truckenmiller, A., and Petscher, Y. (2015). Computer-Adaptive Assessments: Fundamentals and Considerations. Communique, 43, 8.
O'Reilly, T., Sabatini, J., & Wang, Z. (2019). What you don't know won't hurt you, unless you don't know you're wrong. Reading Psychology, 40 (7), 638-677, DOI: 10.1080/02702711.2019.1658668
O'Reilly, T., Wang, Z., & Sabatini, J. (2019). How much knowledge is too little? When knowledge becomes a barrier to comprehension. Psychological Science, 30(9): 1344–1351. doi:10.1177/0956797619862276
O'Reilly, T., Weeks, J., Sabatini, J., Halderman, L., and Steinberg, J. (2014). Designing Reading Comprehension Assessments for Reading Interventions: How a Theoretically Motivated Assessment Can Serve as an Outcome Measure. Educational Psychology Review, 26(3): 403–424. doi:10.1007/s10648-014-9269-z
O'Reilly, T., Sabatini, J., Bruce, K., Pillarisetti, S., and McCormick, C. (2012). Middle School Reading Assessment: Measuring What Matters Under an RTI Framework. Reading Psychology Special Issue: Response to Intervention, 33(1): 162–189. doi:10.1080/02702711.2012.631865
Petscher, Y., and Logan, J.A.R. (2014). Quantile Regression in the Study of Developmental Sciences. Child Development, 85(3): 861–881. doi:10.1111/cdev.12190
Petscher, Y., Connor, C.M., and Al Otaiba, S. (2012). Psychometric Analysis of the Diagnostic Evaluation of Language Variation Assessment. Assessment for Effective Intervention, 37(4): 244–251. doi:10.1177/1534508411413760
Petscher, Y., Cummings, K.D., Biancarosa, G., and Fien, H. (2013). Advanced (Measurement) Applications of Curriculum-Based Measurement in Reading. Assessment for Effective Intervention, 38(2): 71–75. doi:10.1177/1534508412461434
Petscher, Y., Mitchell, A., and Foorman, B. (2015). Improving the Reliability of Student Scores From Speeded Assessments: An Illustration of Conditional Item Response Theory Using a Computer-Administered Measure. Reading and Writing, 28, 31–56.
Petscher, Y., Cabell, S. Q., Catts, H. W., Compton, D. L., Foorman, B. R., Hart, S. A., ... & Wagner, R. K. (2020). How the Science of Reading Informs 21st-Century Education. Reading Research Quarterly, 55, S267-S282.
Piasta, S.B., Petscher, Y., and Justice, L.M. (2012). How Many Letters Should Preschoolers in Public Programs Know? The Diagnostic Efficiency of Various Preschool Letter-Naming Benchmarks for Predicting First-Grade Literacy Achievement. Journal of Educational Psychology, 104(4): 945–958. doi:10.1037/a0027757
Quinn, J.M., Wagner, R.K., Petscher, Y., and Lopez, D. (2015). Developmental Relations Between Vocabulary Knowledge and Reading Comprehension: A Latent Change Score Modeling Study. Child Development, 86(1), 159–175. DOI: 10.1111/cdev.12292
Sabatini, J., Halderman, L., O'Reilly, T., & Weeks, J. (2016). Assessing comprehension in kindergarten through third grade. Topics in Language Disorders, 36(4): 334–355.
Sabatini, J., O'Reilly, T., Halderman, L., and Bruce, K. (2014). Integrating Scenario-Based and Component Reading Skill Measures to Understand the Reading Behavior of Struggling Readers. Learning Disabilities Research and Practice, 29(1): 36–43. doi:10.1111/ldrp.12028
Sabatini, J., O'Reilly, T., Halderman, L., and Bruce, K. (2014). Broadening the Scope of Reading Comprehension Using Scenario-Based Assessments: Preliminary Findings and Challenges. International Journal Topics in Cognitive Psychology, 114(4): 693–723. doi:10.4074/S0003503314004059
Sabatini, J., O'Reilly, T., Weeks, J., & Wang, Z. (2020). Engineering a 21st Century Reading Comprehension Assessment System Utilizing Scenario-Based Assessment Techniques. International Journal of Testing, 20(1): 1-23.
Solari, E., Petscher, Y., and Folsom, J.S. (2012). Differentiating Literacy Growth of ELL Students With LD From Other High-Risk Subgroups and General Education Peers: Evidence From Grades 3–10. Journal of Learning Disabilities, 47(4): 329–348. doi:10.1177/0022219412463435
Spencer, M., Muse, A., Wagner, R.K., Foorman, B., Petscher, Y., Schatschneider, C., Tighe, E., and Bishop, D. (2015). Examining the Underlying Dimensions of Morphological Awareness and Vocabulary Knowledge. Reading and Writing, 28(7): 959–988. doi:10.1007/s11145-015-9557-0
Spencer, M., Quinn, J.M., and Wagner, R.K. (2014). Specific Reading Comprehension Disability: Major Problem, Myth, or Misnomer? Learning Disabilities Research and Practice, 29(1): 3–9. doi:10.1111/ldrp.12024
Truckenmiller, A.J., Eckert, T.L., Codding, R.S., and Petscher, Y. (2014). Evaluating the Impact of Feedback on Elementary Aged Students' Fluency Growth in Written Expression: A Randomized Controlled Trial. Journal of School Psychology, 52(6): 531–548. doi:10.1016/j.jsp.2014.09.001
Truckenmiller, A. J., & Petscher, Y. (2020). The role of academic language in written composition in elementary and middle school. Reading and Writing, 33(1), 45-66.
Wang, Z., Sabatini, J., & O'Reilly, T. (2020). When Slower is Faster: Time Spent Decoding Novel Words Predicts Better Decoding and Faster Growth. Scientific Studies of Reading, 24(5), 397-410.
Madnani, N., Burstein, J., Sabatini, J., and O'Reilly, T. (2013). Automated Scoring of Summary-Writing Tasks Designed to Measure Reading Comprehension. In Proceedings of the 8th Workshop on Innovative Use of Natural Language Processing for Building Educational Applications (pp. 163–168). Atlanta: Association for Computational Linguistics.
Deane, P., Sabatini, J., Feng, G., Sparks, J., Song, Y., Fowles, M., O'Reilly, T., Jueds, K., Krovetz, R., and Foley, C. (2015). Key Practices in the English Language Arts: Linking Learning Theory, Assessment and Instruction. ETS Research Report Series RR-15-17.
O'Reilly, T., Deane, P., and Sabatini, J. (2015). Building and Sharing Knowledge Key Practice: What Do You Know, What Don't You Know, What Did You Learn? (Research Report No. RR-15-24). Princeton, NJ: Educational Testing Service.
Sabatini, J., Bruce, K., Steinberg, J., and Weeks, J. (2015). SARA Reading Components Tests, RISE Forms: Technical Adequacy and Test Design, 2nd Edition (Research Report No. RR-15-32). Princeton, NJ: Educational Testing Service.