Title: Assessing Reading for Understanding: A Theory-based, Developmental Approach
Principal Investigator: Sabatini, John
Awardee: Educational Testing Service (ETS)
Program: Reading for Understanding Research Initiative
Award Period: 5 years
Award Amount: $14,824,226
Goal: Multiple Goals
Award Number: R305F100005
Co-Principal Investigator: Tenaha O'Reilly
Purpose: This team is developing and evaluating a new system of assessments that are aligned with current theoretical constructs and empirical findings on both reading comprehension and performance moderators; are sensitive to developmental changes in reading comprehension; emphasize strategic reading processes supported by the empirical literature; provide richer information for guiding instruction (especially for students struggling to reach proficiency); and comprise texts and tasks that represent the range of purposeful literacy activities in which 21st century students are expected to read texts for understanding.
Two types of assessments are being developed. The first, GISA (Global, Integrated Scenario-Based Assessment), is a purpose-driven, scenario-based assessment that requires students to synthesize and evaluate information from multiple sources in order to solve problems, make decisions, or produce products to attain a defined goal. The second is a set of component skill measures designed to provide additional information about non-proficient readers, to help identify or rule out potential bases for comprehension difficulties, and to predict developmental trajectories. The component measures assess proficiency in print skills, vocabulary, and oral language as they relate to growth in reading proficiency. Formative assessments (including progress monitoring and diagnostic measures) are being developed and evaluated to determine how well they predict performance on GISA and how well they indicate a reader's areas of strength and weakness.
Project partners include researchers at Educational Testing Service (ETS), Florida State University/Florida Center for Reading Research (FSU/FCRR), Arizona State University (ASU), and Northern Illinois University (NIU). ETS collaborates with school and district personnel to inform the designs and provide feedback on the feasibility and utility of the assessments.
Collaborators include superintendents, principals, school or district assessment specialists, English learner specialists, and reading teacher and/or curriculum specialists.
Grade Span: Prekindergarten through Grade 12
School Partners: Orange County and Lake County Schools (Florida), Brockton Public Schools (Massachusetts), Trenton Public and Catholic Schools (New Jersey), Baltimore City Public Schools (Maryland), as well as various schools across the country.
Description: The conceptual framework for the assessments is based on principles derived from contemporary models of reading, reading acquisition, cognitive science, and learning. The principles highlight the important roles of print skills, oral language, vocabulary, model building, background knowledge, motivation, purpose-driven comprehension, self-regulation/metacognition, and learning in digital and print environments.
A summary of the principles is as follows: (a) print skills and linguistic comprehension are both necessary, but neither is sufficient, for reading for understanding; (b) oral language continues to be a reciprocal predictor of, and influence on, print comprehension; (c) breadth and depth of vocabulary (and word learning) are critically related to both reading for understanding and growth in world knowledge; (d) readers model/represent understanding of texts at multiple levels, from simple, literal meanings to complex interpretations; (e) reading is typically a purposeful activity that must be regulated via metacognitive and strategic processes; (f) skilled readers integrate information across multiple texts; (g) fluency across an increasing array of information-communication technologies and print genres is characteristic of reading proficiency; and (h) the development of proficiency is marked by a continual increase in the sophistication of skills needed to understand increasingly complex texts, as demanded by school and society.
The research team works closely with educators to maximize the practical utility of the assessments for the classroom teacher and literacy professionals through attention to assessment design, details of implementation, and score reporting formats.
Development of the assessments is based on this conceptual framework and is organized around the following studies:
Framework: An assessment framework based on the conceptual model was developed to outline the purpose of the assessment, the constructs measured, and the types of tasks to guide development.
Developmental issues relating component skills to reading comprehension: A number of existing tests of decoding/word recognition, vocabulary, and oral comprehension/language are currently being investigated to identify the component skills that will be targeted for new assessments of component skills. Activities focus on identifying which component skills are best suited for use alongside GISA and how they interrelate as students develop comprehension skills. A two-year longitudinal study of a subset of students from prekindergarten through grade 12 is currently underway.
Iterative studies of GISA prototypes: A series of studies investigate how well GISA operationalizes the comprehension constructs as defined by the assessment framework. These studies investigate how and whether different assessment approaches and item/task designs capture the intended variance related to comprehension constructs, as well as issues of response format, scoring models, feasibility, and utility. Special efforts are made to include a range of students from a variety of locales and regions across the country, including English language learners and students with special needs, to address the adequacy of the assessments for diverse populations. Performance moderators, such as motivation/engagement, background knowledge, and self-regulation/metacognition, are also being studied by isolating their effects in experimental studies.
Large-scale field study to evaluate the GISA and component skills tests: Ongoing studies are collecting evidence of validity in relation to expected structural relationships among the measures, stability and change of scores over time, external validity, and generalizability of the assessment results.
Evaluation of formative assessments as predictors of growth and risk: Studies address how prior information on component and comprehension skills can be used to increase the reliability and efficiency in measuring growth in reading comprehension, as well as the predictive utility of using prior information to estimate student risk for reading difficulty.
Scientific and Reading Research Team: John Sabatini (lead, ETS), Tenaha O’Reilly (ETS), Paul Deane (ETS), Barbara Foorman (FSU/FCRR), Richard Wagner (FSU/FCRR), Keith Millis (NIU), Gary Feng (ETS), Laura Halderman (ETS)
Validity Team: Michael Kane (lead, ETS), Bob Mislevy (ETS), Joanna Gorin (ETS)
Assessment Development Team: Barbara Elkins (lead, ETS), Patti Mendoza (ETS)
Measurement, Statistics, and Psychometric Team: Irwin Kirsch (lead, ETS), Kentaro Yamamoto (ETS), Frank Rijmen (ETS), Matthias von Davier (ETS), Chris Schatschneider (FSU/FCRR), Yaacov Petscher (FSU/FCRR), Joanna Gorin (ETS), Jonathan Weeks (ETS), Jonathan Steinberg
Education Outreach Team: Pascal (Pat) Forgione (lead, Center for K–12 Assessment and Performance Management)
Related IES Projects: Developing Reading Comprehension Assessments Targeting Struggling Readers (R305G040065)
Publications from this project:
Calhoon, M.B., and Petscher, Y. (2013). Individual and Group Sensitivity to Remedial Reading Program Design: Examining Reading Gains Across Three Middle School Reading Projects. Reading and Writing: An Interdisciplinary Journal, 26(4): 565–592. doi:10.1007/s11145-013-9426-7
Codding, R.S., Petscher, Y., and Truckenmiller, A. (2014). CBM (Curriculum-Based Measurement) Reading, Mathematics, and Written Expression at the Secondary Level: Examining Latent Composite Relations Among Indices and Unique Predictions With a State Achievement Test. Journal of Educational Psychology. doi:10.1037/a0037520
Deane, P., Sabatini, J., Feng, G., Sparks, J., Song, Y., Fowles, M., O'Reilly, T., Jueds, K., Krovetz, R., and Foley, C. (2015). Key Practices in the English Language Arts: Linking Learning Theory, Assessment and Instruction. ETS Research Report Series RR-15-17.
Foorman, B., Dombek, J., and Smith, K. (in press). Seven Elements Important to Successful Implementation of Early Literacy Intervention. In B. Foorman (Ed.), Challenges and Solutions to Implementing Effective Reading Intervention in Schools. New Directions in Child and Adolescent Development, 152.
Foorman, B.R., Herrera, S., Petscher, Y., Mitchell, A., and Truckenmiller, A. (2015). The Structure of Oral Language and Reading and Their Relation to Comprehension in Kindergarten Through Grade 2. Reading and Writing: An Interdisciplinary Journal, 28(5), 655–681. http://dx.doi.org/10.1007/s11145-015-9544-5
Foorman, B.R., Koon, S., Petscher, Y., Mitchell, A., and Truckenmiller, A. (2015). Examining General and Specific Factors in the Dimensionality of Oral Language and Reading in 4th–10th Grades. Journal of Educational Psychology, 107(3), 884–899. http://dx.doi.org/10.1037/edu0000026
Foorman, B., Petscher, Y., and Bishop, M.D. (2012). The Incremental Variance of Morphological Knowledge to Reading Comprehension in Grades 3–10 Beyond Prior Reading Comprehension, Spelling, and Text Reading Efficiency. Learning and Individual Differences, 22(6): 792–798. doi:10.1016/j.lindif.2012.07.009
Foorman, B., and Wanzek, J. (2015). Classroom Reading Instruction for All Students. In S.R. Jimerson, M.K. Burns and A.M. VanDerHeyden (Eds.), The Handbook of Response to Intervention: The Science and Practice of Multi-Tiered Systems of Support (2nd ed) (pp. 235–252). New York, NY: Springer Science, Inc.
Goodwin, A., Petscher, Y., Carlisle, J., and Mitchell, A. (2015). Unraveling a Complex Construct: Exploring the Dimensionality of Morphological Awareness of Adolescent Readers. Journal of Research in Reading. doi:10.1111/1467-9817.12064
Gorin, J., O'Reilly, T., Sabatini, J., Song, Y., and Deane, P. (2014). Measurement: Facilitating the Goal of Literacy. In B. Miller, P. McCardle, and R. Long (Eds.), Teaching Reading and Writing: Improving Instruction and Student Achievement (pp. 119–128). Baltimore: Paul H. Brookes Publishing Co.
Kent, S., Wanzek, J., Petscher, Y., Kim, Y.S., and Al Otaiba, S. (2014). Writing Fluency and Quality in Kindergarten and First Grade: The Role of Attention, Reading, Transcription, and Oral Language. Reading and Writing: An Interdisciplinary Journal, 27(7): 1163–1188. doi:10.1007/s11145-013-9480-1
Kim, Y., Petscher, Y., and Foorman, B. (2015). The Unique Relation of Silent Reading Fluency to End-of-Year Reading Comprehension: Understanding Individual Differences at the Student, Classroom, School, and District Levels. Reading and Writing, 28, 131–150.
Madnani, N., Burstein, J., Sabatini, J., and O'Reilly, T. (2013). Automated Scoring of Summary-Writing Tasks Designed to Measure Reading Comprehension. In Proceedings of the 8th Workshop on Innovative Use of Natural Language Processing for Building Educational Applications (pp. 163–168). Atlanta: Association for Computational Linguistics.
Mislevy, R.J., and Sabatini, J. (2012). How Research on Reading and Research on Assessment Are Transforming Reading Assessment (or if They Aren't, How They Ought To). In J.P. Sabatini, E.R. Albro, and T. O'Reilly (Eds.), Measuring Up: Advances in How We Assess Reading Ability (pp. 119–134). Lanham, MD: Rowman and Littlefield.
Mitchell, A.M., Truckenmiller, A., and Petscher, Y. (2015). Understanding Computer Adaptive Assessments: Fundamentals and Considerations for School Psychologists. Communique, 43, 8.
O'Reilly, T., Deane, P., and Sabatini, J. (2015). Building and Sharing Knowledge Key Practice: What Do You Know, What Don't You Know, What Did You Learn? ETS Research Report Series RR-15-24.
O'Reilly, T., Sabatini, J., Bruce, K., Pillarisetti, S., and McCormick, C. (2012). Middle School Reading Assessment: Measuring What Matters Under an RTI Framework. Reading Psychology Special Issue: Response to Intervention, 33(1): 162–189. doi:10.1080/02702711.2012.631865
O'Reilly, T., Weeks, J., Sabatini, J., Halderman, L., and Steinberg, J. (2014). Designing Reading Comprehension Assessments for Reading Interventions: How a Theoretically Motivated Assessment Can Serve as an Outcome Measure. Educational Psychology Review, 26(3): 403–424. doi:10.1007/s10648-014-9269-z
Petscher, Y., Connor, C.M., and Al Otaiba, S. (2012). Psychometric Analysis of the Diagnostic Evaluation of Language Variation Assessment. Assessment for Effective Intervention, 37(4): 244–251. doi:10.1177/1534508411413760
Petscher, Y., Cummings, K.D., Biancarosa, G., and Fien, H. (2013). Advanced (Measurement) Applications of Curriculum-Based Measurement in Reading. Assessment for Effective Intervention, 38(2): 71–75. doi:10.1177/1534508412461434
Petscher, Y., Koon, S., and Kershaw, S. (in press). Using Latent Change Score Models to Understand Growth in Fluency. In K.D. Cummings and Y. Petscher (Eds.), The Fluency Construct. New York: Sage.
Petscher, Y., and Logan, J.A.R. (2014). Quantile Regression in the Study of Developmental Sciences. Child Development, 85(3): 861–881. doi:10.1111/cdev.12190
Petscher, Y., Logan, J.A.R., and Zhou, C. (2013). Extending Conditional Means Modeling: An Introduction to Quantile Regression. In Y. Petscher, C. Schatschneider, and D.L. Compton (Eds.), Applied Quantitative Analysis in Education and Social Sciences (pp. 3–33). New York: Routledge.
Petscher, Y., Mitchell, A., and Foorman, B. (2015). Improving the Reliability of Student Scores From Speeded Assessments: An Illustration of Conditional Item Response Theory Using a Computer-Administered Measure. Reading and Writing, 28, 31–56.
Piasta, S.B., Petscher, Y., and Justice, L.M. (2012). How Many Letters Should Preschoolers in Public Programs Know? The Diagnostic Efficiency of Various Preschool Letter-Naming Benchmarks for Predicting First-Grade Literacy Achievement. Journal of Educational Psychology, 104(4): 945–958. doi:10.1037/a0027757
Quinn, J.M., Wagner, R.K., Petscher, Y., and Lopez, D. (2015). Developmental Relations Between Vocabulary Knowledge and Reading Comprehension: A Latent Change Score Modeling Study. Child Development, 86(1), 159–175. doi:10.1111/cdev.12292
Quinn, J.M., Wagner, R.K., Menzel, A.J., Petscher, Y., Schatschneider, C., and McArdle, J.J. (2016). Developmental Relations Between Vocabulary Knowledge and Reading Comprehension: A Large Scale Study of At-Risk Readers. Developmental Psychology.
Sabatini, J., Bruce, K., Steinberg, J., and Weeks, J. (2015). SARA Reading Components Tests, RISE Forms: Technical Adequacy and Test Design, 2nd Edition. Princeton, NJ: Educational Testing Service RR-15-32.
Sabatini, J., Halderman, L., O'Reilly, T., and Weeks, J. (in press). Assessing Comprehension in Kindergarten through Third Grade. Topics in Language Disorders.
Sabatini, J., and O'Reilly, T. (2012). Conclusion to Reaching an Understanding: Innovations in How We View Reading Assessment. In J. Sabatini, T. O'Reilly, and L. Albro (Eds.), Reaching an Understanding: Innovations in How We View Reading Assessment (pp. 185–186). Lanham, MD: Rowman and Littlefield.
Sabatini, J., and O'Reilly, T. (2013). Rationale for a New Generation of Reading Comprehension Assessments. In B. Miller, L.E. Cutting, and P. McCardle (Eds.), Unraveling Reading Comprehension: Behavioral, Neurobiological and Genetic Components (pp. 100–111). Baltimore, MD: Brookes Publishing Company.
Sabatini, J., O'Reilly, T., Halderman, L., and Bruce, K. (2014). Integrating Scenario-Based and Component Reading Skill Measures to Understand the Reading Behavior of Struggling Readers. Learning Disabilities Research and Practice, 29(1): 36–43. doi:10.1111/ldrp.12028
Sabatini, J., O'Reilly, T., Halderman, L., and Bruce, K. (2014). Broadening the Scope of Reading Comprehension Using Scenario-Based Assessments: Preliminary Findings and Challenges. International Journal Topics in Cognitive Psychology, 114(4): 693–723. doi:10.4074/S0003503314004059
Sabatini, J., Petscher, Y., O'Reilly, T., and Truckenmiller, A. (2015). Improving Comprehension Assessment for Middle and High School Students: Challenges and Opportunities. In D. Reed and K. Santi (Eds.), Improving Reading Comprehension of Middle and High School Students (pp. 119–151). New York: Springer.
Sheehan, K., and O'Reilly, T. (2012). The Case for Scenario-Based Assessments of Reading Competency. In J. Sabatini, T. O'Reilly, and L. Albro (Eds.), Reaching an Understanding: Innovations in How We View Reading Assessment (pp. 19–33). Lanham, MD: Rowman and Littlefield.
Shore, J., Wolf, M. K., O'Reilly, T., and Sabatini, J. P. (in press). Measuring 21st Century Reading Comprehension Through Scenario-Based Assessments. In M. K. Wolf, & Y. G. Butler (Eds.), English Language Proficiency Assessments For Young Learners. New York, NY: Routledge.
Solari, E., Petscher, Y., and Folsom, J.S. (2012). Differentiating Literacy Growth of ELL Students With LD From Other High-Risk Subgroups and General Education Peers: Evidence From Grades 3–10. Journal of Learning Disabilities, 47(4): 329–348. doi:10.1177/0022219412463435
Spencer, M., Muse, A., Wagner, R.K., Foorman, B., Petscher, Y., Schatschneider, C., Tighe, E., and Bishop, D. (in press). What Does It Mean to Know a Word? Examining the Underlying Dimensions of Morphological Awareness and Vocabulary Knowledge. Reading and Writing.
Spencer, M., Quinn, J.M., and Wagner, R.K. (2014). Specific Reading Comprehension Disability: Major Problem, Myth, or Misnomer? Learning Disabilities Research and Practice, 29(1): 3–9. doi:10.1111/ldrp.12024
Truckenmiller, A.J., Eckert, T.L., Codding, R.S., and Petscher, Y. (2014). Evaluating the Impact of Feedback on Elementary Aged Students' Fluency Growth in Written Expression: A Randomized Controlled Trial. Journal of School Psychology, 52(6): 531–548. doi:10.1016/j.jsp.2014.09.001