
IES Grant

Title: Reliability and Validity Evidence for Progress Measures in Reading
Center: NCSER Year: 2010
Principal Investigator: Tindal, Gerald Awardee: University of Oregon
Program: Reading, Writing, and Language
Award Period: 4 years Award Amount: $1,596,640
Type: Measurement Award Number: R324A100014

Purpose: This research is designed to validate decision-making based on use of easyCBM, a free online curriculum-based benchmark and progress monitoring assessment system that documents early literacy acquisition. Developed in 2006, easyCBM currently has registered users from all 50 states who use the system as part of their Response to Intervention (RtI) programming; yet the technical adequacy of the system has not been sufficiently studied.

Project Activities: Researchers plan to collect reliability, content, criterion, and construct validity evidence of easyCBM. They plan to follow students over time to establish the validity of interpretations made from reading progress measures. National norms will also be established so students' relative performance may be interpreted.

Products: Outcomes will be disseminated through published papers and technical reports on the reliability, validity, feasibility, and interpretation of scores for the easyCBM in monitoring reading progress.

Structured Abstract

Setting: In the first three years of the project, students in elementary school classrooms distributed across Oregon will be assessed. In the fourth year, a stratified national sample of students will be assessed to provide data for creating national norms. Overall, it is anticipated that approximately 25,000 students will participate in the study.

Population: Kindergarten through fifth grade students encompassing a wide range of socio-economic, ethnic, and language backgrounds.

Intervention: The easyCBM system was developed by researchers in the College of Education at the University of Oregon with funding from the Office of Special Education Programs at the U.S. Department of Education. The easyCBM system provides both universal screening assessments for fall, winter, and spring administration and 17 alternate forms of a variety of progress monitoring reading measures designed for use in elementary school settings. Six measures are available for general education teachers, reading specialists, or interventionists interested in monitoring the progress of their students in developing reading skills: phoneme segmenting, letter names, letter sounds, word reading fluency, passage reading fluency, and comprehension.

Research Design and Methods: The systematic evaluation of the assessments begins with examining the reliability of all measures, using classical test theory to estimate alternate-form and test-retest reliability and generalizability theory to understand the influence of tasks and testing occasions on assessments. Validity will be studied by collecting evidence on the relations among subtest measures (internal structure) as well as the relations among subtest measures and student and school characteristics. Criterion-related evidence will be collected using other reading assessments.

Control Condition: There is no control condition.

Key Measures: In addition to the six easyCBM measures outlined above, researchers will use performance on the Comprehensive Test of Phonological Processing, the Dynamic Indicators of Basic Early Literacy Skills, and the Gates-MacGinitie.

Data Analytic Strategy: Reliability will be examined using test-retest, internal consistency, alternate form, and generalizability analyses. Internal structure will be examined using linear regression, with invariance tested through structural equation modeling. Criterion-related validity will be examined using linear regression and correlations. Hierarchical linear modeling will be used to evaluate the adequacy of the test for measuring growth in each grade across the school year.


Book, edition specified

Tindal, G., and Alonzo, J. (2016). Technology-Based Assessment and Problem Analysis (2nd ed.). New York: Springer.

Journal article, monograph, or newsletter

Anderson, D., Irvin, P., Alonzo, J., and Tindal, G. (2014). Gauging Item Alignment Through Online Systems While Controlling for Rater Effects. Educational Measurement: Issues and Practice. doi:10.1111/emip.12038

Basaraba, D., Yovanoff, P., Alonzo, J., and Tindal, G. (2012). Examining the Structure of Reading Comprehension: Do Literal, Inferential, and Evaluative Comprehension Truly Exist? Reading and Writing: An Interdisciplinary Journal, 26(3): 349–379. doi:10.1007/s11145-012-9372-9

Kamata, A., Nese, J.F.T., Patarapichayatham, C., and Lai, C.F. (2013). Modeling Nonlinear Growth With Three Data Points: Illustration With Benchmarking Data. Assessment for Effective Intervention, 38(2): 105–116.

Nese, J.F.T., Biancarosa, G., Anderson, D., Lai, C.F., Alonzo, J., and Tindal, G. (2012). Within-Year Oral Reading Fluency With CBM: A Comparison of Models. Reading and Writing: An Interdisciplinary Journal, 25(4): 887–915. doi:10.1007/s11145-011-9304-0

Nese, J.F.T., Biancarosa, G., Cummings, K., Kennedy, P., Alonzo, J., and Tindal, G. (2013). In Search of Average Growth: Describing Within-Year Oral Reading Fluency Growth for Grades 1–8. Journal of School Psychology, 51(5): 625–642. doi:10.1016/j.jsp.2013.05.006

Nese, J.F.T., Park, J., Alonzo, J., and Tindal, G. (2011). Applied Curriculum-Based Measurement as a Predictor of High-Stakes Assessment: Implications for Researchers and Teachers. Elementary School Journal, 111: 608–624. doi:10.1086/659034

Smith, J.L.M., Cummings, K.D., Nese, J.F.T., Alonzo, J., Fien, H., and Baker, S. (2014). The Relation of Word Reading Fluency Initial Level and Gains With Reading Outcomes. School Psychology Review, 43(1).