Computer Based Assessment System for Reading (CBAS-R): Skills Analysis and Progress Monitoring

Year: 2012
Name of Institution:
University of Minnesota
Goal: Measurement
Principal Investigator:
Christ, Theodore
Award Amount: $1,599,950
Award Period: 4 years (6/1/2012-5/31/2016)
Award Number: R305A120086

Description:

Co-Principal Investigators: Barbara Taylor and David J. Weiss

Purpose: Despite the recent increase in the number of reading assessments created for classroom use, very few are designed to provide ongoing, instructionally relevant information to teachers, such as helping them identify students who are at risk for reading problems. In this study, researchers intend to further develop and validate the Computer-Based Assessment System for Reading (CBAS-R), which is designed to be a highly efficient tool to monitor student progress, evaluate students' strengths and weaknesses in reading, and gauge the effects of instruction. The assessment is intended to guide instruction in kindergarten through grade 5 classrooms. The CBAS-R has already been developed as a tri-annual assessment of broad reading skills in grades K–5. The new version will assess both broad and component reading skills, including concepts of print, phonological skills, automaticity/fluency, phonics, vocabulary, and comprehension. Because the CBAS-R will be a computer-adaptive test, it will benefit teachers and students by decreasing the amount of time required to administer the assessment, leaving more time for instruction.

Project Activities: The goal of this project is to further develop, validate, and disseminate the CBAS-R. The project team will work with practicing teachers to develop a bank of about 2,600 items. Items will be designed both for skills analysis and for progress monitoring, and will assess both broad and component reading skills. Items will be field tested with K–5 students in mobile computer labs using a matrix sampling approach, which ensures that each item is administered to about 500 students while no student completes all 2,600 items. This item bank forms the foundation of the computer-adaptive measure: an individual student will typically take a 20-item adaptive test anticipated to require eight to twelve minutes to administer. Feedback collected from teachers and students will inform the validity and feasibility of the CBAS-R. The assessment will be disseminated via the CBAS Application Software System for web-based administration, making the CBAS-R available to teachers throughout the United States at low or no cost.
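To illustrate the general idea behind a computer-adaptive test of this kind, the sketch below shows fixed-length, maximum-information item selection under a three-parameter logistic (3PL) IRT model, consistent with the modeling approach described later in this abstract. It is a minimal, hypothetical illustration: the synthetic item bank, the ability-update rule, and all parameter values are assumptions for demonstration only, not the project's actual algorithm, item bank, or software.

# Minimal sketch of a fixed-length computer-adaptive test (CAT) under a 3PL IRT model.
# Illustrative only: item parameters and the synthetic 200-item bank are hypothetical.
import numpy as np

def p_correct(theta, a, b, c):
    """3PL probability of a correct response at ability theta."""
    return c + (1.0 - c) / (1.0 + np.exp(-a * (theta - b)))

def item_information(theta, a, b, c):
    """Fisher information of a 3PL item at ability theta."""
    p = p_correct(theta, a, b, c)
    return (a ** 2) * ((1.0 - p) / p) * ((p - c) / (1.0 - c)) ** 2

def eap_theta(responses, a, b, c, grid=np.linspace(-4, 4, 161)):
    """Expected a posteriori ability estimate with a standard normal prior."""
    prior = np.exp(-0.5 * grid ** 2)
    like = np.ones_like(grid)
    for x, ai, bi, ci in zip(responses, a, b, c):
        p = p_correct(grid, ai, bi, ci)
        like *= p if x == 1 else (1.0 - p)
    post = prior * like
    return np.sum(grid * post) / np.sum(post)

def run_cat(true_theta, a, b, c, test_length=20, rng=np.random.default_rng(0)):
    """Administer a fixed-length CAT using maximum-information item selection."""
    remaining = list(range(len(a)))
    used, responses = [], []
    theta = 0.0  # start at the prior mean
    for _ in range(test_length):
        # pick the unadministered item that is most informative at the current estimate
        info = [item_information(theta, a[i], b[i], c[i]) for i in remaining]
        pick = remaining.pop(int(np.argmax(info)))
        # simulate the student's response, then update the ability estimate
        x = int(rng.random() < p_correct(true_theta, a[pick], b[pick], c[pick]))
        used.append(pick)
        responses.append(x)
        idx = np.array(used)
        theta = eap_theta(responses, a[idx], b[idx], c[idx])
    return theta, used

# Example with a small synthetic item bank (hypothetical parameters).
rng = np.random.default_rng(1)
a = rng.uniform(0.8, 2.0, 200)   # discrimination
b = rng.normal(0.0, 1.0, 200)    # difficulty
c = rng.uniform(0.1, 0.25, 200)  # pseudo-guessing
theta_hat, administered = run_cat(true_theta=0.5, a=a, b=b, c=c)
print(round(theta_hat, 2), administered[:5])

Because each item is chosen where it is most informative for the current ability estimate, a short adaptive test of this kind can approach the precision of a much longer fixed-form test, which is the efficiency rationale described above.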

Products: The product of this project will be a fully developed and validated computer-adaptive assessment of reading skills for grades K–5, the Computer-Based Assessment System for Reading (CBAS-R). Peer-reviewed publications will also be produced.

Structured Abstract

Setting: School districts in rural, suburban, and urban Minnesota will participate in the project.

Sample: Participants will include approximately 13,000 students in grades K–5.

Intervention: Researchers will develop and validate the CBAS-R, a computer-adaptive reading instrument aimed at skills assessment and progress monitoring for students in grades K–5. The CBAS-R has already been developed as a tri-annual assessment of broad reading skills in grades K–5; the new version will assess both broad and component reading skills, including concepts of print, phonological skills, automaticity/fluency, phonics, vocabulary, and comprehension. Score reports will be generated automatically for teachers to aid in identifying students' strengths and weaknesses, monitoring the progress of students' reading skills, and informing instructional choices. The project team anticipates a 20-item computer-adaptive measure that will take eight to twelve minutes to administer. Previous feasibility work with the CBAS-R indicates that teachers found the assessment was not disruptive to classroom activities and was beneficial to instruction.

Research Design and Methods: The CBAS-R will be developed in four phases: (1) develop, parameterize, and model the items; (2) establish administration procedures for skills analysis and progress monitoring; (3) evaluate the validity and feasibility of the CBAS-R; and (4) disseminate the CBAS-R materials and research. In Phase 1, the project team will work with a group of teachers to develop the items. A panel of experts will review and evaluate the items to optimize accuracy, content relevance, fairness, and difficulty, and will conduct a sensitivity review to ensure the items are appropriate for youth from all racial, ethnic, and gender groups. Following item development and revision, the items will be field tested with approximately 11,250 students in grades K–5. Phase 2 will involve developing administration procedures for the skills analysis and progress monitoring features of the CBAS-R. This process will enable the project team to select, for the final version, the items that provide the most information about students' strengths and weaknesses; items will also be balanced for content, and criteria for terminating the assessment will be developed. Phase 3 will involve validity and feasibility studies. Validity will be assessed psychometrically, including by examining the correlations between CBAS-R scores and other measures of reading achievement. Feasibility studies will assess both the ease of test administration and teachers' ease of interpreting and using the results of the CBAS-R. Phase 4 will involve the final development and dissemination of the CBAS administration software.

Control Condition: Due to the nature of the research design, there is no control condition.

Key Measures: Criterion-related validity of the CBAS-R will be assessed using students' reading scores from the Minnesota State reading test, the Measures of Academic Progress, Curriculum-Based Measurement of Reading, and the Dynamic Indicators of Basic Early Literacy Skills. Interviews, rating scales, and observations will be used to assess the feasibility of the assessment in classrooms.

Data Analytic Strategy: Each of the first three phases of the project will involve a different data analytic strategy. Phase 1 will use a three-parameter Item Response Theory (IRT) model to estimate item difficulty, item discrimination, and a pseudo-guessing parameter for each item. Additionally, the project team will use a bifactor model (in which each item loads on a general factor and only one component factor) and full-metric concurrent calibration, so that the final CBAS-R will produce estimates of broad and component reading skills on the same scale. Phase 2 will include a Differential Item Functioning analysis, which will allow the project team to determine whether the CBAS-R items function the same for members of various racial, ethnic, and gender groups. Finally, Phase 3 will include tests of validity and reliability using IRT, along with tests of correlations with reading scores from a variety of assessments already in use.
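For reference, the three-parameter logistic (3PL) model described above is commonly written as follows; the notation here is a standard illustration rather than the project's own, with a_i, b_i, and c_i denoting item i's discrimination, difficulty, and pseudo-guessing parameters and \theta_j denoting student j's reading ability:

P(X_{ij} = 1 \mid \theta_j) = c_i + (1 - c_i)\,\frac{1}{1 + \exp\left[-a_i(\theta_j - b_i)\right]}

Under the bifactor extension noted above, in one common slope-intercept parameterization the single ability \theta_j is replaced by a general reading factor plus the one component-specific factor the item loads on, so the exponent takes the form a_i^{g}\theta_j^{g} + a_i^{s}\theta_j^{s(i)} - d_i for item i's assigned component s(i). Concurrent calibration of all items under this structure is what places broad and component skill estimates on the same scale.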

Products and Publications

Journal article, monograph, or newsletter

Kendeou, P., McMaster, K. L., and Christ, T. J. (2016). Reading Comprehension: Core Components and Processes. Policy Insights from the Behavioral and Brain Sciences, 3, 62–69.