
IES Grant

Title: Developing Accessible and Valid Reading Assessments: A Research-Based Solution
Center: NCSER
Year:
Principal Investigator: Laitusis, Cara Cahalan
Awardee: Educational Testing Service (ETS)
Program: Unsolicited and Other Awards: Special Education Research
Award Period: 10/1/2004 to 9/30/2009
Award Amount: $4,601,225*
Award Number: H324F040001

Funded through OSEP prior to the establishment of NCSER

Purpose: Comprehension is the ultimate educational goal of reading. Several subcomponent skills are essential building blocks to mastery of reading comprehension: phonemic awareness, phonics, fluency, vocabulary, and comprehension strategies. Identifying profiles of strengths and weaknesses across these subcomponents is essential for developing individual interpretive, descriptive, and diagnostic reports for students with learning disabilities that affect reading. This project will design and develop an accessible reading proficiency assessment that isolates the components of reading into different test sections so that scores can be reported by component for improved student accountability testing, progress monitoring, and instructional planning.

Project Activities: Project activities will focus on four areas. First, a definition of reading proficiency that addresses the five components of reading will be developed in four phases. The definition will be grounded in research on reading theory, will align with state standards for reading, and will be revised based on input from focus groups of national experts. Second, a research agenda will be developed to examine how different testing accommodations affect different reading constructs and to explore the development of new items that effectively assess the components of reading. This agenda will extend over a two-year period and will include multiple studies of existing assessments as well as the development of a new assessment. Third, principles and guidelines for designing accessible reading tests will be developed, covering construct definition, the fundamentals of test design, development of test items and full tests, field testing, and revision. Focus groups with key stakeholders will be conducted to ensure that the principles and guidelines are responsive to the needs of students with reading disabilities and are comprehensive and valid. Fourth, an accessible, diagnostic reading proficiency assessment will be developed and field tested.

Products: Project outcomes will include:

  1. A definition of the construct of reading proficiency that can be used as a basis for research and test development;
  2. A program of research on the assessment of reading proficiency that will address issues of accessibility, validity, and comparability for students with reading disabilities;
  3. Guidelines and principles for making large-scale reading assessments more accessible for students with disabilities; and
  4. The design, development, and field testing of an accessible, diagnostic reading assessment.

Setting: Primarily in large-scale assessment programs in public school systems. Some studies conducted by this project will make use of extant data from a large state testing system, and some will involve experimental tests of reading assessment accessibility features.

Population: The number of subjects to be included in the field test has not yet been determined. Students with learning disabilities at 4th and 8th grades are the primary focus, although students with other disabilities and at other grades may be included in some studies.

Intervention: The intervention of interest is an assessment of reading proficiency with accessibility features built into the design.

Research Design and Methods: A variety of research designs and methods are being employed across the project's studies, including analyses of extant data from statewide assessments (differential item functioning, factor analysis, etc.); differential boost designs to test the effects of accessibility features; and cognitive labs ("think aloud" methods) to study student responses to test items.

Control Condition: The differential boost studies use two comparison conditions: assessments administered without accessibility features and students without disabilities.

Key Measures: Various assessment instruments are being employed in the studies conducted by this project. These include the California Standardized Testing and Reporting (STAR) assessment used in factor analyses, and the Gates-MacGinitie Reading Tests (GMRT) used in a differential boost study with selected Woodcock-Johnson reading subtests as covariates. The project will conclude by designing and field testing a reading assessment to demonstrate accessibility features.

Data Analytic Strategy: Various analysis strategies are being employed. Large sets of assessment data are being analyzed by means of differential item functioning (DIF) analyses and factor analysis. The differential boost study data are being analyzed using repeated-measures ANCOVA, factor analysis, DIF, and differential distractor functioning (DDF) analyses.
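As an illustration of the DIF analyses mentioned above, the sketch below computes the Mantel-Haenszel common odds-ratio estimate, a standard DIF statistic for dichotomously scored items. This is a generic example, not the project's actual procedure, and the function name and data layout are assumptions for illustration only.

```python
from collections import defaultdict

def mantel_haenszel_dif(responses, groups, item_index):
    """Mantel-Haenszel common odds-ratio estimate for one studied item.

    responses:  list of 0/1 response vectors, one per examinee
    groups:     'ref' (reference) or 'focal' label per examinee
    item_index: index of the item being studied for DIF

    Returns alpha_MH; values far from 1.0 suggest the item functions
    differently for the two groups after matching on ability.
    (This is a hypothetical helper, not part of any named package.)
    """
    # Stratify examinees by total score on the remaining items,
    # so comparisons are made between examinees of similar ability.
    strata = defaultdict(lambda: {'A': 0, 'B': 0, 'C': 0, 'D': 0})
    for resp, grp in zip(responses, groups):
        matching = sum(resp) - resp[item_index]  # exclude studied item
        right = resp[item_index] == 1
        cell = strata[matching]
        if grp == 'ref':
            cell['A' if right else 'B'] += 1  # reference right/wrong
        else:
            cell['C' if right else 'D'] += 1  # focal right/wrong
    # Pool the per-stratum 2x2 tables into one common odds ratio.
    num = den = 0.0
    for c in strata.values():
        n = c['A'] + c['B'] + c['C'] + c['D']
        if n == 0 or c['B'] * c['C'] == 0 and c['A'] * c['D'] == 0:
            continue
        num += c['A'] * c['D'] / n
        den += c['B'] * c['C'] / n
    return num / den
```

On the ETS delta scale often reported alongside this statistic, the estimate is transformed as -2.35 × ln(alpha_MH), so alpha_MH = 1.0 corresponds to no DIF.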

Products and Publications

Book chapter

Laitusis, C.C., Cook, L.L., Buzick, H.M., and Stone, E. (2011). Adaptive Testing Options for Accountability Assessments. In M. Russell, and M. Kavanaugh (Eds.), Assessing Students in the Margins: Challenges, Strategies, and Techniques (pp. 291–310). Charlotte, NC: Information Age Publishing.

Journal article, monograph, or newsletter

Cook, L.L., Eignor, D., Sawaki, Y., Steinberg, J., and Cline, F. (2010). Using Factor Analysis to Investigate Accommodations Used by Students With Disabilities on an English-Language Arts Assessment. Applied Measurement in Education, 23(2): 187–208. doi:10.1080/08957341003673831

Laitusis, C.C. (2008). State Reading Assessments and the Inclusion of Students With Dyslexia. Perspectives on Language and Literacy, 33: 31–33.

Laitusis, C.C. (2010). Examining the Impact of Audio Presentation on Tests of Reading Comprehension. Applied Measurement in Education, 23(2): 153–167. doi:10.1080/08957341003673815

Nongovernment report, issue brief, or practice guide

Cline, F., Johnstone, C., and King, T. (2006). Focus Group Reactions to Three Definitions of Reading. Minneapolis, MN: National Accessible Reading Assessment Projects.

King, T.C., and Laitusis, C.C. (2008). Sample Cognitive Interview Protocol. Princeton, NJ: Educational Testing Service.

Laitusis, C.C., and Cook, L.L. (2008). Reading Aloud as an Accommodation for a Test of Reading Comprehension. Princeton, NJ: Educational Testing Service.

National Accessible Reading Assessment Projects (2006). Defining Reading Proficiency for Accessible Large-Scale Assessments: Some Guiding Principles and Issues. Minneapolis, MN: National Accessible Reading Assessment Projects.

Thurlow, M.L., Laitusis, C.C., Dillon, D.R., Cook, L.L., Moen, R.E., Abedi, J., and O'Brien, D.G. (2009). Accessibility Principles for Reading Assessments. Minneapolis, MN: National Accessible Reading Assessment Projects.

* The dollar amount includes funds from the Office of Special Education Programs (OSEP) and NCSER.