
The Effects of the Content Literacy Continuum on Adolescent Students' Reading Comprehension and Academic Achievement

Regional need and study purpose

Some 70 percent of grade 8 students scored below proficient on the 2007 National Assessment of Educational Progress (NAEP) in reading (Lee, Grigg, and Donahue 2007). Grade 8 students in the Regional Educational Laboratory (REL) Midwest Region states (Illinois, Indiana, Iowa, Michigan, Minnesota, Ohio, and Wisconsin) fared no better, with 64–72 percent scoring below proficient in reading. High schools face the challenge of providing struggling readers with instruction in reading when they arrive in grade 9 while also providing content instruction in core subjects.

Many high schools address students' lack of literacy skills by placing struggling readers in standalone reading classes. Although such classes can help adolescents develop the necessary skills, students' rate of progress is often too slow to bring them up to levels of proficiency that can support the demands of core subject curricula at the high school level (Corrin et al. 2008).

In contrast to relying on reading classes alone to provide struggling readers with literacy support, researchers at the University of Kansas Center for Research on Learning created the Content Literacy Continuum (CLC), an approach that augments these reading classes with literacy-related instruction embedded in all core subject courses. The instructional practices, routines, and embedded learning strategies that make up the cross-content aspect of this intervention have undergone considerable testing and refinement over 30 years, suggesting that CLC can be implemented with fidelity and that it shows promise for improving the literacy and overall content knowledge of high school students (see Schumaker and Deshler 2003). However, very few of the studies on the routines and learning strategies have used research designs that controlled for other potential causal factors (see Deshler and Schumaker 2006 for summaries), and the entire CLC framework has never been tested using a rigorous evaluation methodology.

This research study attempts to provide that critical test. The study aims to measure the causal impact of CLC on students' achievement across content areas using a cluster randomized trial that randomly assigns eligible high schools within each participating school district either to implement the CLC framework or to continue using their current approach to improving literacy skills. The study attempts to answer two research questions: whether CLC improves students' reading comprehension, and whether it improves their achievement in core subject areas.

This project is also designed to describe changes in teacher instruction, the contexts in which CLC is being implemented, the contrast between the CLC approach and other approaches used in participating districts, and the degree to which schools are implementing the program as recommended by its developers.

Intervention description

CLC addresses literacy across the entire curriculum by incorporating literacy activities in a consistent way in standalone reading classes and in all core content classes. CLC is not a single intervention for fostering content literacy at the secondary level, but rather a set of interventions to provide students with gradually more intensive, systematic, and explicit instruction in literacy content, strategies, and skills adapted to their needs.

The study considers three of the five levels that make up CLC:1 enhanced content instruction delivered through content enhancement routines in core subject classes (level 1), learning strategy instruction embedded in core content teaching (level 2), and intensive strategy instruction delivered in standalone reading classes (level 3).

Schools that adopt CLC are assigned a site coordinator who delivers most of the professional development on content enhancement routines and learning strategies to teachers of core content areas. Site coordinators are former teachers from schools that successfully adopted CLC and who have since implemented it in other schools. They interview administrators and teachers to understand the school's culture and structures and to build relationships with staff.

Taking a coaching approach to professional development, site coordinators work with the schools' literacy leadership teams (cross-disciplinary teams of teachers or curriculum leaders) to develop priorities, plan the phased roll-out of routines and strategies, establish level 3 reading classes, and devise a professional development schedule that works best for each school. While some schools have conducted some of the professional development as a two- or three-day summer institute, most professional development is provided each month during two- or three-day visits by the site coordinators. Alone or with a partner, site coordinators work with teachers to develop strategies and routines, demonstrate them in teachers' classrooms, and then observe the teachers as they use the routines or strategies with their students. Site coordinators spend 18–27 days each year at each CLC school.

Coordinators also lead multiday summer training sessions with level 3 reading teachers on the reading curriculum and recommended instructional techniques that accompany the curriculum.2 The instructional practices and routines are consistent with the strategies and routines to be used by core content area teachers.

Finally, site coordinators and the schools' literacy leadership teams choose teachers and staff who understand the framework and are enthusiastic about the intervention to become internal developers in schools and districts. These internal developers will help to sustain CLC once the evaluation project ends and to expand it to higher grades and other schools.

1 The full version of the CLC includes two additional levels of support. Level 4 is intensive basic skills instruction that takes a team approach to reading for students having difficulty with foundational decoding, fluency, and comprehension skills. Level 5 includes therapeutic intervention by special education teachers and other support personnel for students with underlying language disorders.

2 Although CLC developed out of a separate research tradition, similarities between CLC's tiered levels of support and the response to intervention approach have prompted the Center for Research on Learning to refer to CLC as an application of response to intervention to secondary school level literacy (Deshler and Kovaleski 2007; Fuchs and Deshler 2007).

Study design

The study recruited high schools from eight Midwest Region school districts with characteristics associated with greater need for support: high schools with at least 100 grade 9 students, with 33 percent or more students eligible for free or reduced-price lunch, and with fewer than 75 percent of grade 10 or 11 students performing at proficient levels on standardized tests in reading or English language arts. The CLC cross-curriculum and tiered approach also had to differ substantially from other literacy and reform efforts in eligible schools. Finally, school and district leaders had to demonstrate a strong commitment to implement the intervention and facilitate the research team's data collection.

This two-year (2008–10) cluster randomized trial randomly assigned high schools in participating districts to either the intervention group (schools implementing CLC) or the control group (schools continuing with business as usual). The intervention is being phased into schools beginning with grade 9 teachers and students during year 1 (2008/09) and adding grade 10 teachers and the subsequent cohort of grade 9 students in year 2 (2009/10). With 33 high schools agreeing to participate, the study has sufficient power to detect effects equivalent to three to nine months of reading growth.3 The eight districts represent urban, suburban, and rural locales in Indiana (6 schools), Michigan (14 schools), Ohio (9 schools), and Wisconsin (4 schools). Schools are also diverse in background characteristics (table 1).
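The within-district random assignment described above can be sketched as a blocked randomization, with districts as blocks. The helper below is an illustrative sketch under that assumption; the function, school names, and tie-breaking rule are hypothetical, not the study's actual procedure or sites.

```python
import random
from collections import defaultdict

def assign_within_districts(schools, seed=2008):
    """Blocked random assignment: within each district, schools are
    shuffled and split as evenly as possible between the CLC
    intervention and business-as-usual control conditions.
    (Illustrative sketch; the study's actual procedure may differ.)

    schools: list of (school, district) pairs.
    Returns: dict mapping school -> "CLC" or "control".
    """
    rng = random.Random(seed)
    by_district = defaultdict(list)
    for school, district in schools:
        by_district[district].append(school)

    assignment = {}
    for district in sorted(by_district):
        members = by_district[district]
        rng.shuffle(members)
        n_clc = (len(members) + 1) // 2  # odd blocks: extra school to CLC
        for s in members[:n_clc]:
            assignment[s] = "CLC"
        for s in members[n_clc:]:
            assignment[s] = "control"
    return assignment
```

Blocking by district ensures that intervention and control schools are balanced across districts, so district-level differences (curricula, demographics, state tests) cannot be confounded with treatment status; it also explains why the resulting arms can differ by one school (17 intervention, 16 control).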

Table 1. Average characteristics of sample schools, 2005/06

School characteristic Intervention schools Control schools
Student demographics    
Total student population 1,387 1,307
Number of grade 9 students 481 479
Share of students eligible for free or reduced-price lunch (percent) 51 50
Share of students from a racial/ethnic minority (percent) 64 63
Student achievement    
Share of grade 11 English language arts students who are proficient (percent) 55 52
Share of grade 11 math students who are proficient (percent) 48 43
Number of schools making adequate yearly progress 8 5
Promotion powera (percent) 73 63
Sample size (number of schools) 17 16

Note: proficiency measures are administered on a state-by-state basis and are not necessarily comparable.

a Ratio of grade 12 students in 2005/06 to grade 9 students in 2002/03. Data to calculate promotion power were unavailable for two districts.

Source: Researchers' analysis based on school demographic data from the Common Core of Data (U.S. Department of Education 2009b), student achievement data from school report cards available on state education web sites, and promotion power calculated from Common Core of Data (U.S. Department of Education 2009a,b).

3 Depending on the explanatory power of the covariates in the analysis model, the study is powered to detect effects of between 0.07 and 0.25 standard deviation units on a standardized reading assessment. Hill et al. (2008) estimate that an effect size of 0.19 (+/- 0.04) standard deviation units is approximately the expected gain students would make from grade 9 to 10, and an effect size of 0.19 (+/- 0.16) standard deviation units from grade 10 to 11.
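The footnote's point that detectable effects depend on covariate explanatory power follows from a standard minimum detectable effect size (MDES) approximation for two-level cluster randomized trials. The function below is a sketch of that standard formula; the intraclass correlation and R-squared values are illustrative assumptions, not the study's actual power analysis.

```python
import math

def mdes(n_schools, students_per_school, icc, r2_school, multiplier=2.8):
    """Minimum detectable effect size for a two-level cluster randomized
    trial with equal allocation to conditions.

    icc: share of outcome variance lying between schools.
    r2_school: school-level variance explained by covariates.
    multiplier ~2.8 corresponds to 80 percent power at a two-tailed
    alpha of .05. (Illustrative sketch, not the study's power analysis.)
    """
    var = (4.0 / n_schools) * (
        icc * (1 - r2_school) + (1 - icc) / students_per_school
    )
    return multiplier * math.sqrt(var)

# Illustrative values: 33 schools, roughly 480 grade 9 students each.
weak = mdes(33, 480, icc=0.20, r2_school=0.0)    # no covariates
strong = mdes(33, 480, icc=0.20, r2_school=0.9)  # strong pretest covariate
```

Under these assumed values, moving the school-level R-squared from 0 to 0.9 shrinks the MDES from roughly 0.44 to roughly 0.14 standard deviations, illustrating how the explanatory power of the covariates drives the range of detectable effects reported in the footnote.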

Key outcomes and measures

The primary outcomes are student reading skills and student achievement in core subject areas. Program impact will be assessed using student performance on the Group Reading Assessment Diagnostic Evaluation (GRADE), state achievement tests, course grades, and course-taking patterns. The contrast between GRADE scores in intervention and control schools will provide a direct test of impact on student literacy, while student transcripts and report card data from school records will be used to measure achievement in core content area classes. Ancillary analyses will examine the effects of CLC on other student-level outcomes, such as performance on state assessments, attendance rates, and course-taking patterns.

A secondary outcome is increased literacy-focused instruction in the classroom, captured using a classroom observation tool that records the types and amount of reading-related activities (such as prereading, vocabulary, text-level comprehension, and comprehension strategies). Observation data from teachers in intervention schools will be compared with observation data from teachers in control schools to determine whether teachers in intervention schools provide more literacy-focused instruction.

Data collection approach

Data will be collected from intervention and control schools on student reading performance on the GRADE and performance in core subjects on state achievement tests. Data will also be collected on course-taking patterns and course grades.

Other data collection activities include observations of classroom instruction and interviews with school and district leaders. Information from these sources will indicate possible changes in teacher instruction, distinctions in school and classroom context, the contrast between schools in each experimental condition, and fidelity of implementation (table 2).

Table 2. Data collection timeline

Outcome: Baseline data and student achievement in core subjects
Data source: School records (individual student demographic data, tests already administered in the districts, and other student outcome data)
Baseline: School years 2005/06, 2006/07, and 2007/08; collected from districts in quarter 4, 2008, and quarter 1, 2009, for prior years' tests
Follow-up year 1 (2008/09): Grade 9 (cohort 1) student data collected from districts in quarter 4, 2009
Follow-up year 2 (2009/10): Grade 9 (cohort 2) and grade 10 (cohort 1) student data collected from districts in quarter 4, 2010

Outcome: Student literacy
Data source: Literacy assessment (Group Reading Assessment Diagnostic Evaluation, GRADE)
Follow-up year 1 (2008/09): Administered to grade 9 students (cohort 1) in spring 2009
Follow-up year 2 (2009/10): Administered to grade 9 students (cohort 2) and grade 10 students (cohort 1) in spring 2010

Outcome: Classroom instruction
Data source: Observations of classroom instruction; core subject area classes and reading classes observed at each school during site visits
Follow-up year 1 (2008/09): One observation visit of grade 9 classrooms per school in quarter 1, 2009, and one in quarter 2, 2009
Follow-up year 2 (2009/10): One grade 9 and 10 observation visit per school in quarter 4, 2009, and one in quarter 2, 2010

Source: Researchers' analysis.

Analysis plan

Student data will be analyzed using two-level hierarchical linear models, with students at level 1 and schools at level 2. Students' grade 8 standardized test scores and demographic information will serve as covariates at the student level, and average grade 8 test scores, school demographic data, and the share of students eligible for free or reduced-price lunch will serve as covariates at the school level. Separate analyses will be conducted for each cohort. The analyses will focus on the second year of implementation to maximize the potential to observe program impacts: after two years, grade 9 teachers will have more experience with the elements and delivery of the CLC program (potentially producing stronger impacts on grade 9 students' academic outcomes), and some grade 10 students will have received two years of CLC-related instruction and services (potentially benefiting from increased exposure to the program).
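A generic two-level impact model of the kind described above can be sketched as follows; the covariate notation is illustrative, not the study's published specification:

```latex
% Level 1 (student i in school j); X_{ij} = grade 8 score and demographics
Y_{ij} = \beta_{0j} + \beta_{1} X_{ij} + e_{ij},
  \qquad e_{ij} \sim N(0, \sigma^{2})

% Level 2 (school j); T_{j} = 1 for CLC schools, 0 for control;
% W_{j} = school-level covariates (mean grade 8 scores, demographics)
\beta_{0j} = \gamma_{00} + \gamma_{01} T_{j} + \gamma_{02} W_{j} + u_{0j},
  \qquad u_{0j} \sim N(0, \tau^{2})
```

In this sketch the coefficient \(\gamma_{01}\) carries the estimate of program impact: the covariate-adjusted difference in mean outcomes between CLC and control schools.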

A classroom observation tool adopted for this study will be used to record the types and amount of reading-related activities that take place in the classroom. This information can help determine whether teachers in intervention schools provide more literacy-focused instruction.

By testing the impact of CLC with a cluster randomized trial, the research team will be able to determine whether the program, in isolation from other potentially contributing factors, causes improvements in student outcomes. Even with this rigorous design, there are limits to the conclusions that can be drawn from the study. First, high schools were selected based on student characteristics, past student achievement, and commitment to implement the CLC framework, so findings may not generalize to schools with different characteristics. Because the schools in the study nonetheless represent the diversity of high schools in the Midwest Region, the findings should be relevant to a variety of policymakers and practitioners. Second, while the study expects to see initial impacts of the CLC program on student outcomes from two years of implementation (2008–10), initiatives that target whole-school change often require longer implementation to realize their full potential. Any initial finding of impact, or lack of impact, may therefore not represent the full extent of change possible with the CLC program.

Principal investigators

William Corrin, MDRC, and James Lindsay, Learning Point Associates.

Contact information

James Lindsay
Regional Educational Laboratory Midwest at Learning Point Associates
1120 East Diehl Road, Suite 200
Naperville, IL 60563
Phone: (630) 649-6591
Fax: (630) 649-6700
Email: jim.lindsay@learningpt.org

Region: Midwest

References

Corrin, W., Somers, M-A., Kemple, J., Nelson, E., and Sepanik, S. (2008). The enhanced reading opportunities study: findings from the second year of implementation (NCEE 2009-4036). Washington, DC: National Center for Education Evaluation and Regional Assistance, Institute of Education Sciences, U.S. Department of Education.

Deshler, D. D., and Kovaleski, J. F. (2007, December). Secondary applications of RTI: a guided discussion. Presentation at the U.S. Department of Education and National Association of Elementary School Principals symposium Response to Intervention: Improving Achievement for ALL Students Summit, December 6-7, Crystal City, VA. Available at www.nrcld.org/about/presentations/2007/DDSummitNotes.pdf.

Deshler, D. D., and Schumaker, J. B. (2006). Teaching adolescents with disabilities: accessing the general education curriculum. New York: Corwin Press.

Fuchs, D., and Deshler, D. D. (2007). What we need to know about responsiveness-to-intervention (and shouldn't be afraid to ask). Learning Disabilities Research & Practice, 22(2), 129–36.

Grigg, W., Donahue, P., and Dion, G. (2007). The nation's report card: 12th-grade reading and mathematics 2005 (NCES 2007-468). Washington, DC: National Center for Education Statistics, Institute of Education Sciences, U.S. Department of Education.

Hill, C. J., Bloom, H. S., Black, A. R., and Lipsey, M. W. (2008). Empirical benchmarks for interpreting effect sizes in research. Child Development Perspectives, 2(3), 172–77.

Lee, J., Grigg, W., and Donahue, P. (2007). The nation's report card: reading 2007 (NCES 2007-496). Washington, DC: National Center for Education Statistics, Institute of Education Sciences, U.S. Department of Education.

Schumaker, J. B., and Deshler, D. D. (2003). Designs for applied educational research. In H. L. Swanson, K. R. Harris and S. Graham (Eds.), Handbook of learning disabilities (pp. 482-495). New York: Guilford.

U.S. Department of Education, National Center for Education Statistics. (2009a). Public Elementary/Secondary School Universe Survey Data: 2002–2003 (Common Core of Data). Washington, DC: U.S. Department of Education, National Center for Education Statistics.

U.S. Department of Education, National Center for Education Statistics. (2009b). Public Elementary/Secondary School Universe Survey Data: 2005–2006 (Common Core of Data). Washington, DC: U.S. Department of Education, National Center for Education Statistics.
