The Evaluation of Enhanced Academic Instruction in After-School Programs
NCEE 2009-4077
September 2009

Findings for the Math Program

As mentioned earlier, the math findings presented in this report pertain to the 15 centers that participated in two years of program operations and data collection.

Implementation of the Enhanced After-School Math Program

Overall, the enhanced math program was largely implemented as intended in both years of program operations. Each center was expected to hire certified teachers and to operate with 10 students per instructor. In the first year, for example, 98 percent of instructors were certified teachers, and programs operated with the intended small groups of students: on average, eight students attended per instructor. The goal was to offer the program for approximately 180 minutes per week; average offerings were 189 minutes in the first year (statistically significantly more than the intended amount, p-value = 0.00) and 171 minutes in the second (not statistically different from the intended amount, p-value = 0.45). Instructors were trained by Harcourt staff at the beginning of the year and received ongoing assistance.5 They also received paid preparation time.

Impacts from Offering One Year of the Enhanced Math Program

The impact of enrollment in one year of the enhanced math program on student outcomes is estimated by comparing the outcomes of students who were randomly assigned to enroll in the enhanced after-school math program for one school year with the outcomes of students who were randomly assigned to remain in the regular after-school program during that same school year.6 This is estimated separately for each implementation year (Cohorts 1 and 2).

On average, students in the enhanced program group in Cohort 1 received 48 more hours of academic instruction in math during the school year than students in the regular program group. This difference, which is statistically significant (p-value = 0.00), represents an estimated 30 percent increase in total math instruction over and above what these students received during the regular school day. In Cohort 2, enhanced program students received 42 more hours, also a statistically significant difference (p-value = 0.00) and an estimated 26 percent increase in total math instruction. However, the number of added hours of math instruction was statistically significantly smaller in the second year of implementation (42 hours) than in the first (48 hours) (p-value = 0.00).
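
These percentages imply a base of roughly 160 hours of regular-school-day math instruction (a back-of-the-envelope calculation from the figures above, not a quantity reported directly in this summary):

\[
\text{Cohort 1: } \frac{48 \text{ added hours}}{0.30} \approx 160 \text{ hours}, \qquad
\text{Cohort 2: } \frac{42 \text{ added hours}}{0.26} \approx 162 \text{ hours}.
\]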

One year of enrollment in the enhanced after-school program had a positive and statistically significant impact on students’ math achievement in Cohort 1, as measured by SAT 10 total math scores (3.5 scaled score points, or 0.09 standard deviation). This statistically significant impact represents a 10 percent improvement over what students in the enhanced group would have achieved had they not had access to the enhanced program, or about one month’s extra learning over the course of a nine-month school year. The estimated impact of the enhanced math program on SAT 10 total math scores is not statistically significant for students in the second year of implementation (p-value = 0.07). However, the difference in impacts between implementation years (the Cohort 1 and Cohort 2 samples) is not statistically significant. Thus, it cannot be concluded that the enhanced after-school math program was more effective in one implementation year than the other.
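
The “one month” benchmark can be read as a simple scaling exercise (assuming, for illustration, that learning accrues evenly across a nine-month school year):

\[
0.10 \times 9 \text{ months} \approx 0.9 \text{ months} \approx 1 \text{ month of additional learning}.
\]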

One year of enrollment in the enhanced math program also had a positive and statistically significant impact on students’ performance on locally administered standardized math tests for Cohort 2 (0.18 standard deviation, p-value = 0.01). The difference in one-year impacts across cohorts is not statistically significant (p-value = 0.16), so it cannot be concluded that the impact of the enhanced program on locally administered tests differed from one implementation year to the other. However, one year of enrollment did not produce impacts on regular-school-day teacher reports of academic behaviors (homework completion, attentiveness in class, and disruptiveness in class).

Impacts from Offering Two Years of the Enhanced Math Program

The impact of offering students the opportunity to participate in the enhanced program for two consecutive years is estimated using the two-year sample, by comparing the outcomes of students who were randomly assigned to the enhanced after-school program for two consecutive school years with the outcomes of students who were randomly assigned to the regular after-school program over the same period.7 However, as mentioned above, to maintain the experimental design, all Cohort 1 students were randomly assigned, both those who reapplied in the second year (applicants) and those who did not (nonapplicants). Thus, 42 percent of students in the math sample who were offered two years of the enhanced program did not reapply for, and did not receive, the second year of program services. Hence, the impact findings presented in this section represent the effect of a two-year offer of services (an intent-to-treat analysis), rather than the impact of actually receiving two years of the enhanced program; the latter is addressed in a nonexperimental analysis discussed later in this summary.
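
As a minimal sketch of the intent-to-treat logic described above (using a hypothetical analysis file and variable names, not the study’s actual data or covariate set), the impact of the two-year offer can be estimated by regressing the follow-up test score on the randomized offer, regardless of whether students reapplied or attended in the second year:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical file: one row per student in the two-year math sample.
# 'offered_enhanced' is the randomized two-year offer (1 = enhanced, 0 = regular),
# 'sat10_total_math' is the follow-up SAT 10 total math scaled score, and
# 'baseline_math' is the score measured before random assignment.
df = pd.read_csv("two_year_math_sample.csv")

# Intent-to-treat estimate: the effect of being offered two years of the
# enhanced program, whether or not the student actually received year 2.
itt = smf.ols("sat10_total_math ~ offered_enhanced + baseline_math", data=df).fit()
print(itt.params["offered_enhanced"], itt.pvalues["offered_enhanced"])
```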

The estimated impact of offering students the opportunity to participate in the enhanced after-school program for two consecutive years is not statistically significant (2.0 scaled score points on the SAT 10 total math score, p-value = 0.52). To place these results in context, the impact of these students’ first year in the enhanced program was also estimated and compared with their cumulative two-year impact. Their first-year impact is not statistically significant (5.2 scaled score points, p-value = 0.07), and the estimated impact of assigning students to two years of enhanced services is not statistically different from the impact of their first year of access to the program (p-value = 0.28). Hence, for this sample, there is no evidence that offering the enhanced math instruction for a second year provides an added benefit.

Figure ES.2 places these impact estimates in the context of the actual and expected two-year achievement growth of students in the enhanced program group. It shows the two-year growth for students in the enhanced program and what their expected growth would have been had they been assigned to the regular program, along with the test score growth of a nationally representative sample of students. The test scores of students in the enhanced program group grew 66.3 points over the two years (44.5 points in the first year and 21.8 points in the second). Test scores of students in the regular program group grew by 64.3 points (39.4 points in the first year and 24.9 points in the second). These growth rates for the two program groups produce the estimated (not statistically significant) impacts mentioned above: a five-point difference in test scores for this sample after one year and a two-point difference after two years.
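
The differences implied by these growth figures can be checked with simple subtraction (small discrepancies from the reported estimates reflect rounding):

\[
44.5 - 39.4 = 5.1 \approx 5 \text{ points after one year}, \qquad
66.3 - 64.3 = 2.0 \text{ points after two years}.
\]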

Because not all students in the enhanced program group actually received a second year of enhanced services, a nonexperimental analysis was conducted to examine whether longer exposure to the enhanced program is associated with improved math achievement. This analysis is based on instrumental variables estimation, which makes it possible to statistically adjust for the 42 percent of students in the enhanced program group who never attended the enhanced program in the second year. These findings do not support causal inferences and thus should be viewed as hypothesis-generating. However, such an analysis may help with interpreting the two-year impacts and provide useful information to program developers.
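
A minimal sketch of this kind of adjustment is the Wald (instrumental variables) estimator, shown below with hypothetical variable names; the study’s actual instrumental variables specification is more elaborate, so this is illustrative only. Random assignment to the two-year offer serves as the instrument for actually receiving a second year of services:

```python
import pandas as pd

df = pd.read_csv("two_year_math_sample.csv")  # hypothetical analysis file

offered = df[df["offered_enhanced"] == 1]
regular = df[df["offered_enhanced"] == 0]

# Intent-to-treat effect of the two-year offer on follow-up math scores.
itt = offered["sat10_total_math"].mean() - regular["sat10_total_math"].mean()

# First stage: how much the offer shifts actual receipt of year-2 services
# (about 0.58 here, since 42 percent of the offered group never attended in year 2).
first_stage = offered["received_year2"].mean() - regular["received_year2"].mean()

# Wald/IV estimate of the effect of receiving the second year of enhanced services.
tot = itt / first_stage
print(tot)
```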

The findings from this nonexperimental analysis suggest that there is no additional benefit to a second year of enhanced services, even after adjustments are made for students who did not attend a second year. The nonexperimental estimate of the effect of receiving two years of enhanced after-school services (3.7 scaled score points on SAT 10 total math scores, p-value = 0.36) does not statistically differ from the estimated 5.2 scaled score point impact of one year of enhanced services (p-value = 0.40). Thus, across both the experimental and nonexperimental analyses, there is no evidence that a second year of the enhanced program, whether offered or received, improves math achievement over and above the gains produced by the first year of enrollment.

Because the effectiveness of enhanced after-school instruction may be related to factors associated with program implementation or what the students experience during the regular school day, the study also examined whether characteristics of schools and program implementation are correlated with center-level impacts. The analysis is based on center-level impacts in both years of the study (i.e., 30 center-level impacts) and examines whether the impact of one year of enhanced services on SAT 10 total math scores in each after-school center is associated with (1) the characteristics of the school that housed the after-school center and (2) the characteristics of a center’s implementation of the enhanced program.
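
A schematic version of this exploratory analysis (hypothetical file and variable names; the actual set of school context and implementation measures is described in the full report) regresses the 30 center-level impact estimates on center and school characteristics:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical file: one row per center-year (30 rows), containing each center's
# estimated impact on SAT 10 total math scores and selected characteristics.
centers = pd.read_csv("center_level_impacts.csv")

model = smf.ols(
    "impact_sat10_math ~ days_offered + teacher_turnover + made_ayp",
    data=centers,
).fit()
print(model.summary())  # associations between characteristics and center-level impacts
```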

Though center-level program impacts on total math scores are jointly correlated with the overall set of school context and implementation measures included in the analysis, as well as with some individual measures, no clear lessons emerge for program operations. Program impacts were larger in after-school centers that offered the enhanced program for a greater number of days during the school year, suggesting a positive association between impacts and program dosage; however, this finding is inconsistent with the nonexperimental estimates of two versus one year of enhanced program participation. Program impacts were also larger in centers where one or more teachers left the enhanced program during the school year and in schools that met their Adequate Yearly Progress goals. With the available information, it is not possible to explain these relationships.


5 Enhanced math program staff received two full days of upfront training on how to use the math materials, including feedback from the developers in practice sessions using the materials. Ongoing support given to the enhanced program staff consisted of multiple on-site technical assistance visits (in the first year by Harcourt and Bloom Associates and in the second year by Bloom Associates) and continued support by locally based staff.
6 Referring back to Figure ES.1, the analysis compared E1 versus R1 in the Cohort 1 sample and, in the Cohort 2 sample, R1E2 versus R1R2 (applicants who had not received the program in the first year) and N1E2 versus N1R2 (new students in the second year). An overall F-test indicates there is no systematic difference in the baseline characteristics of students in the enhanced and regular program groups in either of the cohort-specific samples.
7 An overall F-test indicates there is no systematic difference in the baseline characteristics of students in the enhanced and regular program groups in the two-year sample.