The Evaluation of Enhanced Academic Instruction in After-School Programs

NCEE 2008-4021
June 2008

Early Findings for Math

In the first year of the study, the evaluation of Mathletics, the math model put in place in 25 after-school centers, produced the following findings:

  • The enhanced math program was implemented as intended (in terms of staff characteristics, training, and usage of instructional materials).
  • Students received an average of 179 minutes of math instruction per week.
  • Math instructors reported that the intended pace of the daily lesson plan was easy to follow.
  • The enhanced program provided students with 30 percent more hours of math instruction over the school year, compared with students in the regular after-school program group.
  • The enhanced math program had positive and statistically significant impacts on student achievement, representing 8.5 percent more growth over the school year for students in the enhanced program group, as measured by the SAT 10 total math score.
  • The math program did not produce statistically significant impacts (either positive or negative) on any of the three school-day academic behavior measures: student engagement, behavior, or homework completion.

Implementation of the Enhanced Math Program

Overall, the enhanced math program was implemented as intended in the 25 centers. Each center was expected to hire four certified teachers and to operate with 10 students per instructor. Across the 25 centers, 97 percent of staff were certified, and the programs operated with the intended small groups of students, averaging 9 students per instructor. Staff were trained by Harcourt staff at the beginning of the year and were provided ongoing assistance.5 They also received paid preparation time. Structured observations of after-school classes, conducted by local district coordinators using a protocol, indicate that 93 percent of observed classes covered the intended content, used the recommended instructional strategies, and kept pace with the daily lesson schedule.

The Service Contrast in Math

Students in the enhanced math program were offered and attended a different set of services during the after-school academic time slots than the regular program group.

The enhanced program offered all of its students academic instruction in math, whereas 15 percent of students in the regular after-school program group were offered academic instruction in math; the other regular-program students received primarily homework help and/or tutoring across multiple subjects. Ninety-seven percent of the staff members providing instruction to students in the enhanced group were certified teachers, compared with 62 percent of the regular after-school program staff. Additionally, 94 percent of enhanced program staff received upfront training, and 95 percent received ongoing support, compared with 55 percent and 70 percent of the regular program staff, respectively. These differences are statistically significant at the 0.05 level.

Students in the enhanced program group attended, on average, 49 more hours of academic instruction in math over the course of the school year than the regular program group received (57.17 hours, compared with 8.57 hours). Given estimates of average school-day instruction, this represents an estimated 30 percent more hours of math instruction for students in the enhanced program. Students in the enhanced program group attended 20 percent more days than those in the regular after-school program group, and this difference is statistically significant (effect size = 0.38).
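
As a quick check of the arithmetic behind this service contrast, the sketch below (illustrative Python using only the figures quoted in this section; the variable names are not from the study) reproduces the roughly 49-hour difference. Note that the 30 percent figure also depends on the study's estimate of average school-day math instruction time, which is not reported here.

    # Illustrative arithmetic only, using the figures quoted above.
    enhanced_hours = 57.17   # average hours of after-school math instruction, enhanced group
    regular_hours = 8.57     # average hours, regular after-school program group

    extra_hours = enhanced_hours - regular_hours
    print(round(extra_hours, 1))  # ~48.6 hours, reported as "49 more hours"

    # The "30 percent more hours" comparison divides extra_hours by an estimate of
    # school-day math instruction time, which is not given in this excerpt.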

Impacts of the Enhanced Math Program

The main objective of the enhanced after-school math program is to improve student academic performance in math. The analysis looks at impacts on all students in the sample, as well as impacts on two sets of subgroups: students in the two lower grades (second and third) separately from those in the two higher grades (fourth and fifth), and students who came to the program with higher levels of prior achievement in math separately from those with lower pre-intervention achievement levels, as defined by the SAT 10 performance standards of “below basic,” “basic,” and “proficient.”6

Access to the enhanced academic after-school math program improved the math performance of students, on average, as measured by the SAT 10, and this finding is statistically significant. In the absence of the intervention, students would have improved their average total math test score by 33.0 scaled score points over the school year.7 With the intervention, the enhanced program group was able to increase its average test score by 35.8 scaled score points. Therefore, the estimated difference between the enhanced and the regular after-school math program groups is 2.8 scaled score points (effect size = 0.06),8 which reflects an 8.5 percent difference in growth. Figure ES.1 illustrates this impact.
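
The impact estimate and the 8.5 percent growth figure follow directly from the scaled-score numbers above; the short sketch below (illustrative Python, not the study's analysis code) walks through that arithmetic.

    # Illustrative arithmetic using the scaled-score figures quoted above.
    growth_regular = 33.0    # expected fall-to-spring growth without the intervention
    growth_enhanced = 35.8   # observed growth for the enhanced program group

    impact = growth_enhanced - growth_regular          # 2.8 scaled score points
    pct_more_growth = impact / growth_regular * 100    # about 8.5 percent more growth
    print(round(impact, 1), round(pct_more_growth, 1))

    # Dividing the impact by the control group's standard deviation of the outcome
    # (not reported in this excerpt) yields the effect size of 0.06 cited above.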

These statistically significant math impacts are also present across multiple subtests and subgroups. The average scores on the math subtests — problem-solving and procedures — for the enhanced program group are 2.5 scaled score points higher (effect size = 0.05) and 4.3 scaled score points higher (effect size = 0.08), respectively, than the average scores of the regular program group students.

The program's impact on total math scores for the fourth- and fifth-grade subgroup is 3.9 scaled score points and is statistically significant. For second- and third-graders, the impact (1.8 scaled score points) is not statistically significant, although the impacts for the higher and lower grades could not be statistically distinguished from each other. Similarly, the impacts for the prior-achievement subgroups (below basic, basic, and proficient) could not be statistically distinguished. The program impacts on total math scores are 2.9 scaled score points (effect size = 0.06) for the below-basic group, 3.3 scaled score points (effect size = 0.07) for the basic group, and 3.0 scaled score points (effect size = 0.07) for the proficient group. Only the estimate for the basic group (which makes up approximately half the sample) is statistically significant.

The analysis also looks at impacts on three measures of student academic behavior — How often do they not complete homework? How often are they attentive in class? How often are they disruptive in class? — for all students in the sample as well as for the two sets of subgroups. Contrary to concerns that the instruction could “overload” students with its academic focus, the findings suggest that enrollment in the enhanced math program did not adversely affect homework completion or the two classroom behavior measures for the full analysis sample or for any of the subgroups, nor did it lead to statistically significant differences in these measures for the enhanced versus the regular program group.

Linking Local School Context to Math Impacts

While the average impact on math test scores is 2.8 scaled score points, not all 25 centers in the study sample experienced this exact difference. Though the study was not designed with the power to detect impacts at the level of individual centers, 17 of the 25 centers did have positive point estimates of Mathletics impacts; 8 of 25 had negative point estimates. Thus, the analysis explored the possibility of variation in impacts for students who attended different types of schools and experienced different program implementation.

Because the effectiveness of after-school instruction may be related to factors associated with program implementation or to what students experience during the regular school day, a correlational analysis examined the moderating effects of school characteristics and of factors of program implementation. It is worth emphasizing that this analysis is nonexperimental and exploratory; it cannot determine whether a particular factor caused the impact to be larger or smaller. For example, factors not accounted for in the analysis could be correlated with both the program impact and certain school characteristics, and could therefore account for an observed relationship.

Nonetheless, this analysis uses a regression framework to link program impacts to the following school characteristics: the hours of in-school instruction in the relevant subject, the similarity of the in-school curriculum to the intervention materials, whether the school met its Adequate Yearly Progress (AYP) goals, the proportion of students receiving free or reduced-price lunch, and the in-school student-to-teacher ratio. The analysis also links impacts to two factors of program implementation: the number of days over the course of the school year that the enhanced math program was offered and whether a teacher from the enhanced program left during the school year. Specifically, a regression with interactions between the treatment indicator and each of these school characteristics and implementation factors is run to examine how the program impact is moderated by these variables. A chi-square test indicates that, overall, this set of school and implementation characteristics is associated with program impacts on the total math SAT 10 score (p-value = 0.05).
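
For readers who want a more concrete picture of this kind of moderation analysis, the sketch below shows one way such a model could be specified in Python with statsmodels. It is a hypothetical illustration, not the study's code: the data file, column names, and clustering choice are all assumptions, and the report does not specify its exact estimation procedure.

    # Exploratory sketch of a moderated-impact regression (hypothetical variable names).
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("math_analysis_sample.csv")  # hypothetical student-level analysis file

    # Interact the treatment indicator with each school / implementation characteristic.
    model = smf.ols(
        "sat10_total_math ~ treatment * (hours_inschool_math + curriculum_similarity"
        " + met_ayp + pct_free_reduced_lunch + student_teacher_ratio"
        " + days_program_offered + teacher_left)",
        data=df,
    )
    # Cluster standard errors by after-school center (the study includes 25 centers).
    results = model.fit(cov_type="cluster", cov_kwds={"groups": df["center_id"]})

    # Joint chi-square (Wald) test: do the interaction terms, as a set, moderate the impact?
    interaction_terms = [name for name in results.params.index if "treatment:" in name]
    joint_test = results.wald_test(
        ", ".join(f"{name} = 0" for name in interaction_terms), use_f=False
    )
    print(joint_test)           # analogous to the reported overall chi-square test (p = 0.05)
    print(results.summary())    # individual t-tests on interaction terms, e.g., treatment:met_ayp

The coefficients on the individual interaction terms in such a model correspond to the t-tests discussed next.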

A t-test from the regression analysis shows that, controlling for all these characteristics, centers meeting AYP goals are associated with a higher program impact (p-value = 0.01). Centers serving schools that employ a direct instructional approach organized by lessons with a spiraled curriculum experience lower program impacts than centers that employ a curriculum similar to Mathletics (p-value = 0.03). With the available information, it is not possible to explain the reasons for these relationships.

Finally, individual t-tests from the regression analysis indicate that none of the other measures has a statistically significant relationship to the impacts of the enhanced math program.



5 Enhanced math program staff received two full days of upfront training on how to use the math materials, including feedback from the developers in practice sessions using the materials. Ongoing support given to the enhanced program staff consisted of multiple on-site technical assistance visits (an average of three), continued support by locally based staff, and daily paid preparation time of 30 minutes.
6 The performance standards are available as part of the SAT 10 scoring. The cut points are criterion-referenced scores, created by a panel of teachers based on what they judge a student should be able to do at a particular level of proficiency.
7 A “scaled score” is a conversion of a student’s raw score on a test to a common scale that allows for numerical comparison between students across different forms and levels of the test. The fall-to-spring growth in test scores for the control group (33 scaled score points, based on the abbreviated SAT 10 test) was bigger than the weighted average growth for students in grades 2 through 5 in a nationally representative sample (18 scaled score points, based on the full-length SAT 10 test). Compared with the national sample, both the enhanced program group and the regular program group in the study sample have a higher proportion of low-performing students. (In the math program sample, 78 percent of the students were performing below proficient in math at the beginning of the program.)
8 “Effect size,” which is used widely for measuring the impacts of educational programs, is defined as the impact estimate divided by the standard deviation of the outcome measure in the underlying population; in practice, this standard deviation is usually estimated using the control group's standard deviation.
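
In symbols, footnote 8's definition can be restated as

    \[ \text{effect size} = \frac{\hat{\Delta}}{\sigma_{\text{control}}}, \]

where \( \hat{\Delta} \) is the estimated program impact and \( \sigma_{\text{control}} \) is the control group's standard deviation of the outcome. As an illustration only (not a figure reported in this excerpt), the total math impact of 2.8 scaled score points with a reported effect size of 0.06 implies a control-group standard deviation of roughly 2.8 / 0.06 ≈ 47 scaled score points.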