
Understanding the Effect of KIPP as It Scales: Volume I, Impacts on Achievement and Other Outcomes. Final Report of KIPP's "Investing in Innovation Grant Evaluation"
Tuttle, Christina Clark; Gleason, Philip; Knechtel, Virginia; Nichols-Barrer, Ira; Booker, Kevin; Chojnacki, Gregory; Coen, Thomas; Goble, Lisbeth (2015). Mathematica Policy Research, Inc. Retrieved from: https://eric.ed.gov/?id=ED560079
- Examining 14,551 students, grades 6-8
Knowledge is Power Program (KIPP) Intervention Report - Charter Schools
Review Details
Reviewed: January 2018
- Quasi-Experimental Design
- Meets WWC standards with reservations because it uses a quasi-experimental design in which the analytic intervention and comparison groups satisfy the baseline equivalence requirement (an illustrative sketch of this check appears after these notes).
This review may not reflect the full body of research evidence for this intervention.
Evidence Tier rating based solely on this study. This intervention may achieve a higher tier when combined with the full body of evidence.
Please see the WWC summary of evidence for Knowledge is Power Program (KIPP).
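As background for the baseline equivalence requirement noted above, the sketch below shows one common way such a check is operationalized, assuming the WWC convention that a baseline difference larger than 0.25 pooled standard deviations fails the requirement and a difference between 0.05 and 0.25 requires statistical adjustment. The function name and thresholds here are illustrative and are not drawn from this study's analysis.

```python
import numpy as np

def baseline_equivalence(intervention, comparison):
    """Illustrative baseline equivalence check on a single baseline
    measure (e.g., prior-year reading z-scores), using the commonly
    cited 0.05 / 0.25 standard-deviation thresholds."""
    intervention = np.asarray(intervention, dtype=float)
    comparison = np.asarray(comparison, dtype=float)

    # Pooled standard deviation of the baseline measure.
    n_i, n_c = len(intervention), len(comparison)
    pooled_var = ((n_i - 1) * intervention.var(ddof=1) +
                  (n_c - 1) * comparison.var(ddof=1)) / (n_i + n_c - 2)
    pooled_sd = np.sqrt(pooled_var)

    # Standardized difference between group means, in effect-size units.
    diff = abs(intervention.mean() - comparison.mean()) / pooled_sd

    if diff <= 0.05:
        return "satisfied"
    elif diff <= 0.25:
        return "requires statistical adjustment"
    return "not satisfied"
```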
Findings
Reading achievement

| Outcome measure | Comparison | Period | Sample | Intervention mean | Comparison mean | Significant? | Improvement index | Evidence tier |
|---|---|---|---|---|---|---|---|---|
| Statewide assessment of reading achievement (z-score) | Knowledge is Power Program (KIPP) vs. Business as usual | 4 Years | Tuttle 2015 Middle school: matched-student sample | 0.08 | -0.09 | Yes | | |

Supplemental findings

| Outcome measure | Comparison | Period | Sample | Intervention mean | Comparison mean | Significant? | Improvement index | Evidence tier |
|---|---|---|---|---|---|---|---|---|
| End-of-Year Reading Assessments | Knowledge is Power Program (KIPP) vs. Business as usual | 3 Years | Tuttle 2010 Full sample | N/A | N/A | Yes | | |
| Statewide assessment of reading achievement (z-score) | Knowledge is Power Program (KIPP) vs. Business as usual | 3 Years | Tuttle 2015 Middle school: matched-student sample | 0.06 | -0.09 | Yes | | |
| End-of-Year Reading Assessments | Knowledge is Power Program (KIPP) vs. Business as usual | 2 Years | Tuttle 2010 Full sample | N/A | N/A | Yes | | |
| End-of-Year Reading Assessments | Knowledge is Power Program (KIPP) vs. Business as usual | 4 Years | Tuttle 2010 Full sample | N/A | N/A | Yes | | |
| Statewide assessment of reading achievement (z-score) | Knowledge is Power Program (KIPP) vs. Business as usual | 2 Years | Tuttle 2015 Middle school: matched-student sample | -0.01 | -0.11 | Yes | | |
| Statewide assessment of reading achievement (z-score) | Knowledge is Power Program (KIPP) vs. Business as usual | 1 Year | Tuttle 2015 Middle school: matched-student sample (new KIPP middle schools) | -0.22 | -0.27 | Yes | | |
| Reading test score | Knowledge is Power Program (KIPP) vs. Business as usual | 1 Year | Tuttle 2013 Full matched comparison sample | N/A | N/A | Yes | | |
| Statewide assessment of reading achievement (z-score) | Knowledge is Power Program (KIPP) vs. Business as usual | 1 Year | Tuttle 2015 Middle school: matched-student sample | -0.11 | -0.11 | No | -- | |
Mathematics achievement

| Outcome measure | Comparison | Period | Sample | Intervention mean | Comparison mean | Significant? | Improvement index | Evidence tier |
|---|---|---|---|---|---|---|---|---|
| Statewide mathematics assessments (z-score) | Knowledge is Power Program (KIPP) vs. Business as usual | 4 Years | Tuttle 2015 Middle school: matched-student sample | 0.14 | -0.13 | Yes | | |

Supplemental findings

| Outcome measure | Comparison | Period | Sample | Intervention mean | Comparison mean | Significant? | Improvement index | Evidence tier |
|---|---|---|---|---|---|---|---|---|
| Achievement in mathematics | Knowledge is Power Program (KIPP) vs. Business as usual | 3 Years | Tuttle 2010 Full sample | N/A | N/A | Yes | | |
| Achievement in mathematics | Knowledge is Power Program (KIPP) vs. Business as usual | 2 Years | Tuttle 2010 Full sample | N/A | N/A | Yes | | |
| Achievement in mathematics | Knowledge is Power Program (KIPP) vs. Business as usual | 4 Years | Tuttle 2010 Full sample | N/A | N/A | Yes | | |
| Statewide mathematics assessments (z-score) | Knowledge is Power Program (KIPP) vs. Business as usual | 3 Years | Tuttle 2015 Middle school: matched-student sample | 0.17 | -0.12 | Yes | | |
| Statewide mathematics assessments (z-score) | Knowledge is Power Program (KIPP) vs. Business as usual | 2 Years | Tuttle 2015 Middle school: matched-student sample | 0.09 | -0.14 | Yes | | |
| Math test score | Knowledge is Power Program (KIPP) vs. Business as usual | 1 Year | Tuttle 2013 Full matched comparison sample | N/A | N/A | Yes | | |
| Statewide mathematics assessments (z-score) | Knowledge is Power Program (KIPP) vs. Business as usual | 1 Year | Tuttle 2015 Middle school: matched-student sample | -0.05 | -0.11 | Yes | | |
| Statewide mathematics assessments (z-score) | Knowledge is Power Program (KIPP) vs. Business as usual | 1 Year | Tuttle 2015 Middle school: matched-student sample (new KIPP middle schools) | -0.19 | -0.23 | No | -- | |
Science achievement

| Outcome measure | Comparison | Period | Sample | Intervention mean | Comparison mean | Significant? | Improvement index | Evidence tier |
|---|---|---|---|---|---|---|---|---|
| Statewide science assessments (z-score) | Knowledge is Power Program (KIPP) vs. Business as usual | 4 Years | Tuttle 2015 Middle school: matched-student sample | 0.08 | -0.17 | Yes | | |

Supplemental findings

| Outcome measure | Comparison | Period | Sample | Intervention mean | Comparison mean | Significant? | Improvement index | Evidence tier |
|---|---|---|---|---|---|---|---|---|
| Science | Knowledge is Power Program (KIPP) vs. Business as usual | 3 Years | Tuttle 2013 Full matched comparison sample | N/A | N/A | Yes | | |
Social studies achievement

| Outcome measure | Comparison | Period | Sample | Intervention mean | Comparison mean | Significant? | Improvement index | Evidence tier |
|---|---|---|---|---|---|---|---|---|
| Statewide assessments of history achievement (z-score) | Knowledge is Power Program (KIPP) vs. Business as usual | 4 Years | Tuttle 2015 Middle school: matched-student sample | 0.11 | -0.13 | Yes | | |

Supplemental findings

| Outcome measure | Comparison | Period | Sample | Intervention mean | Comparison mean | Significant? | Improvement index | Evidence tier |
|---|---|---|---|---|---|---|---|---|
| Statewide social studies assessments (z-score) | Knowledge is Power Program (KIPP) vs. Business as usual | 3 Years | Tuttle 2013 Full matched comparison sample | N/A | N/A | Yes | | |
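Because the statewide assessment outcomes in the tables above are reported in z-score (standard deviation) units, the group means can be translated into approximate percentile ranks under a normality assumption. The sketch below uses the year-4 reading means from the first table purely for illustration; it does not reproduce the study's estimation procedure.

```python
from scipy.stats import norm

# Year-4 reading means from the table above, in z-score units.
kipp_mean, comparison_mean = 0.08, -0.09

# Under a normality assumption, a z-score maps to a percentile rank.
kipp_pct = norm.cdf(kipp_mean) * 100          # roughly the 53rd percentile
comparison_pct = norm.cdf(comparison_mean) * 100  # roughly the 46th percentile

print(f"KIPP percentile: {kipp_pct:.0f}, "
      f"comparison percentile: {comparison_pct:.0f}, "
      f"gap: {kipp_pct - comparison_pct:.0f} percentile points")
```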
Sample Characteristics
Characteristics of study sample as reported by study author.
- English language learners: 7%
- Female: 51%; Male: 49%
- Urban
- Arkansas, Colorado, District of Columbia, Georgia, Massachusetts, North Carolina, New York, Tennessee, Texas
- Race: Black 58%; Other or unknown 42%
- Ethnicity: Hispanic 39%; Not Hispanic or Latino 61%
Study Details
Setting
This analysis includes students and schools in multiple states and districts in the United States where KIPP charter schools operate. The study took place in 43 middle schools in the KIPP network in 20 cities across the following 12 states and the District of Columbia: Arkansas, California, Colorado, Georgia, Maryland, Massachusetts, New Jersey, New York, North Carolina, Pennsylvania, Tennessee, and Texas.
Study sample
The study used a matched-student quasi-experimental design in which the intervention group consisted of students who attended 37 KIPP middle schools and the comparison group consisted of students matched on baseline characteristics: baseline reading and math test scores; gender, race, special education status, limited English proficiency status, and free or reduced-price lunch status; and whether the student repeated a grade in the baseline year. Sample characteristics for the analysis samples with non-imputed baseline data, on which the WWC based the intervention’s effectiveness rating, are not reported.
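For readers unfamiliar with matched-student designs, the sketch below illustrates one generic way to pair each KIPP student with a similar comparison student using nearest-neighbor matching on standardized baseline characteristics. This is only an illustration under simplified assumptions; the study's actual matching procedure is documented in the report itself.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

def match_students(kipp_features, non_kipp_features):
    """Illustrative 1-to-1 nearest-neighbor matching on baseline
    characteristics (e.g., prior test scores and demographic indicators).
    Inputs have one row per student and one column per characteristic.
    Returns, for each KIPP student, the row index of the most similar
    non-KIPP student. Not the study's actual matching procedure."""
    kipp = np.asarray(kipp_features, dtype=float)
    non_kipp = np.asarray(non_kipp_features, dtype=float)

    # Standardize each characteristic so no single variable dominates
    # the distance calculation.
    pooled = np.vstack([kipp, non_kipp])
    mean, sd = pooled.mean(axis=0), pooled.std(axis=0)
    sd[sd == 0] = 1.0  # guard against constant columns
    kipp_z = (kipp - mean) / sd
    non_kipp_z = (non_kipp - mean) / sd

    # For each KIPP student, find the closest comparison student.
    nn = NearestNeighbors(n_neighbors=1).fit(non_kipp_z)
    _, indices = nn.kneighbors(kipp_z)
    return indices.ravel()
```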
Intervention Group
Students in the intervention condition attended a KIPP middle school at some point over the period 2001–13.
Comparison Group
Students in the comparison condition attended non-KIPP middle schools.
Support for implementation
The study did not provide information about implementation support; however, the authors noted that staff at KIPP schools had considerable autonomy in the implementation process to set the direction of the school (p. 22).
Additional Sources
In the case of multiple manuscripts that report on one study, the WWC selects one manuscript as the primary citation and lists other manuscripts that describe the study as additional sources.
-
Tuttle, Christina Clark; Gleason, Philip; Knechtel, Virginia; Nichols-Barrer, Ira; Booker, Kevin; Chojnacki, Gregory; Coen, Thomas; Goble, Lisbeth. (2015). Going to Scale: As KIPP Network Grows, Positive Impacts Are Sustained. In Focus. Mathematica Policy Research, Inc.
-
Gleason, Philip M.; Tuttle, Christina Clark; Gill, Brian; Nichols-Barrer, Ira; Teh, Bing-ru. (2014). Do KIPP Schools Boost Student Achievement? Education Finance and Policy, v9 n1, p36-58.
-
Tuttle, Christina Clark; Teh, Bing-ru; Nichols-Barrer, Ira; Gill, Brian P.; Gleason, Philip. (2010). Supplemental Analytic Sample Equivalence Tables for Student Characteristics and Achievement in 22 KIPP Middle Schools: A Report from the National Evaluation of KIPP Middle Schools. Mathematica Policy Research, Inc.
-
Tuttle, Christina Clark; Teh, Bing-ru; Nichols-Barrer, Ira; Gill, Brian P.; Gleason, Philip. (2010). Student Characteristics and Achievement in 22 KIPP Middle Schools: Final Report. Mathematica Policy Research, Inc.
-
Tuttle, Christina Clark; Gill, Brian; Gleason, Philip; Knechtel, Virginia; Nichols-Barrer, Ira; Resch, Alexandra. (2013). KIPP Middle Schools: Impacts on Achievement and Other Outcomes. Final Report. Mathematica Policy Research, Inc.
An indicator of the effect of the intervention, the improvement index can be interpreted as the expected change in percentile rank for an average comparison group student if that student had received the intervention.
For more, please see the WWC Glossary entry for improvement index.
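Consistent with that definition, an improvement index is commonly computed by converting an effect size to the percentile rank of an average comparison student under a normal-distribution assumption and subtracting 50. The sketch below uses a hypothetical effect size of 0.20 standard deviations for illustration; it is not a value reported in this study, and the WWC Handbook describes the exact procedure.

```python
from scipy.stats import norm

def improvement_index(effect_size):
    """Illustrative improvement index: the expected percentile rank of an
    average comparison student if given the intervention, minus 50."""
    return norm.cdf(effect_size) * 100 - 50

# Hypothetical effect size of 0.20 standard deviations:
print(round(improvement_index(0.20)))  # about +8 percentile points
```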
An outcome is the knowledge, skills, and attitudes that are attained as a result of an activity. An outcome measure is an instrument, device, or method that provides data on the outcome.
A finding that is included in the effectiveness rating. Excluded findings may include subgroups and subscales.
The sample on which the analysis was conducted.
The group to which the intervention group is compared, which may include a different intervention, business as usual, or no services.
The timing of the post-intervention outcome measure.
The number of students included in the analysis.
The mean score of students in the intervention group.
The mean score of students in the comparison group.
The WWC considers a finding to be statistically significant if the likelihood that the finding is due to chance alone, rather than a real difference, is less than five percent.
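As a simple illustration of that 5 percent criterion, the sketch below runs a two-sample test on simulated outcome z-scores. The study itself estimates impacts with regression models rather than a plain t-test, so this is only a toy example with made-up data.

```python
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(0)

# Hypothetical outcome z-scores for two groups (illustration only).
intervention = rng.normal(loc=0.08, scale=1.0, size=500)
comparison = rng.normal(loc=-0.09, scale=1.0, size=500)

# Welch's two-sample t-test; "significant" here means p < 0.05.
t_stat, p_value = ttest_ind(intervention, comparison, equal_var=False)
print(f"p = {p_value:.3f}, significant: {p_value < 0.05}")
```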
The WWC reviews studies for WWC products, Department of Education grant competitions, and IES performance measures.
The name and version of the document used to guide the review of the study.
The version of the WWC design standards used to guide the review of the study.
The result of the WWC assessment of the study. The rating is based on the strength of evidence of the effectiveness of the intervention. Studies are given a rating of Meets WWC Design Standards without Reservations, Meets WWC Design Standards with Reservations, or Does Not Meet WWC Design Standards.
A related publication that was reviewed alongside the main study of interest.
Study findings for this report.
Based on the direction, magnitude, and statistical significance of the findings within a domain, the WWC characterizes the findings from a study as one of the following: statistically significant positive effects, substantively important positive effects, indeterminate effects, substantively important negative effects, and statistically significant negative effects. For more, please see the WWC Handbook.
The WWC may review studies for multiple purposes, including different reports and re-reviews using updated standards.
Tier 1 (Strong) indicates strong evidence of effectiveness, Tier 2 (Moderate) indicates moderate evidence of effectiveness, and Tier 3 (Promising) indicates promising evidence of effectiveness, as defined in the non-regulatory guidance for ESSA and the regulations for ED discretionary grants (EDGAR Part 77).