Title: Effective Early Childhood Education Programs: Meta-Analytic Lessons from High Quality Program Evaluations
Principal Investigator: Yoshikawa, Hirokazu
Awardee: President and Fellows of Harvard College, Graduate School of Education
Program: Early Learning Programs and Policies
Award Period: 2 years
Award Amount: $699,881
Co-Principal Investigators: Greg Duncan (University of California, Irvine), Katherine Magnuson (University of Wisconsin-Madison), and Holly S. Schindler (Harvard University)
Purpose: The goal of this project is to identify malleable early childhood education program characteristics associated with child achievement, cognition, antisocial behavior, and positive behavior. Researchers propose to use meta-analytic and regression-based methods to answer three key questions: (1) What structural characteristics of early education programs are associated with larger program impacts on achievement, cognition, antisocial behavior, and positive behavior? (2) Is the addition of parent-focused services to early childhood education programs associated with larger program impacts on children? If so, what types of parent-focused services are associated with the largest added benefits? (3) How are starting age, duration, and length of follow-up in early childhood program evaluations associated with the size of effects on children's achievement, cognition, antisocial behavior, and positive behavior? Across all three research questions, researchers will consider population characteristics that might moderate associations between program characteristics and effects on children: race/ethnicity and family income (particularly poverty status).
Project Activities: The project team will use meta-analytic techniques to review and analyze prior study results. Meta-analysis, a method of quantitative research synthesis, uses prior study results as the unit of observation. Researchers will draw on a comprehensive meta-analytic database of early childhood education evaluations that has been compiled by researchers at the Harvard Center on the Developing Child, the University of Wisconsin-Madison, the University of California, Irvine, and Johns Hopkins University. Researchers will assess reports of evaluations of center-based early childhood education programs. These programs vary considerably in design. Researchers will identify variations in program design in order to determine the extent to which program features shape children's school readiness outcomes.
Products: Products include published reports that identify features of early childhood education programs that maximize benefits to children's development.
Setting: This project is a meta-analysis of evaluation studies of early childhood education programs conducted in the United States and its territories between 1960 and 2007.
Population: Participants include children enrolled in early childhood education programs between the ages of 3 and 5 (inclusive) and their control-group counterparts.
Intervention: Not applicable.
Research Methods and Design: Researchers will use meta-analysis, a method of quantitative research synthesis that uses prior study results as the unit of observation. Researchers will draw on a comprehensive meta-analytic database of early childhood education evaluations that has been compiled by researchers at the Harvard Center on the Developing Child, the University of Wisconsin-Madison, the University of California-Irvine, and Johns Hopkins University. Several prior meta-analyses have examined preschool education evaluations. Researchers differentiate this study from earlier meta-analytic efforts in several respects. First, it has a more specific developmental focus: researchers will distinguish among cognitive outcomes based on their sensitivity to classroom instruction and will consider behavioral outcomes separately. Second, researchers screened more candidate studies than prior meta-analyses and included more recent research. Third, researchers will employ rigorous methodological inclusion criteria and develop specific inclusion criteria for different kinds of quasi- or non-experimental studies (e.g., fixed-effects; difference-in-difference; propensity score and other kinds of matching; regression discontinuity; and some other regression-based approaches). Fourth, the project team will not simply average effect sizes within studies, but will instead retain effect size information in multi-level models and weight for both the number of tests within studies and the precision of estimates. Finally, researchers will use a broad set of robustness analyses. Although the three research questions have differing emphases, they share overlapping data and analytic models. In particular, researchers will work closely to ensure that analyses are coordinated and that findings for one question are incorporated into the analyses related to the other questions.
Key Measures: Key outcomes comprise early childhood program impacts on several domains of children's school readiness, including child cognition, achievement, antisocial behavior, and positive behavior. The outcome information is presented in the reports using a number of different statistics. To combine findings across studies in meta-analysis, estimates are transformed into a common metric, an "effect size," which expresses treatment/control differences as a fraction of the standard deviation. Hedges' g, an effect size statistic that adjusts the standardized mean difference to account for bias when sample sizes are small, will be used in the analyses.
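The small-sample correction behind Hedges' g can be sketched as follows. This is a minimal illustration using the standard approximate correction factor; the function name and inputs are illustrative, not drawn from the project's materials:

```python
import math

def hedges_g(mean_t, mean_c, sd_t, sd_c, n_t, n_c):
    """Standardized mean difference with Hedges' small-sample correction.

    Cohen's d divides the treatment/control mean difference by the pooled
    standard deviation; the factor J shrinks d slightly to remove its
    upward bias when group sizes are small.
    """
    # Pooled standard deviation across treatment and control groups
    sd_pooled = math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2)
                          / (n_t + n_c - 2))
    d = (mean_t - mean_c) / sd_pooled
    # Common approximation to the small-sample bias correction
    j = 1 - 3 / (4 * (n_t + n_c) - 9)
    return j * d

# Example: a half-standard-deviation raw difference with 20 children per group
g = hedges_g(0.5, 0.0, 1.0, 1.0, 20, 20)  # slightly below d = 0.5
```

With only 20 children per group, g comes out a few percent smaller than the uncorrected d, which is exactly the bias adjustment the section describes.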
Data Analytic Strategy: Researchers will combine meta-analytic methods with Hierarchical Linear Modeling (HLM) multi-level regressions, which are used to account for the nested nature of the effect-size data (i.e., effect sizes are clustered within contrast).
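The precision weighting used in meta-analytic pooling can be illustrated with a minimal fixed-effect, inverse-variance sketch. The project's actual multi-level HLM models are more elaborate; the sampling-variance formula for g below is a standard textbook approximation, not taken from the project's analyses:

```python
def g_variance(g, n_t, n_c):
    """Approximate sampling variance of Hedges' g for two independent groups."""
    return (n_t + n_c) / (n_t * n_c) + g**2 / (2 * (n_t + n_c))

def pooled_effect(effects):
    """Inverse-variance weighted mean effect size.

    `effects` is a list of (g, n_treatment, n_control) tuples; more
    precise estimates (smaller variance) receive larger weights.
    """
    weights = [1.0 / g_variance(g, n_t, n_c) for g, n_t, n_c in effects]
    weighted_sum = sum(w * g for w, (g, _, _) in zip(weights, effects))
    return weighted_sum / sum(weights)

# Three hypothetical study contrasts: the large study pulls the mean most
studies = [(0.60, 15, 15), (0.30, 200, 200), (0.45, 60, 60)]
overall = pooled_effect(studies)  # lands close to the large study's 0.30
```

The multi-level models in the study extend this idea by also modeling the clustering of multiple effect sizes within a single treatment/control contrast, rather than treating each effect size as independent.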
Publications from this project:
Shager, H.M., Schindler, H.S., Magnuson, K.A., Duncan, G.J., Yoshikawa, H., and Hart, C.D. (2013). Can Research Design Explain Variation in Head Start Research Results? A Meta-Analysis of Cognitive and Achievement Outcomes. Educational Evaluation and Policy Analysis, 35(1): 76–95.