IES Blog

Institute of Education Sciences

Every Transition Counts for Students in Foster Care

EDITOR’S NOTE: The Institute of Education Sciences funds and supports Researcher-Practitioner Partnerships (RPPs) that seek to address significant challenges in education. In this guest blog post, Elysia Clemens, of the University of Northern Colorado, and Judith Martinez, of the Colorado Department of Education, describe the work their IES-funded RPP is doing to better understand and improve outcomes for students in foster care.

May is Foster Care Awareness Month, and 2017 is an important year for raising awareness of the educational outcomes and educational stability of students in foster care.

With passage of the Every Student Succeeds Act of 2015 (ESSA), provisions are now in place for states to report on the academic performance and status of students in foster care. ESSA also requires collaboration between child welfare and education agencies to ensure the educational stability (PDF) of students while they are in foster care. This includes reducing the number of school changes to those that are in a student’s best interest and ensuring smooth transitions when changing schools is necessary.

To address the need for baseline data on how students in foster care are faring academically, the University of Northern Colorado, the Colorado Department of Education, and the Colorado Department of Human Services formed a researcher-practitioner partnership in 2014. This IES-funded partnership is currently researching the connection between child welfare placement changes and school changes and how that relates to the academic success of students.

Our goals are to raise awareness of gaps in academic achievement and educational attainment, inform the application of educational stability research findings to the implementation of ESSA’s foster care provisions, and develop and maintain high-quality data that can be easily accessed and used.

Achievement and educational attainment

Until recently, Colorado students in foster care were not identified in education data sets, and child welfare agencies did not always know how the youth in their care were faring in school. The Colorado partnership linked child welfare and education data from 2008 forward and found that across school years, grade levels, and subject areas, there is an academic achievement gap of at least 20 percentage points between students in foster care and their peers (see chart from the partnership website below).

The most critical subject area was mathematics, where the proportion of students scoring in the lowest proficiency category increased with each grade level. The data also revealed that fewer than one in three Colorado students who experience foster care graduate with their class.


Source: The Colorado Study of Students in Foster Care (http://www.unco.edu/cebs/foster-care-research/needs-assessment-data/academic-achievement/)


Like many states, Colorado has a long way to go toward closing academic achievement gaps for students in foster care, but with the availability of better data, there is a growing interest in the educational success of these students statewide.  

Educational Stability

Educational stability provisions, such as the ones in ESSA, are designed to reduce barriers to students’ progress, such as unnecessary school moves, gaps in enrollment, and delays in the transfer of records. To estimate how much implementation of these provisions might help improve educational stability for students in foster care, we used child welfare placement dates and school move dates to determine the proportion of school moves associated with changes in child welfare placements. A five-year analysis of school moves before, during, and after foster care placements revealed that the educational stability provisions in the ESSA would apply to two-thirds of the school moves Colorado students experienced.
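The core of this analysis is a date comparison. As a minimal sketch, not the partnership's actual methodology, the logic might look like the following; the field layout, student records, and 30-day matching window are all assumptions chosen purely for illustration:

```python
from datetime import date, timedelta

def is_placement_associated(move_date, placement_dates, window_days=30):
    """Flag a school move as placement-associated if it falls within
    `window_days` of any child welfare placement change date.
    (The window size is a hypothetical choice, not the study's.)"""
    window = timedelta(days=window_days)
    return any(abs(move_date - p) <= window for p in placement_dates)

def proportion_associated(moves, placements, window_days=30):
    """moves: {student_id: [school move dates]}
    placements: {student_id: [placement change dates]}
    Returns the share of all school moves that coincide with a
    placement change."""
    total = associated = 0
    for student, move_dates in moves.items():
        placement_dates = placements.get(student, [])
        for m in move_dates:
            total += 1
            if is_placement_associated(m, placement_dates, window_days):
                associated += 1
    return associated / total if total else 0.0

# Invented example records: one of three moves coincides with a placement change.
moves = {"s1": [date(2015, 9, 1), date(2016, 1, 15)],
         "s2": [date(2015, 11, 3)]}
placements = {"s1": [date(2015, 9, 5)],
              "s2": [date(2015, 6, 1)]}
print(proportion_associated(moves, placements))  # 1 of 3 moves
```

In practice, an analysis like the partnership's would run over linked administrative records rather than in-memory dictionaries, but the classification step is the same: each school move is tested against the student's placement change dates.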

To fully realize this policy opportunity, we began by generating heat maps of where foster student transfers occur. These geographical data are being used by the Colorado Department of Education and the Colorado Department of Human Services to prioritize relationship-building among specific local education agencies and child welfare agencies. Regional meetings are being held to strengthen local collaboration in implementing ESSA’s mandates regarding educational stability and transportation plans.
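The aggregation behind such a map can be sketched simply, assuming transfer records have been reduced to (sending district, receiving district) pairs. The district names below are invented for illustration:

```python
from collections import Counter

# Hypothetical transfer records: (sending district, receiving district).
transfers = [
    ("Denver", "Aurora"),
    ("Denver", "Aurora"),
    ("Aurora", "Denver"),
    ("Pueblo", "Colorado Springs"),
]

# Tally transfers by corridor so the busiest pairs of agencies
# can be prioritized for relationship-building and local agreements.
corridor_counts = Counter(transfers)
for (origin, destination), n in corridor_counts.most_common():
    print(f"{origin} -> {destination}: {n}")
```

The resulting counts, joined to district geography, are what a heat map visualizes: the corridors with the highest transfer volume are the natural starting points for inter-agency agreements.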

We also summarized the frequency of school moves by the type of child welfare placement change (e.g., entry into care, transitions among different types of out-of-home placements). We found that nearly one-third of Colorado students who enter foster care also move schools at the same time. This finding can help child welfare and education agencies anticipate the need for short-term transportation solutions and develop procedures for quickly convening stakeholders to determine if a school move is in a child’s best interest.

Accessible and Usable Data

A key communication strategy of the Colorado partnership is to make the descriptive data and research findings accessible and actionable on our project website. The data and findings are organized with different audiences in mind, so that advocates, practitioners, grant writers, and policy makers can use this information for their own distinct purposes. 

The website includes infographics that provide an overview of the data and recommendations on how to close gaps; dynamic visualizations that allow users to explore the data in-depth; and reports that inform conversations and decisions about how to best serve students in foster care.

In our final year of this IES RPP grant, we will continue to identify opportunities to apply our research to inform the development of quality transportation plans and local agreements. We will also study how the interplay between child welfare placement changes and school changes relates to academic progress and academic growth.

 

IES Grantees Receive SRCD Distinguished Scientific Contributions Award

Two Institute of Education Sciences (IES) grantees were recently recognized by the Society for Research in Child Development (SRCD) for their lifetime contributions to the knowledge and understanding of child development.

Roberta Golinkoff and Kathy Hirsh-Pasek received SRCD’s Distinguished Scientific Contributions to Child Development Award in April. It is the first time a team has received the award. Dr. Golinkoff is the Unidel H. Rodney Sharp Chair in the School of Education at the University of Delaware, and Dr. Hirsh-Pasek is the Stanley and Debra Lefkowitz Faculty Fellow in the Department of Psychology at Temple University and a Senior Fellow at the Brookings Institution.

The duo has been collaborating on research in a variety of areas of young children’s development and education for several decades, including pioneering work in language, spatial development, and learning through play. They have also dedicated themselves to the widespread dissemination of research findings to the public.

Dr. Golinkoff and Dr. Hirsh-Pasek have received a number of grants from IES, spanning three topic areas across the two research centers.  In 2011, their research team, led by Dr. Golinkoff, received an award to systematically develop a computerized language assessment for preschool children, which has resulted in a reliable and valid product, the Quick Interactive Language Screener (QUILS).  The research team recently published the QUILS, which is now available online.  Based on the success of the assessment for preschoolers, they received a grant from the National Center for Special Education Research in 2016 to expand the QUILS program to assess 2-year-old children, creating an instrument that can be used for early screening of children at risk for language disabilities.

In another area, their research team (led by David Dickinson) received a 2011 National Center for Education Research (NCER) grant to develop and pilot test an intervention designed to foster vocabulary development in preschool children from low-income homes through shared book reading and guided play. The same team, led by Hirsh-Pasek, received a subsequent award in 2015 to extend this work to create a toolkit of shared reading combined with teacher-led playful learning experiences, such as large group games, board games, digital games, songs, and socio-dramatic play.

In addition, Golinkoff led a research team on a 2014 NCER grant to explore how modeling and feedback, gesture, and spatial language affect children’s spatial skills measured through both concrete and digital delivery. 

Written by Amy Sussman (NCSER), Caroline Ebanks (NCER), and Erin Higgins (NCER)

Building Evidence: What Comes After an Efficacy Study?

Over the years, the Institute of Education Sciences (IES) has funded over 300 studies across its research programs that evaluate the efficacy of specific programs, policies, or practices. This work has contributed significantly to our understanding of the interventions that improve outcomes for students under tightly controlled or ideal conditions. But is this information enough to inform policymakers’ and practitioners’ decisions about whether to adopt an intervention? If not, what should come after an efficacy study?

In October 2016, IES convened a group of experts for a Technical Working Group (TWG) meeting to discuss next steps in building the evidence base after an initial efficacy study, and the specific challenges that are associated with this work. TWGs are meant to encourage stakeholders to discuss the state of research on a topic and/or to identify gaps in research.  

Part of this discussion focused on replication studies and the critical role they play in the evidence-building process. Replication studies are essential for verifying the results of a previous efficacy study and for determining whether interventions are effective when certain aspects of the original study design are altered (for example, testing an intervention with a different population of students). IES has supported replication research since its inception, but there was general consensus that more replications are needed.

TWG participants discussed some of the barriers that may be discouraging researchers from doing this work. One major obstacle is the idea that replication research is somehow less valuable than novel research—a bias that could be limiting the number of replication studies that are funded and published. A related concern is that the field of education lacks a clear framework for conceptualizing and conducting replication studies in ways that advance evidence about beneficial programs, policies and practices (see another recent IES blog post on the topic).

IES provides support for studies to examine the effectiveness of interventions that have prior evidence of efficacy and that are implemented as part of the routine and everyday practice occurring in schools without special support from researchers. However, IES has funded a relatively small number of these studies (14 across both Research Centers). TWG participants discussed possible reasons for this and pointed out several challenges related to replicating interventions under routine conditions in authentic education settings. For instance, certain school-level decisions can pose challenges for conducting high-quality effectiveness studies, such as restricting the length of time that interventions or professional development can be provided and choosing to offer the intervention to students in the comparison condition. These challenges can result in findings that are influenced more by contextual factors than by the intervention itself. TWG participants also noted that there is not much demand for this level of evidence, as the distinction between evidence of effectiveness and evidence of efficacy may not be recognized as important by decision-makers in schools and districts.

In light of these challenges, TWG participants offered suggestions for what IES could do to further support the advancement of evidence beyond an efficacy study. Some of these recommendations were more technical and focused on changes or clarifications to IES requirements and guidance for specific types of research grants. Other suggestions included:

  • Prioritizing and increasing funding for replication research;
  • Making it clear which IES-funded evaluations are replication studies on the IES website;
  • Encouraging communication and partnerships between researchers and education leaders to increase the appreciation and demand for evidence of effectiveness for important programs, practices, and policies; and
  • Supporting researchers in conducting effectiveness studies to better understand what works for whom and under what conditions, by offering incentives to conduct this work and encouraging continuous improvement.

TWG participants also recommended ways IES could leverage its training programs to promote the knowledge, skills, and habits that researchers need to build an evidence base. For example, IES could emphasize the importance of training in designing and implementing studies to develop and test interventions; create opportunities for postdoctoral fellows and early career researchers to conduct replications; and develop consortiums of institutions to train doctoral students to conduct efficacy, replication, and effectiveness research in ways that will build the evidence base on education interventions that improve student outcomes.

To read a full summary of this TWG discussion, visit the Technical Working Group website or go directly to the report (PDF).

Written by Katie Taylor, National Center for Special Education Research, and Emily Doolittle, National Center for Education Research

What Are the Payoffs to College Degrees, Credentials, and Credits?

The Center for Analysis of Postsecondary Education and Employment (CAPSEE) is an IES-funded Research and Development Center that seeks to advance knowledge regarding the link between postsecondary education and the labor market. CAPSEE was funded through a 2011 grant from the National Center for Education Research (NCER) and is in the process of completing its work. CAPSEE will hold a final conference to discuss its findings on April 6 & 7 in Washington, DC.

Recently, Tom Bailey, Director of the Community College Research Center at Teachers College, Columbia University, and the Principal Investigator for CAPSEE, answered questions from James Benson, the NCER Program Officer for the R&D center.

Can you describe some of the original goals of CAPSEE?

We were especially interested in the economic benefits of a college education for community college students, including those who complete awards (Associate’s degrees or certificates) and those who do not, as well as those who transfer to four-year colleges. We were also interested in differences in earnings by field of study. When we started CAPSEE in 2012 there were a lot of studies that used survey datasets to look, in general, at the returns to completing a Bachelor’s degree. The CAPSEE approach was to use large-scale statewide databases and follow college students over time, to look in detail at their earnings before, during, and after college.

In addition, CAPSEE researchers sought to examine two key policy issues. One was how financial aid and working while enrolled affect students’ performance in college and their labor market outcomes. The other was whether for-profit colleges help students get better jobs.

You have synthesized findings from analyses in six states. What are your main findings?

We found that, in general, Associate’s degrees have good returns in the labor market; they’re a good investment for the individual and for society. However, there is quite a bit of variation in returns by program. For students in Associate degree programs primarily designed to prepare them for transfer to a four-year college, if they don’t transfer, their degrees will not be worth very much. But when they complete vocational degrees, especially in health-related fields, the earnings gains are usually strong and persistent (and robust to how we estimated them). Also, we did a lot of research on certificates, credentials that many see as the best fit for students on the margin of going to college. We found benefits to students who completed certificates, again especially in fields that directly relate to an occupation or industry. And finally, we examined outcomes for students who enrolled and took courses without attaining a degree or certificate. We found that their after-college earnings increased in proportion to the number of credits they earned.

"The fundamental policy implication is that college is a good investment."

What do you see as the key policy implications of these findings?

The fundamental policy implication is that college is a good investment. This merits emphasis because there are repeated critiques of college in terms of how much it costs and how much debt students accumulate. That said, policymakers do need to think about the value of each postsecondary program. Even within the same institution, programs have very different outcomes. Yet on average, attending college for longer and attaining more credits has beneficial effects. Policymakers should see this evidence as supporting public and private investments in college.

What did you discover about the relationships between financial aid, college outcomes, and labor market outcomes?

In an era of tight public resources, the effectiveness of financial aid policy is a crucial issue. Financial aid does help students persist in college, but one way to promote greater effectiveness is through academic performance standards for students receiving federal financial aid. These have existed in the federal need-based aid programs for nearly 40 years, in the form of Satisfactory Academic Progress (SAP) requirements. These have not received much attention. Our research on SAP suggests such policies have heterogeneous effects on students in the short term: they increase the likelihood that some students will drop out, but appear to motivate higher grades for students who remain enrolled. After three years, however, the negative effects dominate. Though it has little benefit for students in the long term, SAP policy appears to increase the efficiency of aid expenditures because it discourages students who have lower-than-average course completion rates from persisting. But the policy also appears to exacerbate inequality in higher education by pushing out low-performing, low-income students faster than their equally low-performing, higher-income peers.

Many students work while in college.  Does this seem to help or hurt students in the long run?

Our research found that for Federal Work-Study (FWS) participants who would have worked even in the absence of the program, FWS reduces hours worked and improves academic outcomes but has little effect on post-college employment outcomes. For students who would not have worked, the effects are reversed: the program has little effect on graduation, but a positive effect on post-college employment.  Results are more positive for participants at public institutions, who tend to be lower income than participants at private institutions. Our findings suggest that better targeting to low-income and lower-scoring students could improve FWS outcomes. This is consistent with much of the CAPSEE research—you need more detail and specificity to really understand the relationship between education and employment and earnings.

What did you learn about credentials from for-profit institutions?

Our findings on students at for-profit colleges were quite pessimistic. Although enrollment in for-profit colleges grew significantly after 2000, the sector has been declining during the last two years, as evidence on inferior outcomes – particularly with regard to student debt – emerged. In general, our researchers found that for-profit students have worse labor market outcomes than comparable community college students although in some cases the difference is not statistically significant. Our evidence suggests that these colleges need to be monitored to ensure they are delivering a high-quality, efficient education.

You are holding the final CAPSEE conference in April. What do you hope people will get out of it?

At the conference, we will focus on several important and controversial policy questions related to higher education: 

  • Have changes in tuition and the labor market created conditions in which college is not worth it for some students, contributing to an unsupportable increase in student debt? 
  • Has higher education contributed to inequality rather than promoting economic mobility? 
  • Is continued public funding of college a worthwhile investment? 
  • Should public funding be used only for some programs of study?  
  • What are the arguments for and against making community college free?
  • Can changes in the operations and functioning of colleges change the return on investment from a college education for both the individual and society?
  • How important should information on earnings outcomes be for accreditation decisions and/or for eligibility of students to receive financial aid? 

At the conference, participants will have the opportunity to discuss and learn about these issues, drawing on five years of CAPSEE research as well as input from other experts.

A Growing Body of Research on Growth Mindset

Growth mindset is the belief that we can grow our intelligence by working hard at it, and the idea has attracted a lot of interest in the education world in recent years. There have been best-selling books and many magazine and newspaper articles written about the power of a growth mindset.

In a recent national survey*, nearly all (98 percent) of 600 K-12 teachers said they think that a growth mindset improves their own teaching and helps their students learn. However, only 20 percent reported confidence in their ability to help students develop a growth mindset. This disparity highlights a need for additional research and development of growth mindset-based interventions, as well as research on how best to optimize implementation and outcomes.

For more than a decade, the Institute of Education Sciences (IES) has supported research on growth mindset. This includes a set of basic research studies to test a theory about growth mindset, an R&D project to build a technology-based growth mindset intervention, and an efficacy study to evaluate the impact of that intervention.

With a 2002 award from the IES Cognition and Student Learning Program (as well as grants from private foundations), researchers at Columbia and Stanford Universities conducted basic research to test and refine growth mindset theory. This research provided a foundation for future R&D to develop school-based interventions focused on applying growth mindset to student learning.

With a 2010 award from the ED/IES SBIR program, small business firm Mindset Works developed a web-based intervention to support teachers and students in grades 5 through 9 in applying a growth mindset to teaching and learning. The Brainology intervention includes 20 animated interactive lessons and classroom activities for students on how the brain works and how it can become smarter and stronger through practice and learning. The intervention also teaches students specific neuroscience-based strategies to enhance attention, engagement, learning, and memory, and to manage negative emotions.

Brainology includes support materials for teachers to help them integrate the program and growth mindset concepts more generally into their daily activities at school. It is currently being used in hundreds of schools around the country.

Through a 2015 award from the Social and Behavioral Context for Academic Learning Program, researchers are now studying the efficacy of Brainology in improving students’ growth mindset and academic learning. In this four-year study, sixth- and seventh-grade science teachers are randomly assigned either to implement the program along with their school’s regular science curriculum or to continue with the regular science curriculum alone. Impacts of the growth mindset program on student mindsets and achievement (grades and test scores) are being measured in the early spring of the implementation year and in the fall of the following school year.
 
Follow us on Twitter and Facebook, or stay tuned to this blog, for more information about these and other research projects related to growth mindset.
 

Written by Emily Doolittle, NCER’s team lead for Social and Behavioral Research, and Ed Metz, ED/IES SBIR Program Manager

* - The survey indicates that growth mindset is of high interest to the general public and the education community. However, the Institute of Education Sciences was not involved in this survey and has not reviewed the methodology or results.