IES Blog

Institute of Education Sciences

IES Grantees Recognized by Council for Exceptional Children

Several IES-funded researchers were recently recognized for their contributions to the field of special education by the Council for Exceptional Children (CEC) Division of Research. They were honored at the CEC Convention and Expo in April.

Kathleen Lane is the 2017 recipient of CEC’s Kauffman-Hallahan-Pullen Distinguished Research Award, which recognizes individuals or research teams who have made outstanding scientific contributions in basic or applied research in special education over the course of their careers.

Dr. Lane (pictured, right), Professor in the Department of Special Education at the University of Kansas’ School of Education, received a 2006 National Center for Special Education Research (NCSER) grant through which she refined and pilot tested Project WRITE, a writing intervention focused on students in elementary school with emotional and behavioral disorders (EBD). She is currently the PI of a researcher-practitioner partnership project with Lawrence Public Schools in Kansas, examining the implementation of the Comprehensive, Integrated, Three-tiered (CI3T) Model of Prevention, which blends principles of Response-to-Intervention and Positive Behavior Interventions and Supports. In addition, she served as one of the co-chairs of the 2016 IES Principal Investigators’ Meeting and is currently serving as a primary mentor to another award recipient, Robin Parks Ennis (see below).

Erin Barton and Christopher Lemons are the recipients of the 2017 Distinguished Early Career Research Award, an honor that recognizes individuals with outstanding scientific contributions in special education research within the first 10 years after receiving a doctoral degree. They are both Assistant Professors of Special Education at Vanderbilt University’s Peabody College of Education and Human Development.

Dr. Barton (pictured, far left) is currently developing and pilot testing the Family Behavior Support App, an intervention aimed at supporting parents of young children with disabilities and challenging behaviors. Dr. Lemons (pictured, near left) served as Principal Investigator (with Cynthia Puranik) on two IES-funded projects – a NCSER-funded project focused on developing an intervention to improve reading instruction for children with Down Syndrome as well as a project funded by the National Center for Education Research that focused on developing an intervention to help kindergarten children learn to write. He was also a recipient of a Presidential Early Career Award for Scientists and Engineers (PECASE) in 2016.

Robin Parks Ennis (pictured, right) is the recipient of the 2017 Distinguished Early Career Publication Award, which recognizes an outstanding research publication by an individual within five years of receiving a doctoral degree.

Dr. Ennis, an Assistant Professor of Curriculum and Instruction at the University of Alabama at Birmingham, is recognized for her paper, “Classwide Teacher Implementation of Self-Regulated Strategy Development in Writing with Students with E/BD in a Residential Facility,” published in the Journal of Behavioral Education. She is currently the PI of a NCSER-funded Early Career Development and Mentoring grant in which she is developing a professional development model for teachers to implement a classroom-based, low-intensity strategy called Instructional Choice for students with and at risk for Emotional Disturbance.

Last year’s CEC Distinguished Early Career Research Award recipient and NCSER-funded researcher, Brian Boyd (pictured, left), gave an invited presentation at this year’s convention on Advancing Social-Communication and Play (ASAP). This is an intervention targeting the social-communication and play skills of preschoolers with autism. Dr. Boyd is an Associate Professor at the University of North Carolina’s School of Medicine.

Congratulations to all the CEC Division of Research Award Winners!

Written by Wendy Wei, Program Assistant, and Amy Sussman, Program Officer, NCSER


Every Transition Counts for Students in Foster Care

EDITOR’S NOTE: The Institute of Education Sciences funds and supports Researcher-Practitioner Partnerships (RPP) that seek to address significant challenges in education. In this guest blog post, Elysia Clemens (pictured left), of the University of Northern Colorado, and Judith Martinez (pictured right), of the Colorado Department of Education, describe the work that their IES-funded RPP is doing to better understand and improve outcomes for students in foster care.

May is Foster Care Awareness Month and 2017 is an important year for raising awareness of the educational outcomes and educational stability of students in foster care.

With passage of the Every Student Succeeds Act of 2015 (ESSA), provisions are now in place for states to report on the academic performance and status of students in foster care. ESSA also requires collaboration between child welfare and education agencies to ensure the educational stability (PDF) of students while they are in foster care. This includes reducing the number of school changes to those that are in a student’s best interest and ensuring smooth transitions when changing schools is necessary.

To address the need for baseline data on how students in foster care are faring academically, the University of Northern Colorado, the Colorado Department of Education, and the Colorado Department of Human Services formed a researcher-practitioner partnership in 2014. This IES-funded partnership is currently researching the connection between child welfare placement changes and school changes and how that relates to the academic success of students.

Our goals are to raise awareness of gaps in academic achievement and educational attainment, inform the application of educational stability research findings to the implementation of ESSA’s foster care provisions, and develop and maintain high-quality data that can be easily accessed and used.

Achievement and educational attainment

Until recently, Colorado students in foster care were not identified in education data sets, and child welfare agencies did not always know how the youth in their care were faring in school. The Colorado partnership linked child welfare and education data from 2008 forward and found that across school years, grade levels, and subject areas, there is an academic achievement gap of at least 20 percentage points between students in foster care and their peers (see chart from the partnership website below).

The most critical subject area was mathematics, where the proportion of students scoring in the lowest proficiency category increased with each grade level. The data also revealed that fewer than one in three Colorado students who experience foster care graduate with their class.


Source: The Colorado Study of Students in Foster Care (http://www.unco.edu/cebs/foster-care-research/needs-assessment-data/academic-achievement/)


Like many states, Colorado has a long way to go toward closing academic achievement gaps for students in foster care, but with the availability of better data, there is a growing interest in the educational success of these students statewide.  
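The record-linkage step behind these findings can be sketched roughly as follows. This is an illustrative example only: the column names (`student_id`, `in_foster_care`, `proficient`) and the toy data are assumptions for demonstration, not the partnership's actual schema or results.

```python
# Hypothetical sketch of linking child welfare records to education
# records on a shared student ID, then comparing proficiency rates.
# All names and values here are illustrative assumptions.
import pandas as pd

welfare = pd.DataFrame({
    "student_id": [1, 2, 3],
    "in_foster_care": [True, True, False],
})
assessment = pd.DataFrame({
    "student_id": [1, 2, 3, 4],
    "school_year": ["2015-16"] * 4,
    "proficient": [False, False, True, True],
})

# Link education records to child welfare records on the shared ID;
# students with no welfare record are treated as not in foster care
linked = assessment.merge(welfare, on="student_id", how="left")
linked["in_foster_care"] = linked["in_foster_care"].fillna(False)

# Compare proficiency rates between students in foster care and peers
rates = linked.groupby("in_foster_care")["proficient"].mean()
gap = rates.loc[False] - rates.loc[True]
```

In practice a linkage like this must also handle imperfect identifiers, multiple records per student, and privacy protections, which is a large part of why cross-agency partnerships such as this one are needed.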

Educational Stability

Educational stability provisions, such as the ones in ESSA, are designed to reduce barriers to students’ progress, such as unnecessary school moves, gaps in enrollment, and delays in the transfer of records. To estimate how much implementation of these provisions might help improve educational stability for students in foster care, we used child welfare placement dates and school move dates to determine the proportion of school moves associated with changes in child welfare placements. A five-year analysis of school moves before, during, and after foster care placements revealed that the educational stability provisions in the ESSA would apply to two-thirds of the school moves Colorado students experienced.
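The proportion calculation described above can be sketched as follows. The 30-day proximity window, the dates, and the helper function are illustrative assumptions, not the partnership's actual methodology.

```python
# Illustrative sketch (not the partnership's actual code) of classifying
# each school move by whether it falls near a child welfare placement change.
from datetime import date, timedelta

# Hypothetical placement-change and school-move dates for one student
placement_changes = [date(2015, 9, 1), date(2016, 2, 15)]
school_moves = [date(2015, 9, 8), date(2015, 11, 3), date(2016, 2, 20)]

WINDOW = timedelta(days=30)  # assumed proximity window


def near_placement_change(move, changes, window=WINDOW):
    """True if a school move occurred within `window` of any placement change."""
    return any(abs(move - c) <= window for c in changes)


# Share of school moves that coincide with a placement change
associated = [m for m in school_moves
              if near_placement_change(m, placement_changes)]
proportion = len(associated) / len(school_moves)
```

Aggregating this per-student classification across all students and years would yield the kind of statewide proportion reported above.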

To fully realize this policy opportunity, we began by generating heat maps showing where transfers of students in foster care occur (an example is pictured to the right). These geographical data are being used by the Colorado Department of Education and the Colorado Department of Human Services to prioritize relationship-building among specific local education agencies and child welfare agencies. Regional meetings are being held to strengthen local collaboration in implementing ESSA’s mandates regarding educational stability and transportation plans.

We also summarized the frequency of school moves by the type of child welfare placement change (e.g., entry into care, transitions among different types of out-of-home placements). We found that nearly one-third of Colorado students who enter foster care also move schools at the same time. This finding can help child welfare and education agencies anticipate the need for short-term transportation solutions and develop procedures for quickly convening stakeholders to determine if a school move is in a child’s best interest.

Accessible and Usable Data

A key communication strategy of the Colorado partnership is to make the descriptive data and research findings accessible and actionable on our project website. The data and findings are organized with different audiences in mind, so that advocates, practitioners, grant writers, and policy makers can use this information for their own distinct purposes. 

The website includes infographics that provide an overview of the data and recommendations on how to close gaps; dynamic visualizations that allow users to explore the data in-depth; and reports that inform conversations and decisions about how to best serve students in foster care.

In our final year of this IES RPP grant, we will continue to identify opportunities to apply our research to inform the development of quality transportation plans and local agreements. We will also study how the interplay between child welfare placement changes and school moves relates to academic progress and academic growth.

 

IES Grantees Receive SRCD Distinguished Scientific Contributions Award

Two Institute of Education Sciences (IES) grantees were recently recognized by the Society for Research in Child Development (SRCD) for their lifetime contributions to the knowledge and understanding of child development.

Roberta Golinkoff and Kathy Hirsh-Pasek received SRCD’s Distinguished Scientific Contributions to Child Development Award in April. It is the first time a team received the award. Dr. Golinkoff (pictured, right) is the Unidel H. Rodney Sharp Chair in the School of Education at the University of Delaware and Dr. Hirsh-Pasek (pictured, left) is the Stanley and Debra Lefkowitz Faculty Fellow in the Department of Psychology at Temple University and a Senior Fellow at the Brookings Institution.

The duo has been collaborating on research in a variety of areas of young children’s development and education for several decades, including pioneering work in language, spatial development, and learning through play. They have also dedicated themselves to the widespread dissemination of research findings to the public.

Dr. Golinkoff and Dr. Hirsh-Pasek have received a number of grants from IES, spanning three topic areas across the two research centers.  In 2011, their research team, led by Dr. Golinkoff, received an award to systematically develop a computerized language assessment for preschool children, which has resulted in a reliable and valid product, the Quick Interactive Language Screener (QUILS).  The research team recently published the QUILS, which is now available online.  Based on the success of the assessment for preschoolers, they received a grant from the National Center for Special Education Research in 2016 to expand the QUILS program to assess 2-year-old children, creating an instrument that can be used for early screening of children at risk for language disabilities.

In another area, their research team (led by David Dickinson) received a 2011 National Center for Education Research (NCER) grant to develop and pilot test an intervention designed to foster vocabulary development in preschool children from low-income homes through shared book reading and guided play. The same team, led by Hirsh-Pasek, received a subsequent award in 2015 to extend this work to create a toolkit of shared reading combined with teacher-led playful learning experiences, such as large group games, board games, digital games, songs, and socio-dramatic play.

In addition, Golinkoff led a research team on a 2014 NCER grant to explore how modeling and feedback, gesture, and spatial language affect children’s spatial skills measured through both concrete and digital delivery. 

Written by Amy Sussman (NCSER), Caroline Ebanks (NCER), and Erin Higgins (NCER)

Using the WWC as a Teaching Tool

EDITOR'S NOTE: The What Works Clearinghouse (WWC), a program of the Institute of Education Sciences, is a trusted source of scientific evidence on education programs, products, practices, and policies. The WWC also has many tools and resources for education researchers and students. In this guest blog post, Jessaca Spybrook (pictured, below right), Associate Professor of Evaluation, Measurement and Research at Western Michigan University, discusses how she uses WWC procedures and standards as a teaching tool.


By Jessaca Spybrook, Western Michigan University

Training the next generation of researchers so they are prepared to enter the world of education research is a critical part of my role as a faculty member in the Evaluation, Measurement, and Research program. I want to ensure that my students have important technical skills in a host of subject areas including, but not limited to, research design, statistics, and measurement. At the same time, I want to be sure they know how to apply the skills to design and analyze real-world studies. I often struggle to find resources for my classes that help me meet both goals.

One resource that has emerged as an important tool in meeting both goals is the What Works Clearinghouse website. I frequently integrate materials from the WWC into the graduate research design and statistics courses I teach.

For example, in a recent class I taught, Design of Experiments and Quasi-Experiments, I used the WWC Procedures and Standards Handbook Version 3.0 throughout (an image from the publication is pictured below). The Handbook met four important criteria as I was selecting resources for my class:

  1. Inclusion of important technical detail on design and analysis;
  2. Up-to-date and current thinking and “best practice” in design and analysis;
  3. Clear writing that is accessible for graduate students; and
  4. It was free (always a bonus when searching for class materials).

By no means did the Handbook replace classic and well-regarded textbooks in the class. Rather, it helped connect classic texts on design to both recent advances related to design, as well as real-life considerations and standards that designs are judged against.

At the end of my class, students may have been tired of hearing the question, “What is the highest potential rating for this study?” But I feel confident that using the WWC Handbook helped me not only prepare graduates with the technical know-how they need to design a rigorous experiment or quasi-experiment, but also raised awareness of current best practice and how to design a study that meets important standards set for the field.

 

Building Evidence: What Comes After an Efficacy Study?

Over the years, the Institute of Education Sciences (IES) has funded over 300 studies across its research programs that evaluate the efficacy of specific programs, policies, or practices. This work has contributed significantly to our understanding of the interventions that improve outcomes for students under tightly controlled or ideal conditions. But is this information enough to inform policymakers’ and practitioners’ decisions about whether to adopt an intervention? If not, what should come after an efficacy study?

In October 2016, IES convened a group of experts for a Technical Working Group (TWG) meeting to discuss next steps in building the evidence base after an initial efficacy study, and the specific challenges that are associated with this work. TWGs are meant to encourage stakeholders to discuss the state of research on a topic and/or to identify gaps in research.  

Part of this discussion focused on replication studies and the critical role they play in the evidence-building process. Replication studies are essential for verifying the results of a previous efficacy study and for determining whether interventions are effective when certain aspects of the original study design are altered (for example, testing an intervention with a different population of students). IES has supported replication research since its inception, but there was general consensus that more replications are needed.

TWG participants discussed some of the barriers that may be discouraging researchers from doing this work. One major obstacle is the idea that replication research is somehow less valuable than novel research—a bias that could be limiting the number of replication studies that are funded and published. A related concern is that the field of education lacks a clear framework for conceptualizing and conducting replication studies in ways that advance evidence about beneficial programs, policies and practices (see another recent IES blog post on the topic).

IES provides support for studies to examine the effectiveness of interventions that have prior evidence of efficacy and that are implemented as part of the routine and everyday practice occurring in schools without special support from researchers. However, IES has funded a relatively small number of these studies (14 across both Research Centers). TWG participants discussed possible reasons for this and pointed out several challenges related to replicating interventions under routine conditions in authentic education settings. For instance, certain school-level decisions can pose challenges for conducting high-quality effectiveness studies, such as restricting the length of time that interventions or professional development can be provided and choosing to offer the intervention to students in the comparison condition. These challenges can result in findings that are influenced more by contextual factors than by the intervention itself. TWG participants also noted that there is not much demand for this level of evidence, as the distinction between evidence of effectiveness and evidence of efficacy may not be recognized as important by decision-makers in schools and districts.

In light of these challenges, TWG participants offered suggestions for what IES could do to further support the advancement of evidence beyond an efficacy study. Some of these recommendations were more technical and focused on changes or clarifications to IES requirements and guidance for specific types of research grants. Other suggestions included:

  • Prioritizing and increasing funding for replication research;
  • Making it clear which IES-funded evaluations are replication studies on the IES website;
  • Encouraging communication and partnerships between researchers and education leaders to increase the appreciation and demand for evidence of effectiveness for important programs, practices, and policies; and
  • Supporting researchers in conducting effectiveness studies to better understand what works for whom and under what conditions, by offering incentives to conduct this work and encouraging continuous improvement.

TWG participants also recommended ways IES could leverage its training programs to promote the knowledge, skills, and habits that researchers need to build an evidence base. For example, IES could emphasize the importance of training in designing and implementing studies to develop and test interventions; create opportunities for postdoctoral fellows and early career researchers to conduct replications; and develop consortiums of institutions to train doctoral students to conduct efficacy, replication, and effectiveness research in ways that will build the evidence base on education interventions that improve student outcomes.

To read a full summary of this TWG discussion, visit the Technical Working Group website or click here to go directly to the report (PDF).

Written by Katie Taylor, National Center for Special Education Research, and Emily Doolittle, National Center for Education Research