IES Blog

Institute of Education Sciences

Results from a Study of the IES Researcher-Practitioner Partnership Grants

Funding research that makes it into classrooms and impacts the lives of students is an important goal for IES. One way the Institute strives to ensure that the work it funds is useful to education practitioners is through Researcher-Practitioner Partnership (RPP) grants, which support researchers and education agencies working together to answer questions of high priority for the education agency. IES began funding these RPP projects in 2013 with the goals of supporting partnership development, conducting initial research on problems of high priority for the education agency, and increasing the agency’s capacity to use and understand research.

Because the RPP program is relatively new and little is known about the best ways to frame and support these partnerships, IES funded the National Center for Research in Policy and Practice (NCRPP), an IES-funded national research and development center on knowledge utilization, to conduct a supplemental study of how researcher and practitioner partners form and maintain their partnerships and the strategies they use to increase research use by practitioners at the education agency. NCRPP also sought to understand more about the IES RPP grant program itself, including its funding amount, time frame, and Institute leadership and monitoring.

NCRPP recently released its final report on findings from the RPP study. In two waves of data collection, NCRPP surveyed and interviewed key researchers and practitioners involved in 27 of the 28 RPPs funded by IES between 2013 and 2015. Findings from the report show that partnerships reported progress on goals related to developing findings that apply to other organizations and to improving students’ socio-emotional outcomes. Additionally, both researchers and practitioners reported valuing their participation in the partnership work. Finally, when compared to a national sample of school and district leaders, practitioners who participated in an IES-funded RPP were much more likely to name journal articles as useful pieces of research and to report that they integrate research processes into their own work.

A summary of the findings and the full report are available on the NCRPP website.

Becky McGill-Wilkinson, NCER Program Officer for the Knowledge Utilization Research and Development Centers

Building Evidence: Changes to the IES Goal Structure for FY 2019

The IES Goal Structure was created to support a continuum of education research that divides the research process into stages for both theoretical and practical purposes. Individually, the five goals – Exploration (Goal 1), Development and Innovation (Goal 2), Efficacy and Replication (Goal 3), Effectiveness (Goal 4), and Measurement (Goal 5) – were intended to help focus the work of researchers, while collectively they were intended to cover the range of activities needed to build evidence-based solutions to the most pressing education problems in our nation. Implicit in the goal structure is the idea that over time, researchers will identify possible strategies to improve student outcomes (Goal 1), develop and pilot-test interventions (Goal 2), and evaluate the effects of interventions with increasing rigor (Goals 3 and 4).

Over the years, IES has received many applications and funded a large number of projects under Goals 1-3.  In contrast, IES has received relatively few applications and awarded only a small number of grants under Goal 4. To find out why – and to see if there were steps IES could take to move more intervention studies through the evaluation pipeline – IES hosted a Technical Working Group (TWG) meeting in 2016 to hear views from experts on what should come after an efficacy study (see the relevant summary and blog post). IES also issued a request for public comment on this question in July 2017 (see summary).

The feedback we received was wide-ranging, but there was general agreement that IES could do more to encourage high-quality replications of interventions that show prior evidence of efficacy. One recommendation was to place more emphasis on understanding “what works for whom” under various conditions.  Another comment was that IES could provide support for a continuum of replication studies.  In particular, some commenters felt that the requirements in Goal 4 to use an independent evaluator and to carry out an evaluation under routine conditions may not be practical or feasible in all cases, and may discourage some researchers from going beyond Goal 3.   

In response to this feedback, IES revised its FY 2019 RFAs for Education Research Grants (84.305A) and Special Education Research Grants (84.324A) to make clear its interest in building more and better evidence on the efficacy and effectiveness of interventions. Among the major changes are the following:

  • Starting in FY 2019, Goal 3 will continue to support initial efficacy evaluations of interventions that have not been rigorously tested before, in addition to follow-up and retrospective studies.
  • Goal 4 will now support all replication studies of interventions that show prior evidence of efficacy, including but not limited to effectiveness studies.
  • The maximum amount of funding that may be requested under Goal 4 is higher to support more in-depth work on implementation and analysis of factors that moderate or mediate program effects.

The table below summarizes the major changes. We strongly encourage potential applicants to carefully read the RFAs (Education Research, 84.305A and Special Education Research, 84.324A) for more details and guidance, and to contact the relevant program officers with questions (contact information is in the RFA).

Applications are due August 23, 2018, by 4:30:00 pm Washington, DC time.

 

Goal 3

  • Name change: Formerly “Efficacy and Replication;” in FY 2019, “Efficacy and Follow-Up.”
  • Focus change: Will continue to support initial efficacy evaluations of interventions, in addition to follow-up and retrospective studies.
  • Requirements change: No new requirements.
  • Award amount change: No change.

Goal 4

  • Name change: Formerly “Effectiveness;” in FY 2019, “Replication: Efficacy and Effectiveness.”
  • Focus change: Will now support all replications evaluating the impact of an intervention, including Efficacy Replication studies and Re-analysis studies.
  • Requirements change: Now contains a requirement to describe plans to conduct analyses related to implementation and analysis of key moderators and/or mediators. (These were previously recommended.)
  • Award amount change: Maximum amounts are $3,600,000 for Efficacy Replication studies, $4,000,000 for Effectiveness studies, and $700,000 for Re-analysis studies.

By Thomas Brock (NCER Commissioner) and Joan McLaughlin (NCSER Commissioner)

Announcing ED/IES SBIR’s 2018 Awards: Funding the Next Generation of Education Technology

In recent years, thousands of schools around the country have used technologies developed through the Small Business Innovation Research program at the U.S. Department of Education’s Institute of Education Sciences (ED/IES SBIR). The program emphasizes a rapid research and development (R&D) process, with rigorous research informing iterative development and evaluating the promise of products for improving their intended outcomes. ED/IES SBIR also focuses on commercialization after development is complete so that products can reach schools and be sustained over time.


This month, IES announced 21 new awards for 2018. Of these, 15 are Phase I projects to develop and test a prototype, and six are Phase II projects to fully develop and evaluate an education technology product for students, teachers, or administrators to use in classrooms and schools. A playlist of videos from the Phase II projects is available below. 

Many of the new projects continue trends that have emerged across the portfolio in recent years, including creating learning games and dashboards that present data to inform learning and instruction in core subjects like reading and math. Several other awards are for projects that promote learning in new areas, such as computer science and Career and Technical Education.

Trend #1: Learning Games

Games are increasingly being used to engage students in learning by presenting content in new ways. For the eighth straight year, many new ED/IES SBIR awardees will be developing game-based learning products.

  • Phase II awardee Electric Funstuff and Phase I awardee Schell Games are developing virtual reality (VR) games to immerse students in history in 360-degree environments, and Phase I awardee Gigantic Mechanic is developing a role-playing game on civic discourse facilitated by tablet-based computers.
  • With Phase I funding, Fablevision and Sirius Thinking are creating interventions that employ game mechanics to improve reading.
  • Several project teams are embedding storylines within learning games, including Phase II awardee MidSchoolMath for algebra, and Phase I awardees Codespark for computer science and Immersed Games for ecosystems science.
  • The 3C Institute is creating a game-based assessment of early grade science learning.

Trend #2: Dashboards for Students, Teachers and Administrators

Modern technologies provide the opportunity to organize and present data in real-time to students, teachers, and administrators to inform learning and decision-making. Several new awards are developing data dashboards.

  • Phase II awardees are fully developing dashboards across several areas. LiveSchool will generate reports on students’ behavior across classes with a recommendation engine for administrators and teachers to address challenges, StoryWorld will provide teachers of English Learners insights into students’ language acquisition, and Simbulus and Myriad Sensors will present real-time information to enrich classroom discussions on math and science topics.
  • Phase I projects by Appendis and Graspable are creating adaptive learning technologies that include teacher dashboards presenting results on student performance to guide instruction. VidCode is creating a dashboard for teachers to monitor student progress in learning coding. Education Modified is developing a dashboard that gives special education teachers guidance on students’ IEPs (individualized education programs).

New Areas of Focus

Along with continuing to support projects in the areas above, several 2018 Phase I awards focus on areas new to ED/IES SBIR. VidCode, CodeSpark, and Zyante are focusing on computer science learning, and Core Learning is seeking to build capacity in Career and Technical Education (CTE). Language Learning Partners is developing an automated avatar tutor to support English Learners through conversation. And in special education, Attainment Company is developing an app to support student writing.

Stay tuned for updates on Twitter and Facebook as IES continues to support innovative forms of technology.

Written by Edward Metz, Program Manager, ED/IES SBIR

Principals as Instructional Leaders and Managers—Not an “Either-Or”

Recently, Morgaen Donaldson (University of Connecticut), Madeline Mavrogordato (Michigan State), Peter Youngs (University of Virginia), and Shaun Dougherty (University of Connecticut) presented early results from their IES-funded study of principal evaluation policies at the AERA national conference. We asked the team to share their preliminary findings.

What is the purpose of your study?

There is widespread agreement among researchers, policymakers, and practitioners that principals play a critical role in providing high-quality education to students. The role of principals may grow even larger under the 2015 Every Student Succeeds Act, which grants districts and states more flexibility regarding how to promote effective principal leadership. However, we know remarkably little about what school districts can do to improve principals’ leadership practices.

Given the importance of principals and the relative dearth of research on how to improve their leadership, we have been studying the extent to which principal evaluation systems focus on learning-centered leadership, one promising conception of leadership, in 22 districts in Connecticut, Michigan, and Tennessee. We are examining associations between the types of leadership emphasized in principal evaluation policies, the leadership practices that principals implement in their schools, and student performance.  

What is the major focus of principal evaluation policies?

To date, through document analysis we have found that district principal evaluation policies heavily emphasize instructional leadership, which focuses on teaching and learning issues, and de-emphasize managerial leadership, which concentrates on administrative tasks such as budgeting and overseeing school facilities. Similarly, in interviews and surveys superintendents and principals report that their evaluation systems focus on instructional leadership. For example, one Michigan superintendent said, “the emphasis on education right now [is] to take the principals away from being a manager to being an instructional leader.”

How are administrators actually interpreting the policies?

Further investigation revealed, however, a more nuanced relationship between written district policies and administrators’ interpretations of those policies. We found no relationship between written policies’ emphasis on instructional leadership and principals’ survey responses regarding whether their district focused on this type of leadership. Principals’ perceptions of their district’s focus on managerial leadership, however, were related to the emphasis on this type of leadership in the policies: when districts placed a higher emphasis on managerial leadership in their written evaluation policies, principals reported perceiving a stronger emphasis on this type of leadership.

Moreover, we found that holding constant the written policy’s emphasis on managerial leadership, there was an inverse relationship between the written policies’ emphasis on instructional leadership and the principals’ perceived policy emphasis on managerial leadership. Thus, the greater the written emphasis on instruction, the less principals perceived that their policy emphasized management.

In addition, interview data reveal that although superintendents state that they emphasize instructional leadership, they in fact weigh managerial leadership quite heavily. In superintendents’ framing, a principal’s competence in managerial leadership enabled him or her to practice instructional leadership: superintendents asserted that once principals addressed managerial concerns, they could progress to exercising instructional leadership. If principals were unable to address managerial issues, superintendents reported that they moved rapidly to intervene and potentially remove those principals.

What are the next steps for your IES research project?

These preliminary findings add to a growing body of evidence suggesting a complex interplay between managerial and instructional leadership. They also reflect a longstanding tension between the two dominant conceptions of principal leadership among practitioners.  We plan to further examine the multifaceted relationship between instructional and managerial leadership as we continue our work on this project. We are currently surveying teachers about the types of leadership principals exercise in their schools and conducting a second round of interviews with superintendents to understand their perspectives in greater depth. In the next stage of the project, we will examine associations between the types of leadership emphasized in principal evaluation policies, the leadership practices that principals implement in their schools, and student performance in grades 3-8. 

Katina Stapleton, NCER Program Officer, oversees the project described in this blog post and provided the framework for the team’s responses.

Computerized Preschool Language Assessment Extends to Toddlers

Identifying young children with language delays can improve later outcomes

Language is a core ability that children must master for success both in and out of the classroom. Extensive studies have shown that many tasks, including math, depend on linguistic skill, and that early language skills are predictive of school readiness and academic success. Being able to quickly identify children at early ages with language delays is crucial for targeting effective interventions.

Enter the QUILS.

In 2011, the National Center for Education Research (NCER) at IES funded a 4-year grant to Drs. Roberta Golinkoff (University of Delaware), Kathy Hirsh-Pasek (Temple University), and Jill de Villiers (Smith College) to develop a valid and reliable computer-based language assessment for preschoolers aged 3-5 years. The resulting product, the Quick Interactive Language Screener (QUILS), is a computerized tool that measures vocabulary, syntax, and language acquisition skills. The assessment measures both what a child knows about language and how a child learns, and it automatically provides results and reports to the teacher.

The preschool version of QUILS is now being used by early childhood educators, administrators, reading specialists, speech-language pathologists, and other early childhood professionals working with young children to identify language delays. The QUILS is also being utilized in other learning domains. For example, a new study relied on the QUILS, among other measures, to examine links between approaches to learning and science readiness in over 300 Head Start students aged 3 to 5 years.

QUILS is now being revised for use with toddlers. In 2016, the National Center for Special Education Research (NCSER) funded a 3-year study to revise the QUILS for use with children aged 24-36 months. The researchers have been testing the tool in both laboratory and natural (child care centers, homes, and Early Head Start programs) settings to determine which assessment items to use in the toddler version of QUILS. Ultimately, these researchers aim to develop a valid and reliable assessment to identify children with language delays so that appropriate interventions can begin early.

By Amanda M. Dettmer, AAAS Science & Technology Policy Fellow Sponsored by the American Psychological Association Executive Branch Science Fellowship