Inside IES Research

Notes from NCER & NCSER

IES Research Centers are Hiring

IES is seeking professionals in education-related fields to apply for open positions in our Research Centers: the National Center for Education Research (NCER) and the National Center for Special Education Research (NCSER). The Research Centers support research focused on practices and policies that improve education outcomes and access to education opportunities. Learn more about our work here: https://ies.ed.gov/ncer/ and here: https://ies.ed.gov/ncser/

If you are even potentially interested in this sort of position, we strongly encourage you to set up a profile in USAJobs (https://www.usajobs.gov/) and to upload your information now. As you build your profile, include all relevant research experience on your resume whether acquired in a paid or unpaid position. The positions will open in USAJobs on June 24, 2019 and will close as soon as 50 applications are received, or on July 8, 2019, whichever is earlier. Getting everything in can take longer than you might expect, so please apply as soon as the positions open in USAJobs (look for vacancy numbers IES-2019-0010 and IES-2019-2011).

IES Announces Forthcoming Funding Opportunity for the R&D of an “ROI Tool” to Inform Students’ Postsecondary Education and Career Decision Making

Overview

On or about February 15, 2019, the Small Business Innovation Research Program at the US Department of Education’s Institute of Education Sciences (ED/IES SBIR) anticipates releasing a Special Topic Solicitation #91990019R0016 in Postsecondary Education. The solicitation will be announced through an IES Newsflash and will be posted here. It will request Phase I proposals for awards of up to $200,000 for 8 months to develop a prototype of an “ROI tool.” The tool will be designed to measure the costs versus benefits (the return on investment) of different postsecondary education and training programs to help students make well-informed choices about options to pursue after they complete high school.

Applicants must be for-profit businesses with 500 or fewer employees that are U.S.-owned and operated. Applicants may partner with entities or organizations working on related initiatives in the field of postsecondary education, or may subcontract to non-profit researchers or individuals with specialized expertise as needed. The due date for proposal submissions will likely be on or about April 15, 2019, with awards in mid-June and projects beginning shortly thereafter. All Phase I awardees will be eligible to apply for a Phase II award in 2020 for $900,000 for full-scale development and research to test and validate the ROI tool.

Background

While many websites provide ways for students to explore colleges or careers and occupations of interest (e.g., College Scorecard and CareerOneStop), there is currently no tool that helps students understand the costs and benefits of individual postsecondary programs in an integrated, customizable, and user-friendly manner. An ROI tool would likely combine information on an individual program’s tuition and fees, time needed to complete, and expected earnings. Because these characteristics can vary significantly across programs and institutions, creating a single estimated measure of ROI would allow students to more easily compare postsecondary program options. If the tool helps students make better choices, it could lead to improved program completion rates, higher levels of employment and earnings, less education-related debt, and more satisfaction with their selected education and career paths.
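To make the cost-benefit idea concrete, here is a minimal sketch of how a tool might fold tuition and fees, time to completion, and expected earnings into a single comparable figure. The data structure, the earnings baseline, and the net-benefit formula below are all illustrative assumptions, not part of the forthcoming solicitation or of any existing IES tool.

```python
# Hypothetical sketch of a program-level ROI estimate.
# All inputs and the formula are illustrative assumptions only.
from dataclasses import dataclass


@dataclass
class Program:
    name: str
    tuition_and_fees_per_year: float   # direct cost per year of enrollment
    years_to_complete: float           # expected time to earn the credential
    expected_annual_earnings: float    # typical earnings after completion


def simple_roi(program: Program,
               baseline_annual_earnings: float = 30_000,
               horizon_years: int = 10) -> float:
    """Rough ROI: net earnings gain over a horizon relative to total cost
    (tuition plus earnings forgone while enrolled)."""
    total_tuition = program.tuition_and_fees_per_year * program.years_to_complete
    forgone_earnings = baseline_annual_earnings * program.years_to_complete
    total_cost = total_tuition + forgone_earnings
    earnings_gain = (program.expected_annual_earnings - baseline_annual_earnings) * (
        horizon_years - program.years_to_complete
    )
    return (earnings_gain - total_cost) / total_cost


# Compare two made-up program options side by side.
programs = [
    Program("Two-year technical certificate", 5_000, 2, 45_000),
    Program("Four-year bachelor's degree", 12_000, 4, 60_000),
]
for p in sorted(programs, key=simple_roi, reverse=True):
    print(f"{p.name}: estimated ROI = {simple_roi(p):.2f}")
```

A fully developed tool would replace these placeholder numbers with program-level data from sources such as College Scorecard or CareerOneStop, and would need to account for financial aid, completion rates, and debt, which this sketch ignores.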

The ED/IES SBIR Special Topic intends to fund up to five (5) Phase I projects to (a) develop and research a prototype of an ROI tool, and (b) conduct planning and concept testing for a fully developed ROI tool that provides a user-friendly experience for students. The prototype of the ROI tool developed in Phase I shall integrate with one or more existing technology systems, data sets, data standards, or resources (such as CareerOneStop or College Scorecard), and add new data elements provided by an end-user.  After a successful Phase I project, it is anticipated that small businesses that win Phase II awards will complete the full-scale development of the ROI tool that was started in Phase I, including developing an interface to improve the experience of students using the ROI tool.

Because data for ROI at the program level may only be available from some states, regions, or sets of institutions at this time, it is expected that the scope of the ROI tool developed in Phases I and II would be limited and would not attempt to calculate ROI for every program and institution in the country. Applicants must propose a project scope that appropriately reflects the datasets to be integrated within the new ROI tool and the amount of funding and time allotted for development and research under the Phase I and Phase II SBIR awards. Small businesses interested in this solicitation must have expertise with related efforts in the field to enhance student choice by linking education and workforce information.

Potential applicants may submit questions to ED’s Contracting Specialist at Kevin.Wade@ed.gov. All questions and responses will be posted publicly, as Amendments to the Solicitation, on the same website where the solicitation is posted.

 

New Reports and Resources Around ELs and STEM

In recent months, several federal reports and resources related to English learners (ELs) and science, technology, engineering, and mathematics (STEM) education have been released.

First, the Office of English Language Acquisition (OELA) released its third “data story” about ELs in US schools. This story, which builds on two previously released stories about the characteristics and educational experiences of ELs, focuses specifically on ELs’ NAEP performance and high school graduation rates. Through interactive infographics (many of which are built on data from the National Center for Education Statistics), the story shows that higher percentages of ELs are proficient in math than in reading, but that nearly half of all states experienced declines in the number of ELs who scored proficient in math between 2009 and 2017. The story also shows that graduation rates for ELs improved by 10 percentage points between 2010-11 and 2015-16 (from 57 percent to 67 percent), but still fall well below the rates for non-ELs (84 percent). While interesting and informative, the data story also underscores the necessity of research and development to produce better resources and information to support EL learning.

In that vein, the National Academies of Sciences, Engineering, and Medicine released English Learners in STEM Subjects: Transforming Classrooms, Schools, and Lives. This report examines what we know about ELs’ learning, teaching, and assessment in STEM subjects and provides guidance on how to improve STEM learning outcomes for these students. It reflects the consensus of a committee of EL experts that was chaired by NCER and NCSER grantee Dr. David Francis and included past grantees Dr. Okhee Lee and Dr. Mary Schleppegrell alongside a dozen other experts in EL education, STEM education, and teaching. One of the report’s central conclusions is that ELs develop proficiency in both STEM subjects and language when their classroom teachers provide them with opportunities for meaningful interaction and actively support both content and language learning. Given that many STEM teachers do not receive preparation to teach in this way, the report provides several recommendations to improve pre-service and in-service training. It also includes recommendations for how developers and publishers might produce better instructional materials and assessments to help both teachers and EL students. 

Efforts of both types – instructional preparation and development of new materials – may be further supported by two new toolkits released by the Office of Educational Technology. The toolkits are designed for educators and developers, and each is organized around five specific guiding principles to help the targeted group approach education technology with ELs’ unique needs in mind. The principles for developers emphasize the importance of thinking ahead about EL needs for those who wish to make products for this population. Meanwhile, the educator principles center on issues of awareness, and encourage teachers to learn more about the features, platforms, and resources that are available for ELs in the world of education technology. The principles also complement one another – for example, developers are encouraged to offer instruction-focused professional development, and educators are encouraged to seek out the same.

Brought together, these resources provide a snapshot of ELs’ mathematics achievement, a summary of research evidence about learning and instruction for ELs in STEM, and a set of principles to guide instruction and development efforts in the technology space moving forward. They also make a clear case for continued investment in R&D efforts to support STEM learning for both EL students and their teachers. Since 2010, the National Center for Education Research has invested nearly $20 million across 13 research and researcher-practitioner partnership grants that have focused on STEM learning and ELs. Several such grants are coming to a close in the 2019 fiscal year; watch this space for future blog posts about the products and findings from these projects.

 

Companion Guidelines on Replication and Reproducibility in Education Research

Just over five years ago, the Institute of Education Sciences (IES) and the National Science Foundation (NSF) released the Common Guidelines for Education Research and Development. The Guidelines laid out the expected purposes, justifications, and contributions of various types of research aimed at improving our knowledge of interventions and strategies for improving teaching and learning. Since 2013, there has been increased attention to replication and reproducibility studies and their role in building the evidence base. In response to this interest and the importance of this work, the two organizations jointly issued the new Companion Guidelines on Replication and Reproducibility in Education Research to supplement the Common Guidelines for Education Research and Development. The companion document provides guidance on the steps researchers can take to promote corroboration, ensure the integrity of research, and extend the evidence base.

The Companion Guidelines identify principles to help education stakeholders design and report reproducibility and replication studies. These principles are consistent with and draw from guidelines provided by scientific and professional organizations and advisory committees, and they have emerged in consultation with the field (e.g., Dettmer, Taylor, and Chhin, 2017; Subcommittee on Replicability in Science, 2015). The principles address three main areas – (1) replication and reproducibility at the proposal stage, (2) promoting transparency and openness in designing studies, and (3) considerations in the reporting of results.

Although the importance of reproducibility and replication studies for advancing scientific knowledge has been widely acknowledged, there are several challenges for researchers in our field, including actual or perceived disincentives (e.g., publication bias; reputation and career advancement norms; emphases on novel, potentially transformative lines of inquiry), implementation difficulties (especially for direct replications), and complexities of interpreting results (e.g., lack of consensus on what it means to “replicate” findings, low statistical power for replications). Grant funding agencies such as IES and NSF as well as education researchers have a role to play in addressing these challenges, promoting reproducibility and replication studies, and ultimately moving the field forward.

Why focus on replication and reproducibility?

The original Common Guidelines document did not substantively address issues pertaining to replication and reproducibility of research.  Given the interest in and importance of this work, IES and NSF are providing additional clarity to the field in terms of common definitions and principles around replication and reproducibility.

Who is the audience for the Companion Guidelines on Replication and Reproducibility? 

The primary audience for this document is education researchers; however, education research funding agencies and reviewers of grant applications are additional audiences for this document.

How should this document be used by researchers intending to apply for grants to conduct a reproducibility or replication study?

This document is meant to highlight the importance of replication and reproducibility studies and to offer guidelines to education stakeholders for thinking about and promoting reproducibility and replication in education research. It does not supersede the guidance provided in the requests for applications provided by IES and NSF. 

What are the guiding principles for proposing replication and reproducibility studies?

The overarching principles at the grant proposal stage are as follows:

  1. Clarify how reproducibility or replication studies would build on prior studies and contribute to the knowledge base.
  2. Clearly specify any variations from prior studies and the rationale for such variations.
  3. Ensure objectivity (e.g., by conducting an independent investigation, or by putting safeguards in place if the original investigator(s) is involved).

In addition to these principles, the document also lays out principles for promoting transparency, open science, and reporting results.

Read the full Companion Guidelines here.

 

Building Evidence: Changes to the IES Goal Structure for FY 2019

The IES Goal Structure was created to support a continuum of education research that divides the research process into stages for both theoretical and practical purposes. Individually, the five goals – Exploration (Goal 1), Development and Innovation (Goal 2), Efficacy and Replication (Goal 3), Effectiveness (Goal 4), and Measurement (Goal 5) – were intended to help focus the work of researchers, while collectively they were intended to cover the range of activities needed to build evidence-based solutions to the most pressing education problems in our nation. Implicit in the goal structure is the idea that over time, researchers will identify possible strategies to improve student outcomes (Goal 1), develop and pilot-test interventions (Goal 2), and evaluate the effects of interventions with increasing rigor (Goals 3 and 4).

Over the years, IES has received many applications and funded a large number of projects under Goals 1-3.  In contrast, IES has received relatively few applications and awarded only a small number of grants under Goal 4. To find out why – and to see if there were steps IES could take to move more intervention studies through the evaluation pipeline – IES hosted a Technical Working Group (TWG) meeting in 2016 to hear views from experts on what should come after an efficacy study (see the relevant summary and blog post). IES also issued a request for public comment on this question in July 2017 (see summary).

The feedback we received was wide-ranging, but there was general agreement that IES could do more to encourage high-quality replications of interventions that show prior evidence of efficacy. One recommendation was to place more emphasis on understanding “what works for whom” under various conditions.  Another comment was that IES could provide support for a continuum of replication studies.  In particular, some commenters felt that the requirements in Goal 4 to use an independent evaluator and to carry out an evaluation under routine conditions may not be practical or feasible in all cases, and may discourage some researchers from going beyond Goal 3.   

In response to this feedback, IES revised its FY 2019 RFAs for Education Research Grants (84.305A) and Special Education Research Grants (84.324A) to make clear its interest in building more and better evidence on the efficacy and effectiveness of interventions. Among the major changes are the following:

  • Starting in FY 2019, Goal 3 will continue to support initial efficacy evaluations of interventions that have not been rigorously tested before, in addition to follow-up and retrospective studies.
  • Goal 4 will now support all replication studies of interventions that show prior evidence of efficacy, including but not limited to effectiveness studies.
  • The maximum amount of funding that may be requested under Goal 4 is higher to support more in-depth work on implementation and analysis of factors that moderate or mediate program effects.

The table below summarizes the major changes. We strongly encourage potential applicants to carefully read the RFAs (Education Research, 84.305A and Special Education Research, 84.324A) for more details and guidance, and to contact the relevant program officers with questions (contact information is in the RFA).

Applications are due August 23, 2018, by 4:30 p.m. Washington, DC time.

 

Goal 3

  • Name change: Formerly “Efficacy and Replication”; in FY 2019, “Efficacy and Follow-Up.”
  • Focus change: Will continue to support initial efficacy evaluations of interventions in addition to follow-up and retrospective studies.
  • Requirements change: No new requirements.
  • Award amount change: No change.

Goal 4

  • Name change: Formerly “Effectiveness”; in FY 2019, “Replication: Efficacy and Effectiveness.”
  • Focus change: Will now support all replications evaluating the impact of an intervention, including Efficacy Replication studies and Re-analysis studies.
  • Requirements change: Now requires plans to conduct analyses related to implementation and analysis of key moderators and/or mediators (these were previously recommended).
  • Award amount change: Efficacy Replication studies maximum: $3,600,000. Effectiveness studies maximum: $4,000,000. Re-analysis studies maximum: $700,000.

 

 

By Thomas Brock (NCER Commissioner) and Joan McLaughlin (NCSER Commissioner)