We Need Your Feedback on Research Topics: Who Gets Voted Off the Island?

Mark Schneider, Director of IES   |   August 13, 2019

I know that many of you are busy responding to the current crop of RFAs. But I thought I would bring you up to date on some things in the works at IES that may affect future research support. IES needs feedback from the field on these issues relatively quickly if, as we intend, your advice is to shape how we frame next year's grants.

Short-term “off-cycle” competitions

We are considering whether to release three topic-specific RFAs in the middle of next winter.

  1. Using state longitudinal data systems to measure long-term outcomes.
    One of the principles in our Standards for Excellence in Education Research (SEER) is the need to measure long-term outcomes. Through IES, the federal government has made a large investment in state longitudinal data systems (SLDS). This NCES/NCER RFA would ask researchers to use SLDS to identify students who were in a “treatment” group and track their educational progress. The problems are as big as they are obvious. Our goal is to incentivize researchers to consider the possibilities of using these data systems to measure long-term outcomes of past interventions.
  2. Using NAEP process data.
    Since 2015, NAEP has captured (and timed) every keystroke that students make while taking assessments. Coupling these process data with the wealth of data captured by NAEP background questionnaires, IES believes we are on the cusp of a breakthrough in understanding testing and student test scores. This RFA would be a joint project between NAEP and NCSER. In addition to knowing how students with disabilities score on NAEP, we want to know the processes these students use to arrive at their answers, especially in comparison to their peers without disabilities. As an aside: These NAEP data are massive and likely require analytic skills beyond what many education researchers currently have. Indeed, we are hoping that the richness of these data—and the challenges they present—will attract data scientists to our research program.
  3. Systematic evaluation of widely used math and reading programs.
    Good science requires replication. This RFA would build on the Systematic Replication RFAs, where we asked for proposals focused on any of 17 IES-funded math and reading interventions that have evidence of efficacy. While relatively few students have used most of these 17 treatments, other reading and math “interventions” used by millions of students often have limited evidence of their effectiveness. We are considering an RFA that asks researchers to systematically test one of the 10 or so most widely used reading or math programs to help identify which of these programs work for whom. IES has supported two efficacy tests of widely used curricula: the National Randomized Control Trial of Everyday Mathematics and the National Randomized Controlled Trial Study of SRA/McGraw-Hill Open-Court Reading Program. These studies provide some useful information about how similar work could be structured.

To reiterate, we are considering these RFAs and would like your feedback. I think the problems facing research in each of them (especially with SLDS) are clear, but are they surmountable?

That’s my first “ask.” Now to the second.

Revisiting topic areas

The research program at IES is structured around a matrix in which NCER has 13 topic areas ("verticals") and NCSER has 12. These topics were traditionally crossed with 5 horizontal "goals." In this year's RFAs, we renamed and simplified the goal structure and streamlined the topic descriptions, hopefully allowing researchers more freedom to propose innovative ideas.

It’s now time to more fully consider the verticals.

Some of these topics (reading, writing, math) are core to the education research enterprise and ESRA requires NCER to support research in those areas. Others have been added to the RFAs to address issues raised at a specific moment in time. Times change. Should any of the verticals be sent off to a well-deserved retirement?

Perhaps of greater importance is the flip side: what are we missing? We can imagine tinkering on the edges, adding new “special topics,” or running off-cycle competitions (like the three I described above), but is it time for a more thorough rethink?

I met with the Friends of IES recently and asked the 15 or so participants to give me their feedback on these two questions. We are planning some meetings to get feedback from the field. We may also issue a formal Request for Information (RFI).

We are hoping that this blog will mobilize the research community to think about these challenges and reach out with ideas. You can send your comments to me:

Mark Schneider
Director of IES
Mark.Schneider@ed.gov