August 20 was the deadline for this year's grant applications. I want to recognize the extraordinary efforts that education researchers expended to meet this deadline. Faced with the seemingly endless demands (and fears) engendered by the pandemic, the education science community came through with about the same number of proposals as in other years. Congratulations (or should it be commiserations?) and thanks!
This is the second in a series of blog posts I am writing about SEER. When IES started down the SEER path, I didn't fully recognize how foundationally important these standards would be. In 2002, under the direction of Russ Whitehurst, IES launched a revolution in the education sciences based on increasing rigor and a heavy reliance on RCTs for testing the impact of interventions. IES won that revolution. For me, SEER consolidates that victory and lays the foundation for the next wave of rigorous education science.
Among SEER principles, cost analysis arguably presents the greatest challenge to a field in which economists are few and far between and cost analysis training is rare. In contrast, RCTs, as revolutionary as they were for the education sciences, were familiar to the many psychologists and social scientists involved with education research.
IES was already increasing its emphasis on cost before I assumed office in May 2017. Under my direction, the RFAs issued that spring ratcheted up the emphasis on cost analysis—and in the following year made it a requirement for many more of our grants. The logic behind this push is simple: What use is it to tell educators which interventions are effective without also informing them about the resources required to implement them?
The challenge to the field can be seen in a couple of simple numbers: The first year we expanded the requirements, 25% of NCER and 7% of NCSER grant applications that were required to include cost and/or cost-effectiveness analysis were deemed non-responsive and, therefore, not reviewed.
Here's my version of a quote attributed to Samuel Johnson: "Nothing concentrates the mind as much as a hanging." Well, after so many proposals were rejected (and after IES staff repeatedly stated that our commitment to cost analysis was for real), the field responded—last year only 7% of NCER and 2% of NCSER applications that were required to include such analyses were deemed non-responsive because of the failure to deal with costs.
The increased responsiveness is satisfying, but we recognize our continuing responsibility for providing technical assistance to support high-quality cost analysis. Below I highlight a few of the projects we stood up to help researchers prepare this year's applications; these projects will continue to support the field in executing cost analyses over the next few years.
Together, the projects combine person-to-person training and support with a growing body of training material and guides that can be accessed on demand. However, this blog post is not just about what we did to support this year's competitions; below is a specific ask about what we need from you to plan for next year. But first, I want to highlight three projects that IES recently funded specifically to help develop the capacity of education researchers to conduct cost analysis. I also provide a brief description of a panel on cost analysis standards that AIR is running, whose results we look forward to.
Cost Analysis: A Starter Kit is designed for grant applicants who are new to cost analysis. The kit helps applicants understand and design a cost analysis, setting the foundation for more complex economic analyses. It is accompanied by an Excel spreadsheet to help structure the actual cost analysis work. The kit has been downloaded close to 500 times since March and the spreadsheet around 200 times.
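For readers new to this kind of work, the core bookkeeping behind a cost analysis is the "ingredients method": list every resource an intervention consumes, price each one, and sum. Here is a minimal sketch in Python; the ingredient names, quantities, and prices below are invented for illustration and are not drawn from the Starter Kit or its spreadsheet.

```python
# Minimal illustrative "ingredients method" cost tally.
# All ingredients, quantities, and unit prices are hypothetical examples,
# not figures from the IES Starter Kit.

def total_cost(ingredients):
    """Sum the cost of each ingredient: quantity x unit price."""
    return sum(qty * unit_price for _, qty, unit_price in ingredients)

# (ingredient, quantity, unit price in dollars)
ingredients = [
    ("teacher time (hours)",        120, 40.0),   # salary + benefits per hour
    ("coach time (hours)",           30, 55.0),
    ("curriculum materials (sets)",  25, 80.0),
    ("training facility (days)",      2, 300.0),
]

cost = total_cost(ingredients)
per_student = cost / 100  # assume the program serves 100 students
print(f"Total cost: ${cost:,.2f}; per student: ${per_student:,.2f}")
# → Total cost: $9,050.00; per student: $90.50
```

The value of the spreadsheet structure is that it forces this full inventory of resources into the open, rather than letting "free" inputs like donated staff time go uncounted.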
The Cost Analysis in Practice (CAP) Project, led by Fiona Hollands (Teachers College, Columbia University) and Jaunelle Pratt-Williams (SRI), provides free, on-demand tools, guidance, and technical assistance to researchers and practitioners who are planning or conducting cost analysis for IES-funded work (non-IES researchers can also tap these resources, but clearly the emphasis is on IES work). As of now, it has a highly accessible guidebook called Cost Analysis Standards & Guidelines, an informative infographic, and the first of many planned videos designed to ease entry into the world of cost analysis. Since June, the CAP Project has provided one-on-one Help Desk support to almost 40 IES grant applicants and has had well over one thousand unique visitors to its website.
The IES Methods Training in Cost-Effectiveness and Economic Evaluation is headed by Brooks Bowden (University of Pennsylvania) in partnership with Henry Levin (Teachers College, Columbia University), Clive Belfield (Queens College, CUNY), and Robert Shand (American University). This project will provide training to traditional education researchers and to state/local analysts. Each summer, the program will serve cohorts of 30 researchers in a week-long session and 30 analysts in a three-day session. The program will also develop online courses and training material. In time for this year's proposal deadline, Brooks and her team produced three briefs on how to include cost work in efficacy/effectiveness trials, development pilots, and measurement projects. These will be supplemented with additional training briefs over the course of the grant.
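The cost-effectiveness analysis at the heart of this training ultimately comes down to a simple ratio: the incremental cost of an intervention divided by its incremental effect. A minimal sketch follows; all numbers are hypothetical, not results from any IES-funded study.

```python
# Illustrative cost-effectiveness ratio: incremental cost per unit of
# incremental effect. All figures are hypothetical examples.

def ce_ratio(cost_treatment, cost_control, effect_treatment, effect_control):
    """Dollars per additional unit of effect (e.g., per SD of achievement)."""
    delta_cost = cost_treatment - cost_control
    delta_effect = effect_treatment - effect_control
    if delta_effect == 0:
        raise ValueError("No incremental effect; the ratio is undefined.")
    return delta_cost / delta_effect

# Hypothetical per-student costs (dollars) and effects (standard deviations)
ratio = ce_ratio(cost_treatment=450.0, cost_control=300.0,
                 effect_treatment=0.25, effect_control=0.10)
print(f"${ratio:,.2f} per 1 SD of achievement gain")
```

The ratio only means something when compared across alternatives serving the same goal, which is why the training pairs cost methods with evaluation design rather than treating the arithmetic in isolation.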
Finally, in July, AIR launched the Cost Analysis Standards Project Panel (the CASP Panel) to tap into leading experts in economic evaluation to recommend standards for guiding and judging cost work in education research. The standards will be reviewed and evaluated in November, and the final product should be ready by mid-December. This is an AIR project, not funded by IES. However, we are looking forward to their report to see the extent to which we can use their standards in next year's RFAs.
There will be overlap between the products these projects create—but redundancy is not the same thing as inefficiency. We are hoping that, as these projects develop, we will find the best approaches and products to support the growth of these essential skills. As part of this effort, we will be asking Fiona and Brooks to share with us lessons learned from their interactions with the field. We need to understand where and how our RFAs must be made clearer, and what holes remain to be filled in the support we are providing.
We have already learned that researchers need to integrate the collection of cost data with the collection of fidelity of implementation data, since there is substantial overlap. Another lesson is that researchers should not treat cost analysis as an "add-on" to be outsourced to a consultant. Rather, cost analysis is, and will continue to be, a core component of work that IES will fund. Researchers need to develop the skills to do this work.
IES will keep working to make our RFAs clearer; we will develop clearer standards for the evaluation of applications; and we will structure our technical assistance to support those standards.
Finally, here is an ask: E-mail me directly at Mark.Schneider@ed.gov (yes, I read my e-mail) with any comments you have on cost analysis. Include reflections on your experiences with cost work to date and the kinds of support that would help you do your work better, especially if you were part of this year's competitions and have feedback on any of the resources noted above.