Inside IES Research

Notes from NCER & NCSER

Navigating the ESSER Funding Cliff: A Toolkit for Evidence-Based Financial Decisions

As the federal Elementary and Secondary School Emergency Relief (ESSER) funds approach their expiration date in September 2024, schools and districts across the nation are facing a budgeting season like no other. ESSER funding has played a pivotal role in supporting schools' recovery from the COVID-19 pandemic, but with the deadline looming, districts and charters must take stock of their investments and ensure that programs with a demonstrated positive impact on students continue in a post-ESSER world.

A team at the North Carolina Department of Public Instruction (NCDPI) has been examining COVID-19 learning recovery in the state as part of their Using Longitudinal Data to Support State Education Policymaking project, which is part of the IES-funded RESTART network. In addition to the research, members of the NCDPI team developed a toolkit to help local leaders decide which programs to continue or discontinue as the federal funding for post-pandemic learning recovery expires. Through their work in the Office of Learning and Research (OLR) and the Division of Innovation at NCDPI, Rachel Wright-Junio, Jeni Corn, and Andrew Smith are responsible for managing statewide programs, conducting research on innovative teaching practices, and sharing insights using modern data analysis and visualization techniques. In this guest blog, they describe the need for the toolkit, how they developed it, how it is being used, and next steps.

The ESSER Funding Cliff Toolkit: A Data-Driven Approach

To help district and school leaders navigate the financial uncertainties following the end of ESSER funding, the OLR team created a Funding Cliff Toolkit as a starting point for data-driven decision-making based on unique local contexts. The toolkit provides a comprehensive set of resources, including a Return on Investment (ROI) Framework and Calculator that draws on detailed data on ESSER expenditures as well as on the impacts of various investments on student outcomes. By using this toolkit, schools and districts can assess what worked during the ESSER funding period, identify areas for improvement, and create sustainable financial plans that ensure effective programs continue regardless of funding.
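The calculator's internals are not described in this post, but the core arithmetic of an ROI comparison can be sketched. In the illustrative example below, all program names, costs, and effect sizes are hypothetical; the sketch simply ranks programs by outcome gain per $1,000 spent per student:

```python
# Illustrative sketch only: the NCDPI ROI Calculator's internals are not
# described in this post, and these programs, costs, and effects are made up.
programs = [
    # (name, total ESSER spending, students served, outcome gain in SD units)
    ("High-dosage tutoring", 1_200_000, 800, 0.15),
    ("Summer learning camp", 900_000, 1_500, 0.05),
    ("Device refresh", 600_000, 4_000, 0.01),
]

results = []
for name, spending, students, effect in programs:
    cost_per_student = spending / students
    # ROI proxy: standard deviations of outcome gain per $1,000 per student
    roi = effect / (cost_per_student / 1_000)
    results.append((name, cost_per_student, roi))

# Rank programs from highest to lowest return
for name, cps, roi in sorted(results, key=lambda r: r[2], reverse=True):
    print(f"{name}: ${cps:,.0f}/student, {roi:.3f} SD per $1,000")
```

Ranked this way, a district team can see at a glance which investments deliver the most improvement per dollar before deciding what survives the funding cliff.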

Knowing the far-reaching implications of this type of tool, the OLR team worked with federal programs and finance leaders across NCDPI. They also consulted leaders of North Carolina school districts and charter schools, including superintendents and chief financial officers, during the design process to ensure that the tool met their immediate needs. Finally, Drs. Brooks Bowden, associate professor at the Graduate School of Education at the University of Pennsylvania, and Nora Gordon, professor at the Georgetown University McCourt School of Public Policy, served as collaborators on the design of the ROI toolkit to ensure its validity.

Rolling Out the Toolkit: Engaging Leaders Across the State

In rolling out the toolkit, the OLR Team intentionally invited diverse stakeholders to the table, including district staff from finance, federal programs, academics, and cabinet-level leadership. It was crucial to bring together the financial, compliance, and programmatic pieces of the “ESSER puzzle” to allow them to work collaboratively to take stock of their ESSER-funded investments and explore academic progress post-pandemic. 

To ensure that the ESSER Funding Cliff Toolkit reached as many district leaders as possible, the OLR Team organized a comprehensive rollout plan, beginning with a series of introductory webinars that provided an overview of the toolkit and its components. The webinars were followed by nine in-person sessions held across the eight State Board of Education regions in North Carolina and attended by more than 400 leaders. Building on the webinars, the in-person sessions featured interactive presentations that allowed district teams to practice using the tool with simulated data as well as their own. By the end of a session, participants left with new, personalized data sets and tools to tackle the impending ESSER funding cliff. After each session, the team collected feedback that improved the toolkit and subsequent learning sessions. This process laid the groundwork for continued support and collaboration among district and school leaders.

What's Next: Expanding the Toolkit's Reach

Next steps for the OLR Team include expanding use of the toolkit and working with districts and charter schools to apply the ROI framework to evidence-based financial decisions across all funding sources. Districts are already using the toolkit beyond ESSER-funded programs. One district shared how it applied the ROI framework to its afterschool tutoring programs. Other districts plan to use the ROI framework and funding cliff toolkit to guide conversations with principals whose schools receive Title I funds, to determine potential tradeoffs in the upcoming budget year.

As North Carolina schools inch closer to the end of ESSER, the goal is to continue to encourage districts and charters to incorporate evidence-based decision-making into their budgeting and program planning processes. This ensures that districts and schools are prioritizing those programs and initiatives that deliver the most significant impact for students.

In addition to expanding support to North Carolina districts and schools, we hope that this supportive approach can be replicated by other SEAs across the nation. We are honored to have our toolkit featured in the National Comprehensive Center's upcoming Community of Practice (CoP), Strategic Planning for Continued Recovery (SPCR), and believe that cross-SEA collaboration in this CoP will improve the toolkit's usefulness.


Rachel Wright-Junio is the director of the Office of Learning and Research (OLR) at the North Carolina Department of Public Instruction (NCDPI); Jeni Corn is the director of research and evaluation in OLR; and Andrew Smith is the deputy state superintendent in the NCDPI Division of Innovation.

Contact Rachel Wright-Junio at Rachel.WrightJunio@dpi.nc.gov for the webinar recording or copies of the slide deck from the in-person sessions.

This guest blog was produced by Corinne Alfeld (Corinne.Alfeld@ed.gov), NCER Program Officer.

It All Adds Up: Why and How to Measure the Cost of Career & Technical Education

Cost analysis is a critical part of education research because it communicates what resources are needed for a particular program or intervention. Just telling education leaders how promising a program or practice can be does not tell the whole story; they need to know how much it will cost so that they can prioritize limited resources. Since 2015, cost analysis has been required for IES-funded Efficacy/Impact studies (and for Development and Innovation studies as of 2019) and is included in the IES Standards for Excellence in Education Research.

In this guest blog for CTE Month, two members of the CTE Research Network’s cost analysis working group, David Stern, an advisor to the network, and Eric Brunner, a co-PI of one of the research teams, discuss how costs associated with CTE programs may differ from those of standard education and how to measure those costs.

Why is cost analysis different in Career & Technical Education (CTE) research?

Because some types of career training require additional, non-standard components, CTE can cost much more than instruction in regular classrooms. For instance, CTE classes often use specialized equipment—hydraulic lifts in automotive mechanics, stoves and refrigerators in culinary arts, or medical equipment in health sciences—which costs significantly more than equipment in the standard classroom. Providing specialized equipment for student use can also constrain class sizes, resulting in a higher cost per pupil. High schools and community colleges may also build labs within existing buildings, or construct separate buildings, to house CTE programs with specialized equipment. These facility expenses need to be recognized in cost calculations.

CTE programs can also provide co-curricular experiences alongside classes in career-related subjects, such as work-based learning, career exploration activities, or integrated academic coursework. Schools are usually required to provide transportation for students to workplaces, to college campuses for field trips, or to regional career centers, which is another expense. Finally, the budget lines for recruiting and retaining teachers from higher-paying career areas and industries (such as nursing or business) may exceed those for average teacher salaries. All of these costs add up. To provide useful guidance for the field, CTE researchers should measure and report the cost of these features separately.

How is resource cost different from reported spending?

There are also some hidden costs to account for in research on CTE. For example, suppose a school does not have a work-based learning (WBL) coordinator, so a CTE teacher is released from one of their five daily periods to organize and oversee WBL, which may include field trips to companies, job shadowing experiences, internships, or a school-based enterprise. The expenditure report would show that 20% of the teacher's salary has been allocated for that purpose. In reality, however, the teacher may devote much more than 20% of their time to this work, in effect donating to the program by spending unpaid time or resources (such as driving their own vehicle to visit employer sites to coordinate learning plans) outside the workday. It is also possible that the teacher spends less than 20% of their time on it. To obtain an accurate estimate of this resource cost at a particular school, a researcher would have to measure how much time the teacher actually spends on WBL, for example through an interview or questionnaire.
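The gap between what the expenditure report shows and the true resource cost can be made concrete with a short sketch; the salary and hours below are hypothetical, not figures from an actual study:

```python
# Hypothetical figures illustrating budgeted vs. actual WBL resource cost.
salary = 60_000          # teacher's annual salary, in dollars
budgeted_share = 1 / 5   # one of five daily periods released for WBL
reported_cost = budgeted_share * salary  # what the expenditure report shows

contract_hours = 1_440   # assumed paid work hours in a school year
wbl_hours = 430          # hours actually spent on WBL, including unpaid
                         # evenings, as measured by interview or questionnaire
actual_cost = (wbl_hours / contract_hours) * salary

print(f"Reported cost: ${reported_cost:,.0f}")  # budget line: 20% of salary
print(f"Actual cost:   ${actual_cost:,.0f}")    # ~30% of salary-equivalent time
```

In this scenario the expenditure report understates the resource cost by nearly half, which is exactly the kind of discrepancy an interview or time-use questionnaire is designed to surface.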

Similarly, high school CTE programs are increasingly being developed as pathways that allow students to move smoothly to postsecondary education (for example, via dual enrollment programs) or directly to the labor market. Building and sustaining these pathways takes active collaboration among secondary and postsecondary educators and employers. However, the costs of these collaborations in terms of time and resources are unlikely to appear in a school expenditure report. Thus, an incremental cost analysis for CTE pathway programs must go beyond budgets and expenditure reports to interview or survey program administrators and staff about the resources or “ingredients” that programs require to operate. A recent example of a cost study of a CTE program can be found here.

Are there any resources for calculating CTE Costs?

In this blog, we have presented some examples of how the costs associated with CTE programs may differ from those of a standard education. To help CTE researchers conduct cost analysis, the CTE Research Network's working group on cost analysis has developed a guide to measuring Incremental Costs in Career and Technical Education, which explains how to account for the particular kinds of resources used in CTE.


The Career and Technical Education (CTE) Research Network has supported several cross-network working groups, composed of members of network research teams and advisors, working on issues of broad interest to CTE research. Another CTE Network working group developed an equity framework for CTE researchers, which was described in a blog for CTE Month in February 2023.

This blog was produced by Corinne Alfeld, NCER program officer for the CTE research topic and the CTE Research Network. Contact: Corinne.Alfeld@ed.gov.

Have a Cost Analysis to Plan or Execute? We Have a Module for That

This blog is part of a guest series by the Cost Analysis in Practice (CAP) project team.

Analyzing an intervention’s costs is one of IES’s nine SEER principles. Cost analysis is not just about the dollar value of an intervention; it provides key information to education decision-makers about the personnel, materials, facilities, and other inputs needed to implement an intervention or policy with fidelity. But planning and executing any kind of economic evaluation, such as a cost analysis or cost-effectiveness analysis, involves many steps.

The IES-funded Cost Analysis in Practice Project (CAP Project) has developed a series of five free, online modules on cost analysis. Each module includes a sequence of short videos (3–17 minutes each) and resources to support each of the four main stages of a cost analysis: study design, data collection, data analysis, and reporting (register here for the CAP Project online modules).

The modules are timely for anyone submitting a grant application to the IES FY 2024 grant programs that require a cost analysis. In addition, cost studies are included in the Education Innovation and Research (EIR) Mid-phase or Expansion grants. For your grant application, you’ll likely only need parts of Modules 1 and 2, Introduction to Cost Analysis and Designing a Cost Analysis. You can save the rest for when you receive a grant.

You should review the IES Request for Applications (RFA) to determine what kind of economic evaluation, if any, is required for your IES application. You can also review the CAP Project’s RFA requirements chart, which summarizes our take on what is required and what is recommended for each IES RFA. If your grant application does not require a cost analysis but you want to include one, we created a flowchart to help you decide which type of evaluation might make sense for your situation: see Module 1 Video 2b. We also provide a brief example of each kind of economic evaluation in Module 1 Video 3. 

If cost analysis is new to you, Module 1 Video 1 explains what “costs” really are. Module 1 Video 2a introduces the ingredients method and a demonstration of why it’s important to differentiate between economic costs and expenditures. Module 1 Video 4 walks you through the four stages of a cost analysis and points out when to use specific CAP Project resources such as our Checklist for Cost Analysis Plans, Timeline of Activities for Cost Analysis, and Cost Analysis Templates (the “CAPCATs”). If you prefer reading to watching videos, our Cost Analysis Standards & Guidelines cover this ground in more depth.

When you’re ready to plan your cost or cost-effectiveness analysis, head to Module 2. The introductory video (Module 2 Video 1) discusses a few critical decisions you need to make early on that will affect how much of your study budget should be dedicated to the economic evaluation—no one likes surprises there. Module 2 Videos 2 and 3 walk you through the design of an economic evaluation, illustrating each design feature using Reading Recovery as an example. Module 2 Video 4 presents a few scenarios to help you think about which costs you will estimate and how the costs of the intervention you plan to study compare to the costs of business as usual. Module 2 Video 5 reviews a timeline and key activities for each stage of your economic evaluation. The content in Modules 1 and 2 should help you develop a robust plan for an economic evaluation so that you’ll be all set to begin the study as soon as you are funded.

Modules 3-5 cover data collection, analysis, and reporting. You may want to skim these now, or at least watch the brief introductory videos for an overview of what’s in store for you and your cost analyst. These modules can help you execute your cost study.


Fiona Hollands is the Founder & Managing Director of EdResearcher. She studies the effectiveness and costs of educational programs with the goal of helping education practitioners and policymakers optimize the use of resources in education to promote better student outcomes.

Jaunelle Pratt-Williams is a Senior Research Scientist at NORC at the University of Chicago. She leads economic evaluations and mixed-methods policy research studies to improve the educational opportunities for historically underserved students.

This blog was produced by Allen Ruby (Allen.Ruby@ed.gov), Associate Commissioner, NCER.

Calculating the Costs of School Internet Access

This blog is part of a guest series by the Cost Analysis in Practice (CAP) project team to discuss practical details regarding cost studies.

Internet access has become an indispensable element of many education and social programs. However, researchers conducting cost analyses of education programs often don't capture internet costs because information on what school districts pay for internet service is not publicly available. EducationSuperHighway, a nonprofit organization, now collects information about internet bandwidth and monthly internet costs for each school district in the United States and publishes it on the Connect K-12 website. While Connect K-12 provides a median cost per Mbps in schools nationwide, its applicability in cost analyses is limited because the per-student cost varies vastly with school district size.

As customers, we often save money by buying groceries in bulk. One reason larger sizes offer better value is that the ingredient we consume is sometimes only a small part of the total cost of the product; the rest goes into the processes that make the product accessible, such as packaging, transportation, and rent.

The same is true of the internet. Making the internet available in schools requires facilities and equipment including, but not limited to, web servers, ethernet cables, and Wi-Fi routers. Large school districts, which are often in urban locations, usually pay much less per student than small districts, which are often in rural areas. When new equipment and facilities are required to deliver high-speed internet, the costs of these infrastructural adaptations must also be considered. Fiber-optic and satellite internet services have high infrastructure costs. Old-fashioned DSL internet uses existing phone lines and thus has less overhead, but it is much slower, often making it difficult to meet the Federal Communications Commission's current recommended bandwidth of 1 Mbps per student.

In short, there is no one-price-for-all when it comes to school internet access. To tackle this challenge, we used the data available on Connect K-12 for districts in each of the 50 U.S. states to calculate some useful metrics for cost analyses. First, we categorized the districts with internet access according to MDR's definition of small, medium, and large school districts (Small: 0-2,499 students; Medium: 2,500-9,999 students; Large: 10,000+ students). For each category, we calculated the following metrics, which are shown in Table 1:

  1. median cost per student per year
  2. median cost per student per hour
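As a sketch of how metrics like these can be derived (the district figures below are made up; the actual analysis used Connect K-12 data for all 50 states), districts are bucketed by the MDR size definition, and the annual and hourly costs are derived from the median monthly cost per student:

```python
import statistics

# Hypothetical (enrollment, monthly internet cost per student) pairs;
# the real analysis uses Connect K-12 data for districts in all 50 states.
districts = [(800, 2.60), (1_900, 2.30), (4_500, 0.90),
             (7_200, 0.80), (25_000, 0.55), (60_000, 0.45)]

def size_category(enrollment):
    # MDR definitions: Small 0-2,499; Medium 2,500-9,999; Large 10,000+
    if enrollment < 2_500:
        return "Small"
    return "Medium" if enrollment < 10_000 else "Large"

by_size = {}
for enrollment, monthly_cost in districts:
    by_size.setdefault(size_category(enrollment), []).append(monthly_cost)

results = {}
for size, monthly_costs in by_size.items():
    monthly_median = statistics.median(monthly_costs)
    annual = monthly_median * 12   # cost per student per year
    hourly = annual / 1_440        # 1,440 school hours (36 weeks) per year
    results[size] = (annual, hourly)
    print(f"{size}: ${annual:.2f}/student/year, ${hourly:.4f}/student/hour")
```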


Table 1: Internet Access Costs

District size (# of students) | Median Mbps per student | Median cost per Mbps per month | Median cost per student per month | Cost per student per year | Cost per student per hour
Small (0-2,499)      | 1.40 | $1.75 | $2.45 | $29.40 | $0.02
Medium (2,500-9,999) | 0.89 | $0.95 | $0.85 | $10.15 | $0.007
Large (10,000+)      | 0.83 | $0.61 | $0.50 | $6.03  | $0.004
National median      | 1.23 | $1.36 | $1.67 | $20.07 | $0.014

Note: Cost per student per hour is computed on the assumption that schools are open for 1,440 hours (36 weeks) per annum (e.g., for a small district, the cost per student per hour is $29.40/1,440 ≈ $0.02). See methods here.

Here’s an example of how you might determine an appropriate portion of the costs to attribute to a specific program or practice:  

Sunnyvale School is in a school district of 4,000 students. It offers an afterschool program in the library in which 25 students work online with remote math tutors. The program runs for 1.5 hours per day, 4 days per week, for 36 weeks. Using the medium-district rate of $0.007 per student per hour from Table 1, the internet costs would be:


1.5 hours x 4 days x 36 weeks x 25 students x $0.007 = $37.80.
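Generalized to any program, the same calculation can be sketched as:

```python
def program_internet_cost(hours_per_day, days_per_week, weeks, students,
                          cost_per_student_hour):
    """Internet cost attributable to a program, using a Table 1 hourly rate."""
    return (hours_per_day * days_per_week * weeks
            * students * cost_per_student_hour)

# Sunnyvale example: medium-district rate of $0.007 per student per hour
cost = program_internet_cost(1.5, 4, 36, 25, 0.007)
print(f"${cost:.2f}")  # $37.80
```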


The cost per student per hour might seem tiny. Take New York City Public Schools, for example: the cost per Mbps per month is only $0.13, yet the district pays $26,000 each month for internet. For a single education program or intervention, internet costs may represent only a small fraction of overall costs and may hardly seem worth estimating in comparison to personnel salaries and fringe benefits. However, a rigorous cost analysis must identify all the resources needed to implement a program.


Yuan Chang is a research assistant in the Department of Education Policy & Social Analysis at Teachers College, Columbia University and a researcher on the CAP Project.

Anna Kushner is a doctoral student in the Department of Education Policy & Social Analysis at Teachers College, Columbia University and a researcher for the CAP Project.

Using Cost Analysis to Inform Replicating or Scaling Education Interventions

A key challenge when conducting cost analysis as part of an efficacy study is producing information that can be useful for addressing questions related to replicability or scale. When the study is a follow-up conducted many years after implementation, the need to collect data retrospectively introduces additional complexities. As part of a recent follow-up efficacy study, Maya Escueta and Tyler Watts of Teachers College, Columbia University worked with the IES-funded Cost Analysis in Practice (CAP) project team to plan a cost analysis that would meet these challenges. This guest blog describes their process and lessons learned and provides resources for other researchers.

What was the intervention for which you estimated costs retrospectively?

We estimated the costs of a pre-kindergarten intervention, the Chicago School Readiness Project (CSRP), which was implemented in nine Head Start Centers in Chicago, Illinois for two cohorts of students in 2004-5 and 2005-6. CSRP was an early childhood intervention that targeted child self-regulation by attempting to overhaul teacher approaches to behavioral management. The intervention placed licensed mental health clinicians in classrooms, and these clinicians worked closely with teachers to reduce stress and improve the classroom climate. CSRP showed signs of initial efficacy on measures of preschool behavioral and cognitive outcomes, but more recent results from the follow-up study showed mainly null effects for the participants in late adolescence.

The IES research centers require a cost study for efficacy projects, so we faced the distinct challenge of conducting a cost analysis for an intervention nearly 20 years after it was implemented. Our goal was to render the cost estimates useful for education decision-makers today to help them consider whether to replicate or scale such an intervention in their own context.

What did you learn during this process?

When enumerating costs and considering how to implement an intervention in another context or at scale, we learned four distinct lessons.

1. Consider how best to scope the analysis to render the findings both credible and relevant given data limitations.

In our case, because we were conducting the analysis 20 years after the intervention was originally implemented, the limited availability of reliable data—a common problem in retrospective cost analysis—shaped what we could do. We had to consider what data we could reasonably obtain and what that would mean for the type of analysis we could credibly conduct. First, because (to our knowledge) no comprehensive cost analysis was conducted at the time of the intervention's original implementation, we could not accurately collect costs for the counterfactual condition. Second, we lacked reliable measures of key outcomes over time, such as grade retention or special education placement, that would be required for a complete cost-benefit analysis. This meant we were limited in both the costs and the effects we could reliably estimate. Given these data limitations, we could only credibly conduct a cost analysis, rather than a cost-effectiveness or cost-benefit analysis, which generally produce more useful evidence for decisions about replication or scale.

Because of this limitation, and to provide useful information for decision-makers who are considering implementing similar interventions in their current contexts, we decided to develop a likely present-day implementation scenario informed by the historical information we collected from the original implementation. We’ll expand on how we did this and the decisions we made in the following lessons.

2. Consider how to choose prices to improve comparability and to account for availability of ingredients at scale.

We used national average prices for all ingredients in this cost analysis to make the results more comparable to other cost analyses of similar interventions that also use national average prices. This involved some careful thought about how to price ingredients that were unique to the time or context of the original implementation, specific to the intervention, or in low supply. For example, when identifying prices for personnel, we either used current prices (national average salaries plus fringe benefits) for personnel with equivalent professional experience, or we inflation-adjusted the original consulting fees charged by personnel in highly specialized roles. This approach assumes that personnel who are qualified to serve in specialized roles are available on a wider scale, which may not always be the case.
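Inflation adjustment itself is a one-line calculation: scale the historical fee by the ratio of price indices. In this sketch the index values and fee are placeholders; an actual analysis would use published CPI figures for the implementation year and the analysis year:

```python
# Placeholder index values for illustration; substitute actual CPI figures
# for the original implementation year and the year of the analysis.
cpi_then = 195.3   # price index in the original year (hypothetical)
cpi_now = 304.7    # price index in the analysis year (hypothetical)

original_fee = 5_000  # specialized consulting fee in original-year dollars
adjusted_fee = original_fee * (cpi_now / cpi_then)
print(f"${adjusted_fee:,.0f} in current dollars")
```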

In the original implementation of CSRP, spaces were rented for teacher behavior management workshops, stress reduction workshops, and initial training of the mental health clinicians. For our cost analysis, we assumed that using available school facilities would be more likely and tenable when implementing CSRP at large scale. Instead of using rental prices, we valued the physical space needed to implement CSRP using the amortized construction costs of school facilities (for example, cafeteria/gym/classroom), which we obtained from the CAP Project's Cost of Facilities Calculator.
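The amortization behind this valuation spreads a facility's construction cost over its lifetime at an assumed interest rate, yielding an equivalent annual cost. A sketch with hypothetical figures (the Cost of Facilities Calculator packages this kind of calculation for actual school facility types):

```python
def annualized_cost(construction_cost, interest_rate, lifetime_years):
    """Equivalent annual cost of a facility amortized over its lifetime."""
    r, n = interest_rate, lifetime_years
    return construction_cost * r / (1 - (1 + r) ** -n)

# Hypothetical: a $1M classroom facility, 3% interest, 30-year lifetime
annual = annualized_cost(1_000_000, 0.03, 30)
print(f"${annual:,.0f} per year")
```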

3. Consider how to account for ingredients that may not be possible to scale.

Some resources are simply not available in similar quality at large scale. For example, the Principal Investigator (PI) for the original evaluation oversaw the implementation of the intervention, was highly invested in the fidelity of implementation, was willing to dedicate significant time, and created a culture supportive of the pre-K instructors to encourage buy-in for the intervention. In such cases, it is worth considering what the equivalent role would be in a non-research setting and how scalable this scenario would be. A potential proxy for the PI might be a school principal or leader, but how much time could this person reasonably dedicate, and how similar would their skill set be?

4. Consider how implementation might work in institutional contexts required for scale.

Institutional settings might necessarily change when taking an intervention to scale. In larger-scale settings, there may be other ways of implementing the intervention that change the quantities of personnel and other resources required. For example, a pre-K intervention such as CSRP at larger scale may need to be implemented in various types of pre-K sites, such as public schools or community-based centers in addition to Head Start centers. In such cases, the student/teacher ratio may vary across institutional contexts, which has implications for the per-student cost. If delivered with a higher student/teacher ratio than in the original implementation, the intervention may be less costly but also less impactful. This highlights the importance of the institutional setting in which implementation occurs and how it affects the use and costs of resources.

How can other researchers get assistance in conducting a cost analysis?

In conducting this analysis, we found the following CAP Project tools to be especially helpful (found on the CAP Resources page and the CAP Project homepage):

  • The Cost of Facilities Calculator: A tool that helps estimate the cost of physical spaces (facilities).
  • Cost Analysis Templates: Semi-automated Excel templates that support cost analysis calculations.
  • CAP Project Help Desk: Real-time guidance from a member of the CAP Project team. You will receive help in troubleshooting challenging issues with experts who can share specific resources. Submit a help desk request by visiting this page.

Maya Escueta is a Postdoctoral Associate in the Center for Child and Family Policy at Duke University where she researches the effects of poverty alleviation policies and parenting interventions on the early childhood home environment.

Tyler Watts is an Assistant Professor in the Department of Human Development at Teachers College, Columbia University. His research focuses on the connections between early childhood education and long-term outcomes.

For questions about the CSRP project, please contact the NCER program officer, Corinne.Alfeld@ed.gov. For questions about the CAP project, contact Allen.Ruby@ed.gov.