IES Blog

Institute of Education Sciences

It All Adds Up: Why and How to Measure the Cost of Career & Technical Education

Cost analysis is a critical part of education research because it communicates what resources are needed for a particular program or intervention. Just telling education leaders how promising a program or practice is does not tell the whole story; they also need to know how much it will cost so that they can prioritize limited resources. Since 2015, cost analysis has been required for IES-funded Efficacy/Impact studies (and for Development and Innovation studies as of 2019) and is included in the IES Standards for Excellence in Education Research.

In this guest blog for CTE Month, two members of the CTE Research Network’s cost analysis working group, David Stern, an advisor to the network, and Eric Brunner, a co-PI of one of the research teams, discuss how costs associated with CTE programs may differ from those of standard education and how to measure those costs.

Why is cost analysis different in Career & Technical Education (CTE) research?

Because some types of career training require additional, non-standard components, CTE can cost much more than instruction in regular classrooms. For instance, CTE classes often use specialized equipment—for example, hydraulic lifts in automotive mechanics, stoves and refrigerators in culinary arts, or medical equipment in health sciences—which costs significantly more than equipment in the standard classroom. Specialized equipment for student use can also constrain class sizes, resulting in a higher cost per pupil. High schools and community colleges may also build labs within existing buildings or construct separate buildings to house CTE programs with specialized equipment. These facility expenses also need to be recognized in cost calculations.

CTE programs can also provide co-curricular experiences for students alongside classes in career-related subjects, such as work-based learning, career exploration activities, or integrated academic coursework. Schools are usually required to provide transportation for students to workplaces, college campuses for field trips, or regional career centers, which is another expense. Finally, the budget lines for recruiting and retaining teachers from some higher paying career areas and industries (such as nursing or business) may exceed those for average teacher salaries. All of these costs add up. To provide useful guidance for the field, CTE researchers should measure and report the cost of these features separately.

How is resource cost different from reported spending?

There are also some hidden costs to account for in research on CTE. For example, suppose a school does not have a work-based learning (WBL) coordinator, so a CTE teacher is allowed one of their five periods each day to organize and oversee WBL, which may include field trips to companies, job shadowing experiences, internships, or a school-based enterprise. The expenditure report would show that 20% of the teacher's salary has been allocated for that purpose. In reality, however, a teacher may devote much more than 20% of their time to this. They may in fact be donating to the program by spending unpaid time or resources (such as driving their own vehicle to employer sites to coordinate learning plans) outside the workday. It is also possible that the teacher spends less than 20% of their time on this. To obtain an accurate estimate of this resource cost at a particular school, a researcher would have to measure how much time the teacher actually spends on WBL. This could be done as part of an interview or questionnaire.
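As a sketch of that measurement logic, the gap between the budgeted cost and the resource cost might be estimated like this. All salary and time figures below are illustrative assumptions, not data from the blog:

```python
# Sketch: compare the budgeted cost of a teacher's WBL coordination time
# with the resource cost implied by actual time use.

def wbl_resource_cost(salary, periods_allocated, periods_per_day,
                      actual_share, unpaid_hours, hourly_value):
    """Return (budgeted cost, estimated resource cost)."""
    # What the expenditure report shows: the allocated share of salary.
    budgeted = salary * periods_allocated / periods_per_day
    # Resource cost: the share of paid time actually spent on WBL,
    # plus the value of unpaid time donated outside the workday.
    resource = salary * actual_share + unpaid_hours * hourly_value
    return budgeted, resource

budgeted, resource = wbl_resource_cost(
    salary=60_000,        # hypothetical annual salary
    periods_allocated=1,  # 1 of 5 daily periods, per the expenditure report
    periods_per_day=5,
    actual_share=0.30,    # an interview finds 30% of time actually goes to WBL
    unpaid_hours=50,      # donated evenings/weekends over the year
    hourly_value=40,      # assumed value of the teacher's time per hour
)
print(budgeted, resource)  # 12000.0 20000.0
```

Under these assumptions, the expenditure report ($12,000) understates the true resource cost ($20,000) by 40%.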

Similarly, high school CTE programs are increasingly being developed as pathways that allow students to move smoothly to postsecondary education, such as via dual enrollment programs or directly to the labor market. Building and sustaining these pathways takes active collaboration between secondary and postsecondary educators and employers. However, the costs of these collaborations in terms of time and resources are unlikely to be found in a school expenditure report. Thus, an incremental cost analysis for CTE pathway programs must go beyond budgets and expenditure reports to interview or survey program administrators and staff about the resources or “ingredients” that programs require to operate. A recent example of a cost study of a CTE program can be found here.

Are there any resources for calculating CTE Costs?

In this blog, we have presented some examples of how the costs associated with CTE programs may differ from those of a standard education. To help CTE researchers conduct cost analysis, the CTE Research Network has developed a guide to measuring Incremental Costs in Career and Technical Education, which explains how to account for the particular kinds of resources used in CTE. The guide was developed by the working group on cost analysis supported by the CTE Research Network.


The Career and Technical Education (CTE) Research Network has supported several cross-network working groups composed of members of network research teams and advisors working on issues of broad interest to CTE research. Another CTE Network working group developed an equity framework for CTE researchers, which was described in a blog for CTE Month in February 2023.

This blog was produced by Corinne Alfeld, NCER program officer for the CTE research topic and the CTE Research Network. Contact: Corinne.Alfeld@ed.gov.

Have a Cost Analysis to Plan or Execute? We Have a Module for That

This blog is part of a guest series by the Cost Analysis in Practice (CAP) project team.

Analyzing an intervention’s costs is one of IES’s nine SEER principles. Cost analysis is not just about the dollar value of an intervention; it provides key information to education decision-makers about the personnel, materials, facilities, and other inputs needed to implement an intervention or policy with fidelity. But planning and executing any kind of economic evaluation, such as a cost analysis or cost-effectiveness analysis, involves many steps.

The IES-funded Cost Analysis in Practice Project (CAP Project) has developed a series of five free, online modules on cost analysis. Each module includes a sequence of short videos (3-17 minutes each) and resources to facilitate each of the four main stages of a cost analysis: study design, data collection, data analysis, and reporting (register here for the CAP Project online modules).

The modules are timely for anyone submitting a grant application to the IES FY 2024 grant programs that require a cost analysis. In addition, cost studies are included in the Education Innovation and Research (EIR) Mid-phase or Expansion grants. For your grant application, you’ll likely only need parts of Modules 1 and 2, Introduction to Cost Analysis and Designing a Cost Analysis. You can save the rest for when you receive a grant.

You should review the IES Request for Applications (RFA) to determine what kind of economic evaluation, if any, is required for your IES application. You can also review the CAP Project’s RFA requirements chart, which summarizes our take on what is required and what is recommended for each IES RFA. If your grant application does not require a cost analysis but you want to include one, we created a flowchart to help you decide which type of evaluation might make sense for your situation: see Module 1 Video 2b. We also provide a brief example of each kind of economic evaluation in Module 1 Video 3. 

If cost analysis is new to you, Module 1 Video 1 explains what “costs” really are. Module 1 Video 2a introduces the ingredients method and demonstrates why it’s important to differentiate between economic costs and expenditures. Module 1 Video 4 walks you through the four stages of a cost analysis and points out when to use specific CAP Project resources such as our Checklist for Cost Analysis Plans, Timeline of Activities for Cost Analysis, and Cost Analysis Templates (the “CAPCATs”). If you prefer reading to watching videos, our Cost Analysis Standards & Guidelines cover this ground in more depth.

When you’re ready to plan your cost or cost-effectiveness analysis, head to Module 2. The introductory video (Module 2 Video 1) discusses a few critical decisions you need to make early on that will affect how much of your study budget should be dedicated to the economic evaluation—no one likes surprises there. Module 2 Videos 2 and 3 walk you through the design of an economic evaluation, illustrating each design feature using Reading Recovery as an example. Module 2 Video 4 presents a few scenarios to help you think about which costs you will estimate and how the costs of the intervention you plan to study compare to the costs of business as usual. Module 2 Video 5 reviews a timeline and key activities for each stage of your economic evaluation. The content in Modules 1 and 2 should help you develop a robust plan for an economic evaluation so that you’ll be all set to begin the study as soon as you are funded.

Modules 3-5 cover data collection, analysis, and reporting. You may want to skim these now, or at least watch the brief introductory videos for an overview of what’s in store for you and your cost analyst. These modules can help you execute your cost study.


Fiona Hollands is the Founder & Managing Director of EdResearcher. She studies the effectiveness and costs of educational programs with the goal of helping education practitioners and policymakers optimize the use of resources in education to promote better student outcomes.

Jaunelle Pratt-Williams is a Senior Research Scientist at NORC at the University of Chicago. She leads economic evaluations and mixed-methods policy research studies to improve the educational opportunities for historically underserved students.

This blog was produced by Allen Ruby (Allen.Ruby@ed.gov), Associate Commissioner, NCER.

Calculating the Costs of School Internet Access

This blog is part of a guest series by the Cost Analysis in Practice (CAP) project team to discuss practical details regarding cost studies.

Internet access has become an indispensable element of many education and social programs. However, researchers conducting cost analyses of education programs often don’t capture these costs due to a lack of publicly available information on what school districts pay for internet service. EducationSuperHighway, a nonprofit organization, now collects information about the internet bandwidth and monthly internet costs for each school district in the United States. The information is published on the Connect K-12 website. While Connect K-12 provides a median cost per Mbps in schools nationwide, its applicability in cost analyses is limited because the per-student cost varies widely with school district size.

As customers, we often save money by buying groceries in bulk. One of the reasons that larger sizes offer better value is that the ingredient we consume is sometimes only a small part of the total cost of the whole product; the rest of the cost goes into the process that makes the product accessible, such as packaging, transportation, and rent.

The same is true of internet service. Making the internet available in schools requires facilities and equipment including, but not limited to, web servers, ethernet cables, and Wi-Fi routers. Large school districts, which are often in urban locations, usually pay much less per student than small districts, which are often in rural areas. Costs of infrastructural adaptations need to be considered when new equipment and facilities are required for high-speed internet delivery. Fiber-optic and satellite internet services have high infrastructural costs. Old-fashioned DSL internet uses existing phone lines and thus has less overhead cost, but it’s much slower, often making it difficult to meet the current Federal Communications Commission recommended bandwidth of 1 Mbps per student.

In short, there is no one price for all when it comes to school internet access. To tackle this challenge, we used the data available on Connect K-12 for districts in each of the 50 U.S. states to calculate some useful metrics for cost analyses. First, we categorized the districts with internet access according to MDR’s definition of small, medium, and large school districts (Small: 0-2,499 students; Medium: 2,500-9,999 students; Large: 10,000+ students). For each category, we calculated the following metrics, which are shown in Table 1:

  1. median cost per student per year
  2. median cost per student per hour
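As a sketch of how these metrics follow from the Connect K-12 figures, the monthly cost per student is the median Mbps per student times the median cost per Mbps, scaled up to a year and then down to school hours (the inputs below are the small-district medians reported in Table 1):

```python
# Sketch: deriving per-student internet cost metrics from per-Mbps data.

HOURS_OPEN_PER_YEAR = 1_440  # 36 weeks x 40 hours, per the note to Table 1

def per_student_metrics(mbps_per_student, cost_per_mbps_month):
    """Return (monthly, yearly, hourly) internet cost per student."""
    monthly = mbps_per_student * cost_per_mbps_month
    yearly = monthly * 12
    hourly = yearly / HOURS_OPEN_PER_YEAR
    return monthly, yearly, hourly

# Small-district medians from Table 1: 1.40 Mbps per student at $1.75 per Mbps
monthly, yearly, hourly = per_student_metrics(1.40, 1.75)
print(round(monthly, 2), round(yearly, 2), round(hourly, 2))  # 2.45 29.4 0.02
```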

 

Table 1: Internet Access Costs

| District size (# of students) | Median Mbps per student per month | Median cost per Mbps per month | Median cost per student per month | Cost per student per year | Cost per student per hour |
| --- | --- | --- | --- | --- | --- |
| Small (0-2,499) | 1.40 | $1.75 | $2.45 | $29.40 | $0.02 |
| Medium (2,500-9,999) | 0.89 | $0.95 | $0.85 | $10.15 | $0.007 |
| Large (10,000+) | 0.83 | $0.61 | $0.50 | $6.03 | $0.004 |
| National median | 1.23 | $1.36 | $1.67 | $20.07 | $0.014 |

 

Note: Cost per student per hour assumes that schools are open for 1,440 hours (36 weeks) per annum; e.g., for a small district the cost per student per hour is $29.40/1,440 = $0.02. See methods here.

 

Here’s an example of how you might determine an appropriate portion of the costs to attribute to a specific program or practice:  

Sunnyvale School is in a school district of 4,000 students. It offers an afterschool program in the library in which 25 students work online with remote math tutors. The program runs for 1.5 hours per day on 4 days per week for 36 weeks. Internet costs would be:

 

1.5 hours x 4 days x 36 weeks x 25 students x $0.007 = $37.80.
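In code, the same attribution calculation, using the medium-district rate from Table 1, looks like this:

```python
# Sketch: attributing district internet costs to one afterschool program
# using the medium-district rate from Table 1 ($0.007 per student per hour).

hours_per_day = 1.5
days_per_week = 4
weeks = 36
students = 25
rate_per_student_hour = 0.007  # medium district (2,500-9,999 students)

program_cost = (hours_per_day * days_per_week * weeks
                * students * rate_per_student_hour)
print(round(program_cost, 2))  # 37.8
```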

 

The cost per student per hour might seem tiny. Take New York City Public Schools, for example: the cost per Mbps per month is $0.13, and yet the district pays $26,000 each month for internet. For one education program or intervention, internet costs may represent only a small fraction of the overall costs and may hardly seem worth estimating in comparison to personnel salaries and fringe benefits. However, a rigorous cost analysis must identify all the resources needed to implement a program.


Yuan Chang is a research assistant in the Department of Education Policy & Social Analysis at Teachers College, Columbia University and a researcher on the CAP Project.

Anna Kushner is a doctoral student in the Department of Education Policy & Social Analysis at Teachers College, Columbia University and a researcher for the CAP Project.

Using Cost Analysis to Inform Replicating or Scaling Education Interventions

A key challenge when conducting cost analysis as part of an efficacy study is producing information that can be useful for addressing questions related to replicability or scale. When the study is a follow-up conducted many years after the implementation, the need to collect data retrospectively introduces additional complexities. As part of a recent follow-up efficacy study, Maya Escueta and Tyler Watts of Teachers College, Columbia University worked with the IES-funded Cost Analysis in Practice (CAP) project team to plan a cost analysis that would meet these challenges. This guest blog describes their process and lessons learned and provides resources for other researchers.

What was the intervention for which you estimated costs retrospectively?

We estimated the costs of a pre-kindergarten intervention, the Chicago School Readiness Project (CSRP), which was implemented in nine Head Start Centers in Chicago, Illinois for two cohorts of students in 2004-5 and 2005-6. CSRP was an early childhood intervention that targeted child self-regulation by attempting to overhaul teacher approaches to behavioral management. The intervention placed licensed mental health clinicians in classrooms, and these clinicians worked closely with teachers to reduce stress and improve the classroom climate. CSRP showed signs of initial efficacy on measures of preschool behavioral and cognitive outcomes, but more recent results from the follow-up study showed mainly null effects for the participants in late adolescence.

The IES research centers require a cost study for efficacy projects, so we faced the distinct challenge of conducting a cost analysis for an intervention nearly 20 years after it was implemented. Our goal was to render the cost estimates useful for education decision-makers today to help them consider whether to replicate or scale such an intervention in their own context.

What did you learn during this process?

When enumerating costs and considering how to implement an intervention in another context or at scale, we learned four distinct lessons.

1. Consider how best to scope the analysis to render the findings both credible and relevant given data limitations.

In our case, because we were conducting the analysis 20 years after the intervention was originally implemented, the limited availability of reliable data—a common challenge in retrospective cost analysis—posed two challenges. We had to consider the data we could reasonably obtain and what that would mean for the type of analysis we could credibly conduct. First, because no comprehensive cost analysis was conducted at the time of the intervention’s original implementation (to our knowledge), we could not accurately collect costs on the counterfactual condition. Second, we also lacked reliable measures of key outcomes over time, such as grade retention or special education placement that would be required for calculating a complete cost-benefit analysis. This meant we were limited in both the costs and the effects we could reliably estimate. Due to these data limitations, we could only credibly conduct a cost analysis, rather than a cost-effectiveness analysis or cost-benefit analysis, which generally produce more useful evidence to aid in decisions about replication or scale.

Because of this limitation, and to provide useful information for decision-makers who are considering implementing similar interventions in their current contexts, we decided to develop a likely present-day implementation scenario informed by the historical information we collected from the original implementation. We’ll expand on how we did this and the decisions we made in the following lessons.

2. Consider how to choose prices to improve comparability and to account for availability of ingredients at scale.

We used national average prices for all ingredients in this cost analysis to make the results more comparable to other cost analyses of similar interventions that also use national average prices. This involved some careful thought about how to price ingredients that were unique to the time or context of the original implementation, specific to the intervention, or in low supply. For example, when identifying prices for personnel, we either used current prices (national average salaries plus fringe benefits) for personnel with equivalent professional experience, or we inflation-adjusted the original consulting fees charged by personnel in highly specialized roles. This approach assumes that personnel who are qualified to serve in specialized roles are available on a wider scale, which may not always be the case.
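As an illustration of the inflation-adjustment step, an original fee can be scaled by a ratio of consumer price index (CPI) values. The fee and CPI values below are placeholders for illustration, not figures from the study:

```python
# Sketch: inflation-adjusting an original consulting fee to current dollars
# using a CPI ratio. The CPI values here are placeholders, not real index data.

def inflation_adjust(original_cost, cpi_original_year, cpi_current_year):
    """Scale a historical cost into current dollars."""
    return original_cost * cpi_current_year / cpi_original_year

fee_2005 = 10_000  # hypothetical 2005 consulting fee
adjusted = inflation_adjust(fee_2005,
                            cpi_original_year=195.3,   # placeholder index
                            cpi_current_year=296.8)    # placeholder index
print(round(adjusted, 2))
```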

In the original implementation of CSRP, spaces were rented for teacher behavior management workshops, stress reduction workshops, and initial training of the mental health clinicians. For our cost analysis, we assumed that using available school facilities would be more likely and tenable when implementing CSRP at large scale. Instead of using rental prices, we valued the physical space needed to implement CSRP by using amortized construction costs of school facilities (for example, cafeteria/gym/classroom). We obtained these from the CAP Project’s Cost of Facilities Calculator.
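Amortizing a construction cost typically relies on the standard annuity formula, which spreads a one-time outlay over the asset's lifetime at an assumed interest rate. Here is a sketch with assumed construction cost, lifetime, and rate, not figures from the CSRP analysis or the CAP calculator:

```python
# Sketch: annualizing (amortizing) a facilities construction cost with the
# standard annuity formula. All inputs below are illustrative assumptions.

def annualized_cost(construction_cost, lifetime_years, interest_rate):
    """Equivalent annual cost of a durable asset."""
    r = interest_rate
    return construction_cost * r / (1 - (1 + r) ** -lifetime_years)

# e.g., a $2M classroom wing amortized over 30 years at a 3% rate
annual = annualized_cost(2_000_000, lifetime_years=30, interest_rate=0.03)
print(round(annual))  # roughly $102,000 per year
```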

3. Consider how to account for ingredients that may not be possible to scale.

Some resources are simply not available in similar quality at large scale. For example, the Principal Investigator (PI) for the original evaluation oversaw the implementation of the intervention, was highly invested in the fidelity of implementation, was willing to dedicate significant time, and created a culture that was supportive of the pre-K instructors to encourage buy-in for the intervention. In such cases, it is worth considering what her equivalent role would be in a non-research setting and how scalable this scenario would be. A potential proxy for the PI in this case may be a school principal or leader, but how much time could this person reasonably dedicate, and how similar would their skillset be?

4. Consider how implementation might work in institutional contexts required for scale.

Institutional settings might necessarily change when taking an intervention to scale. In larger-scale settings, there may be other ways of implementing the intervention that might change the quantities of personnel and other resources required. For example, a pre-K intervention such as CSRP at larger scale may need to be implemented in various types of pre-K sites, such as public schools or community-based centers in addition to Head Start centers. In such cases, the student/teacher ratio may vary across different institutional contexts, which has implications for the per-student cost. If delivered in a manner where the student/teacher ratio is higher than in the original implementation, the intervention may be less costly, but may also be less impactful. This highlights the importance of the institutional setting in which implementation is occurring, and how this might affect the use and costs of resources.
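The per-student arithmetic behind that point can be sketched in a few lines; the teacher cost and ratios below are assumptions for illustration only:

```python
# Sketch: how the student/teacher ratio drives per-student personnel cost.

def personnel_cost_per_student(teacher_cost, students_per_teacher):
    """Per-student share of one teacher's annual cost."""
    return teacher_cost / students_per_teacher

# Same assumed teacher cost, two different institutional contexts
original = personnel_cost_per_student(teacher_cost=70_000, students_per_teacher=10)
scaled = personnel_cost_per_student(teacher_cost=70_000, students_per_teacher=15)
print(original, round(scaled, 2))  # 7000.0 4666.67 -- higher ratio, lower cost
```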

How can other researchers get assistance in conducting a cost analysis?

In conducting this analysis, we found the following CAP Project tools to be especially helpful (found on the CAP Resources page and the CAP Project homepage):

  • The Cost of Facilities Calculator: A tool that helps estimate the cost of physical spaces (facilities).
  • Cost Analysis Templates: Semi-automated Excel templates that support cost analysis calculations.
  • CAP Project Help Desk: Real-time guidance from a member of the CAP Project team. You will receive help in troubleshooting challenging issues with experts who can share specific resources. Submit a help desk request by visiting this page.

Maya Escueta is a Postdoctoral Associate in the Center for Child and Family Policy at Duke University where she researches the effects of poverty alleviation policies and parenting interventions on the early childhood home environment.

Tyler Watts is an Assistant Professor in the Department of Human Development at Teachers College, Columbia University. His research focuses on the connections between early childhood education and long-term outcomes.

For questions about the CSRP project, please contact the NCER program officer, Corinne.Alfeld@ed.gov. For questions about the CAP project, contact Allen.Ruby@ed.gov.

 

Unexpected Value from Conducting Value-Added Analysis

This is the second of a two-part blog series from an IES-funded partnership project. The first part described how the process of cost-effectiveness analysis (CEA) provided useful information that led to changes in practice for a school nurse program and restorative practices at Jefferson County Public Schools (JCPS) in Louisville, KY. In this guest blog, the team discusses how the process of conducting value-added analysis provided useful program information over and above the information they obtained via CEA or academic return on investment (AROI).

Since we know you loved the last one, it’s time for another fun thought experiment! Imagine that you have just spent more than a year gathering, cleaning, assembling, and analyzing a dataset of school investments for what you hope will be an innovative approach to program evaluation. Now imagine the only thing your results tell you is that your proposed new application of value-added analysis (VAA) is not well-suited for these particular data. What would you do? Well, sit back and enjoy another round of schadenfreude at our expense. Once again, our team of practitioners from JCPS and researchers from Teachers College, Columbia University and American University found itself in a very unenviable position.

We had initially planned to use the rigorous VAA (and CEA) to evaluate the validity of a practical measure of academic return on investment for improving school budget decisions on existing school- and district-level investments. Although the three methods—VAA, CEA, and AROI—vary in rigor and address slightly different research questions, we expected that their results would be both complementary and comparable for informing decisions to reinvest, discontinue, expand/contract, or make other implementation changes to an investment. To that end, we set out to test our hypothesis by comparing results from each method across a broad spectrum of investments. Fortunately, as with CEA, the process of conducting VAA provided additional, useful program information that we would not have otherwise obtained via CEA or AROI. This unexpected information, combined with what we’d learned about implementation from our CEAs, led to even more changes in practice at JCPS.

Data Collection for VAA Unearthed Inadequate Record-keeping, Mission Drift, and More

Our AROI approach uses existing student and budget data from JCPS’s online Investment Tracking System (ITS) to compute comparative metrics for informing budget decisions. Budget request proposals submitted by JCPS administrators through ITS include information on target populations, goals, measures, and the budget cycle (1-5 years) needed to achieve the goals. For VAA, we needed similar, but more precise, data to estimate the relative effects of specific interventions on student outcomes, which required us to contact schools and district departments to gather the necessary information. Our colleagues provided us with sufficient data to conduct VAA. However, during this process, we discovered instances of missing or inadequate participant rosters; mission drift in how requested funds were actually spent; and mismatches between goals, activities, and budget cycles. We suspect that JCPS is not alone in this challenge, so we hope that what follows might be helpful to other districts facing similar scenarios.

More Changes in Practice 

The lessons learned during the school nursing and restorative practice CEAs discussed in the first blog, and the data gaps identified through the VAA process, informed two key developments at JCPS. First, we formalized our existing end-of-cycle investment review process by including summary cards for each end-of-cycle investment item (each program or personnel position in which district funds were invested) indicating where insufficient data (for example, incomplete budget requests or unavailable participation rosters) precluded AROI calculations. We asked specific questions about missing data to elicit additional information and to encourage more diligent documentation in future budget requests. 

Second, we created the Investment Tracking System 2.0 (ITS 2.0), which now requires budget requesters to complete a basic logic model. The resources (inputs) and outcomes in the logic model are auto-populated from information entered earlier in the request process, but requesters must manually enter activities and progress monitoring (outputs). Our goal is to encourage and facilitate development of an explicit theory of change at the outset and continuous evidence-based adjustments throughout the implementation. Mandatory entry fields now prevent requesters from submitting incomplete budget requests. The new system was immediately put into action to track all school-level Elementary and Secondary School Emergency Relief (ESSER)-related budget requests.
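A completeness check of the kind that mandatory entry fields enforce might look like the following sketch; the field names are hypothetical illustrations, not the actual ITS 2.0 schema:

```python
# Sketch: validating that a budget request has all logic-model fields filled
# in before submission. Field names are hypothetical, not the ITS 2.0 schema.

REQUIRED_FIELDS = ["target_population", "goals", "measures",
                   "budget_cycle_years", "activities", "progress_monitoring"]

def missing_fields(request: dict) -> list:
    """Return required fields that are absent or empty."""
    return [f for f in REQUIRED_FIELDS if not request.get(f)]

request = {
    "target_population": "Grade 9 students",
    "goals": "Improve attendance",
    "measures": "Daily attendance rate",
    "budget_cycle_years": 3,
    # 'activities' and 'progress_monitoring' left blank by the requester
}
print(missing_fields(request))  # ['activities', 'progress_monitoring']
```

A request would only be submittable once `missing_fields` returns an empty list.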

Process and Partnership, Redux

Although we agree with the IES Director’s insistence that partnerships between researchers and practitioners should be a means to (eventually) improving student outcomes, our experience shows that change happens slowly in a large district. Yet, we have seen substantial changes as a direct result of our partnership. Perhaps the most important change is the drastic increase in the number of programs, investments, and other initiatives that will be evaluable as a result of formalizing the end-of-cycle review process and creating ITS 2.0. We firmly believe these changes could not have happened apart from our partnership and the freedom our funding afforded us to experiment with new approaches to addressing the challenges we face.   


Stephen M. Leach is a Program Analysis Coordinator at JCPS and PhD Candidate in Educational Psychology Measurement and Evaluation at the University of Louisville.

Dr. Robert Shand is an Assistant Professor at American University.

Dr. Bo Yan is a Research and Evaluation Specialist at JCPS.

Dr. Fiona Hollands is a Senior Researcher at Teachers College, Columbia University.

If you have any questions, please contact Corinne Alfeld (Corinne.Alfeld@ed.gov), IES-NCER Grant Program Officer.