IES Blog

Institute of Education Sciences

Adult Foundational Skills Research: Reflections on PIAAC and Data on U.S. Adult Skills

In this blog, NCER program officer Dr. Meredith Larson interviews Dr. Holly Xie from NCES about the Program for the International Assessment of Adult Competencies (PIAAC), an OECD-developed international survey of adult skills in literacy, numeracy, and digital problem solving administered at least once a decade. PIAAC also collects information on adult activities (such as skill use at home or work and civic participation), demographics (such as level of education and race), and other factors (such as health outcomes). To date, NCER has funded three research grants (here, here, and here) and one training grant that relied on PIAAC data.

NCES has led the U.S. efforts in administering PIAAC and has been sharing results for over a decade. PIAAC Cycle I (PIAAC I) included three waves of data collection in the United States, with the first data released in 2013. From PIAAC I, we learned a wealth of information about the skills of U.S. adults. For example, the 2017 wave of data collection found that the percentages of U.S. adults performing at the lowest levels were 19 percent in literacy, 29 percent in numeracy, and 24 percent in digital problem solving. As we look forward to learning from PIAAC II, Dr. Xie reflects on the products from PIAAC I and possibilities for PIAAC II (to be released in 2024).

What is your role at NCES and with PIAAC specifically?

I am the PIAAC national program manager and oversee all aspects of the PIAAC study in the United States, including development and design, data collection, analysis and reporting, and dissemination/outreach. I also represent the United States at PIAAC international meetings.

What is something you’re particularly excited about having produced during PIAAC I?

I am most excited about the U.S. PIAAC Skills Map. The Skills Map provides information on adult skills at the state and county levels. Users can explore adult skills in literacy and numeracy in their state or county and get estimates of literacy or numeracy proficiency overall and by age and education levels. Or they can compare a county to a state, a state to the nation, or compare counties (or states) to each other. The map also has demographic and socioeconomic data from the American Community Survey (ACS) to provide context for the state or county estimates. This YouTube video demonstrates what the map can do.

 

 

We also have other PIAAC web products and publications such as national and international reports, Data Points, and PIAAC publications that provide invaluable information on U.S. adult skills and interrelationships of those skills to other social, economic, and demographic factors.

Do you have examples of how information from PIAAC I has been used?

PIAAC data cover results at the national, state, and county levels, and as such, they can be useful for policymakers or decision makers who would like to know where things stand in terms of the skills of their adult population and where they need to allocate resources at these different levels of the system. In other words, PIAAC data can be useful for drafting targeted policies and programs that will benefit their population and constituencies.

For instance, at the national level, then-Vice President Biden used information from PIAAC I in his report Ready to Work for the June 2014 reauthorization of the Workforce Innovation and Opportunity Act, known as WIOA. PIAAC was also cited in the discussion of extending the Second Chance Pell experiment in the 2019 report titled Prisoners’ Eligibility for Pell Grants: Issues for Congress.

The Digital Equity Act of 2021 also leveraged PIAAC. This legislation identifies particular populations that determine the funding formula, and the quick guide to these populations uses PIAAC to estimate one of them: individuals with a language barrier, including individuals who are English learners and have low levels of literacy.

Local governments have also used PIAAC products. For example, the Houston Mayor’s Office for Adult Literacy in collaboration with the Barbara Bush Foundation used the PIAAC Skills Map data to inform the Adult Literacy Blueprint.

The adult education advocacy group ProLiteracy also used PIAAC and the Skills Map to develop a toolkit for local adult education and adult literacy program advocacy.

When will the results of PIAAC II be available, and how does this cycle differ from PIAAC I?

PIAAC II data collection began in 2022, and results, to be released in December 2024, will include information on the literacy, numeracy, and adaptive problem-solving skills of adults in the United States. The numeracy assessment now includes a measure of “numeracy components,” which focus on number sense, smaller/bigger number values, measurement, etc. This information will help us learn more about the skills of adults who have very low numeracy skills. The adaptive problem-solving component is a new PIAAC module and will measure the ability to achieve one’s goals in a dynamic situation in which a method for reaching a solution is not directly available.

PIAAC II will also include, for the first time, questions about financial literacy in the background questionnaire, using items on managing money and tracking spending and income, savings methods, and budgeting. These additional questions will allow people to explore relationships between foundational skills, financial literacy, and other constructs in PIAAC.

What types of research could you imagine stemming from the PIAAC II?

One of the unique features of PIAAC (both PIAAC I and II) is the direct assessment of literacy, numeracy, and problem-solving skills (information that no other large-scale assessment of adults provides). Thirty-one countries, including the United States, participated in PIAAC II (2022/23), so researchers will be able to compare adult skills internationally and also study trends between PIAAC I and PIAAC II.

It’s worth noting that the data collection took place while we were still experiencing the effects of the COVID-19 pandemic. This may provide researchers opportunities to explore how the pandemic is related to adults’ skills, health, employment, training, and education status.

Where can the public access data from PIAAC?

Researchers can find information about the available data from the national U.S. PIAAC 2017 Household, PIAAC 2012/14 Household, and PIAAC 2014 Prison datasets, and international and trend datasets on the NCES Data Files page. PIAAC restricted-use data files contain more detailed information, such as continuous age and earnings variables, that can be used for more in-depth analysis. Accessing the restricted-use data requires a restricted-use license from NCES.

NCES also has an easy-to-use online analysis tool: the International Data Explorer (IDE). The IDE allows users to work directly with the PIAAC data and produce their own analyses, tables, regressions, and charts. An IDE tutorial video provides comprehensive, step-by-step instructions on how to use this tool. It contains detailed information about the content and capabilities of the PIAAC IDE, as well as how the PIAAC data are organized in the tool.


This blog was produced by Meredith Larson (Meredith.Larson@ed.gov), research analyst and program officer for postsecondary and adult education, NCER.

Highlights From the FY 21 Revenues and Expenditures for Public Elementary and Secondary Education Report

NCES recently released a finance tables report, Revenues and Expenditures for Public Elementary and Secondary Education: FY 21 (NCES 2023-301), which draws from data in the National Public Education Financial Survey (NPEFS). To accompany the report, NCES has updated the interactive data visualization tool to highlight the per pupil revenues and expenditures (adjusted for inflation) and average daily attendance (ADA) trends from the fiscal year 2021 (FY 21) NPEFS.

This tool allows users to see national or state-specific per pupil amounts and year-to-year percentage changes for both total revenue and current expenditures by using a slider to toggle between the two variables. Total revenues are shown by source, and total current expenditures are shown by function and subfunction. Clicking on a state in the map will display data for the selected state in the bar charts.

The tool also allows users to see the ADA for each state. It is sortable by state, ADA amount, and percentage change. It may also be filtered to easily compare selected states. Hovering over the ADA of a state will display another bar graph with the last 3 years of ADA data.

Overall, the results show that spending[1] on elementary and secondary education increased in school year 2020–21 (FY 21). This is the eighth consecutive year of year-over-year increases in education spending (since FY 13), after adjusting for inflation. These increases follow declines in year-over-year spending in the prior 4 years (FY 10 through FY 13).

 

Revenues

The 50 states and the District of Columbia reported $837.3 billion in revenues collected for public elementary and secondary education in FY 21. State and local governments provided $748.9 billion, or 89.4 percent of all revenues. The federal government contributed $88.4 billion, or 10.6 percent of all revenues. Total revenues increased by 3.0 percent after adjusting for inflation[2] (from $812.8 to $837.3 billion) from FY 20 to FY 21; local revenues remained essentially unchanged (at $365.1 billion); state revenues decreased by 0.6 percent (from $385.9 to $383.8 billion); and federal revenues increased by 43.2 percent (from $61.8 to $88.4 billion).
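These year-over-year comparisons are simple percent changes computed on inflation-adjusted totals. As a minimal sketch using the rounded, constant-dollar figures quoted above (small discrepancies from the published percentages, such as 43.0 versus 43.2 percent for federal revenues, come from rounding the totals to the nearest $0.1 billion):

```python
def pct_change(prior, current):
    """Percent change from prior-year to current-year constant dollars."""
    return (current - prior) / prior * 100

# Rounded constant-dollar totals, in billions of FY 21 dollars
total_fy20, total_fy21 = 812.8, 837.3
federal_fy20, federal_fy21 = 61.8, 88.4

print(round(pct_change(total_fy20, total_fy21), 1))      # 3.0
print(round(pct_change(federal_fy20, federal_fy21), 1))  # 43.0
```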

Total revenues per pupil averaged $17,015 on a national basis in FY 21. This reflects an increase of 5.9 percent between FY 20 and FY 21 and follows an increase of 1.5 percent from FY 19 to FY 20. The percentage change in revenues per pupil from FY 20 to FY 21 ranged from an increase of 15.3 percent in Maine to a decrease of 4.2 percent in Hawaii.


Image of NPEFS data visualization site showing revenues per pupil for public elementary and secondary schools in FY 20 and FY 21


Revenues from COVID-19 Federal Assistance Funds for public elementary and secondary education totaled $25.3 billion, or 28.6 percent of all federal revenues.

  • Revenues from the Federal Coronavirus Relief Fund accounted for $8.9 billion, or 35.2 percent of total revenues from COVID-19 Federal Assistance Funds.
     
  • Revenues from the Elementary and Secondary School Emergency Relief (ESSER I) Fund accounted for $8.5 billion, or 33.7 percent of total revenues from COVID-19 Federal Assistance Funds.
     
  • Revenues from the Elementary and Secondary School Emergency Relief (ESSER II) Fund accounted for $6.5 billion, or 25.8 percent of total revenues from COVID-19 Federal Assistance Funds.

 

Expenditures

Current expenditures for public elementary and secondary education across the nation increased by 0.7 percent between FY 20 and FY 21 (from $698.3 to $703.5 billion). Within that increase, expenditures for instruction increased by 1.1 percent between FY 20 and FY 21 (from $422.4 to $427.1 billion), and student support expenditures increased by 3.6 percent between FY 20 and FY 21 (from $44.0 to $45.6 billion).

Current expenditures per pupil for the day-to-day operation of public elementary and secondary schools were $14,295 in FY 21, an increase of 3.5 percent from FY 20.[3] In FY 21, education spending was 16.7 percent higher than at its lowest point following the Great Recession, in FY 13.


Figure 1. National inflation-adjusted current expenditures per pupil for public elementary and secondary education: Fiscal years 2012 through 2021

 

NOTE: Spending is reported in constant FY 21 dollars, based on the Consumer Price Index (CPI). National totals include the 50 states and the District of Columbia. California did not report prekindergarten membership in the State Nonfiscal Survey of Public Elementary/Secondary Education. California reported prekindergarten expenditures separately, and these expenditures were excluded from the amounts reported in this figure.
SOURCE: U.S. Department of Education, National Center for Education Statistics, Common Core of Data (CCD), “National Public Education Financial Survey,” fiscal years 2012 through 2020, Final Version 2a; and fiscal year 2021, Provisional Version 1a; and Digest of Education Statistics 2021, table 106.75. Retrieved March 9, 2023, from nces.ed.gov/programs/digest/d21/tables/dt21_106.75.asp.


Without making adjustments for geographic cost differences, current expenditures per pupil ranged from $9,014 in Utah to $26,097 in New York. In addition to New York, current expenditures per pupil were highest in the District of Columbia ($25,113), Vermont ($24,050), New Jersey ($22,784), and Connecticut ($22,216). In addition to Utah, current expenditures per pupil were lowest in Idaho ($9,054), Arizona ($9,571), Mississippi ($10,060), and Nevada ($10,073). The states with the largest increases in current expenditures per pupil from FY 20 to FY 21 were Maine (11.9 percent), Arizona (7.6 percent), Montana (7.4 percent), Louisiana (7.3 percent), and Massachusetts (6.6 percent).


Image of NPEFS data visualization site showing current expenditures per pupil for public elementary and secondary schools in FY 20 and FY 21


In FY 21, salaries and wages ($389.2 billion) in conjunction with employee benefits ($169.7 billion) accounted for 79.4 percent ($558.8 billion) of current expenditures for public elementary and secondary education. Expenditures for instruction and instructional staff support services comprised 65.8 percent ($462.9 billion) of total current expenditures.

Between FY 20 and FY 21, total expenditures increased by 0.2 percent (from $812.3 to $813.6 billion). Of the $813.6 billion in total expenditures in FY 21, 86.5 percent were current expenditures, 9.8 percent were capital outlay expenditures, 2.7 percent were interest on debt, and 1.1 percent were expenditures for other programs.

Current expenditures from federal Title I grants for economically disadvantaged students (including carryover expenditures) accounted for $16.3 billion, or 2.3 percent of current expenditures for public elementary and secondary education at the national level in FY 21. Nationally, Title I expenditures per pupil averaged $331 and ranged from $123 in Utah to $874 in New York.

Current expenditures paid from COVID-19 Federal Assistance Funds for public elementary and secondary education totaled $24.2 billion for the 50 states and the District of Columbia. Of these, instructional expenditures accounted for $13.7 billion, or 56.5 percent of current expenditures paid from COVID-19 Federal Assistance Funds, and support services expenditures accounted for $9.1 billion, or 37.6 percent of current expenditures paid from COVID-19 Federal Assistance Funds.

To explore data on public elementary and secondary revenues, expenditures, and ADA, check out our new data visualization tool.

Be sure to follow NCES on Twitter, Facebook, LinkedIn, and YouTube and subscribe to the NCES News Flash to stay up-to-date on the latest from the National Public Education Financial Survey.

 

By Stephen Q. Cornman, NCES, and Malia Howell and Jeremy Phillips, U.S. Census Bureau

 


[1] Spending refers to current expenditures. Current expenditures are composed of expenditures for the day-to-day operation of schools and school districts for public elementary and secondary education, including expenditures for staff salaries and benefits, supplies, and purchased services. Current expenditures include instruction, instruction-related, support services (e.g., social work, health, and psychological services), and other elementary/secondary current expenditures but exclude expenditures on capital outlay, other programs, and interest on long-term debt.

[2] Throughout this blog post, all comparisons between years are adjusted for inflation by converting the figures to constant dollars. Inflation adjustments utilize the Consumer Price Index (CPI) published by the U.S. Department of Labor, Bureau of Labor Statistics. For comparability to fiscal education data, NCES adjusts the CPI from a calendar year to a school fiscal year basis (July through June). See Digest of Education Statistics 2021, table 106.70.
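A sketch of that adjustment, with hypothetical index values (not actual BLS data): a school-fiscal-year CPI averages the twelve monthly values from July through June, and a nominal figure is restated in constant base-year dollars by scaling by the ratio of the base year's index to that year's index.

```python
def fiscal_year_cpi(monthly_cpi):
    """Average monthly CPI over a July-June school fiscal year."""
    assert len(monthly_cpi) == 12
    return sum(monthly_cpi) / 12

def to_constant_dollars(nominal, cpi_year, cpi_base):
    """Restate a nominal amount in constant dollars of the base year."""
    return nominal * cpi_base / cpi_year

# Hypothetical fiscal-year index values, for illustration only
cpi_fy20, cpi_fy21 = 258.0, 262.0

# $800 billion nominal in FY 20, expressed in constant FY 21 dollars
print(round(to_constant_dollars(800.0, cpi_fy20, cpi_fy21), 1))  # 812.4
```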

[3] Per pupil expenditures are calculated using student membership derived from the State Nonfiscal Survey of Public Elementary/Secondary Education. In some states, adjustments are made to ensure consistency between membership and reported fiscal data. More information on these adjustments can be found in the data file documentation.

Recipe for High-Impact Research

The Edunomics Lab at Georgetown University’s McCourt School of Public Policy has developed NERD$, a national school-by-school spending data archive of per-pupil expenditures using the financial data states publish as required by the Every Student Succeeds Act (ESSA). In this guest blog post, Laura Anderson, Ash Dhammani, Katie Silberstein, Jessica Swanson, and Marguerite Roza of the Edunomics Lab discuss what they have learned about making their research more usable by practitioners.

 

When it comes to getting research and data used, it’s not just a case of “build it and they will come” (apologies to the movie “Field of Dreams”). In our experience, we’ve found that state, district, and school leaders want—and need—help translating data and research findings to inform decision making.

Researchers frequently use our IES-funded school-by-school spending archive called NERD$: National Education Resource Database on Schools. But we knew the data could have immediate, effective, and practical use for education leaders as well, to help them make spending decisions that advance equity and leverage funds to maximize student outcomes. Funding from the U.S. Department of Education’s National Comprehensive Center enabled us to expand on the IES-funded NERD$ by building the School Spending & Outcomes Snapshot (SSOS), a customizable, research-tested data visualization tool. We published a related guide on leading productive conversations on resource equity and outcomes and conducted numerous trainings for federal, state, and local leaders on using SSOS. The data visualizations we created drew on more than two years of pilot efforts with 26 school districts to find what works best to drive strategic conversations.

 

 

We see this task of translating research to practice as an essential element of our research efforts. Here, we share lessons learned from designing data tools with end users in mind, toward helping other researchers maximize the impact of their own work.

Users want findings converted into user-friendly data visualizations. Before seeing the bar chart below, leaders of Elgin Area School District U-46 in Illinois did not realize that they were not systematically allocating more money per student to schools with higher shares of economically disadvantaged students. It was only when they saw their schools lined up from lowest to highest by per pupil spending and color coded by the share of economically disadvantaged students (with green low and red high) that they realized their allocations were all over the map.

 

Note. This figure provides two pieces of information on schools in Elgin Area School District U-46: each school’s per pupil spending in dollars and the share of its students categorized as economically disadvantaged (from 0 to 100%, color coded from green to red). The schools are ordered from lowest to highest per pupil spending. Ordered this way, schools with more economically disadvantaged students show no pattern; they fall across the full spectrum from low- to high-spending schools. The figure shows there is little correlation between a school’s per pupil spending and the share of economically disadvantaged students it serves, making that lack of relationship easier for users to see.

 

Users want research converted into locally relevant numbers. Embedding district-by-district figures into visualizations takes effort but pays off. Practitioners and decisionmakers can identify what the research means for their own context, making the data more immediately actionable in their local community.

That means merging lots of data for users. We merged demographic, spending, and outcomes data for easy one-stop access in the SSOS tool. In doing so, users could then do things like compare peer schools with similar demographics and similar per-student spending levels, surfacing schools that have been able to do more for students with the same amount of money. Sharing with lower-performing schools what those standout schools are doing can open the door for peer learning toward improving schooling.

Data displays need translations to enable interpretation. In our pilot effort, we learned that at first glance, the SSOS-generated scatterplot below could be overwhelming or confusing. In focus groups, we found that by including translation statements, such as labeling the four quadrants clearly, the information became more quickly digestible.

 

Note. This figure provides three pieces of information on schools in Elgin Area School District U-46. It shows the spending per pupil for each school in dollars, the share of the students in each school who are categorized as economically disadvantaged from 0 to 100% using the colors green to red, and the achievement level of each school based on a composite of its students’ math and reading scores. Schools are placed into 1 of 4 categories on the figure, and a translation statement is put in each category to make clear what each category represents. These four translation statements are: 1) spend fewer dollars than peers but get higher student outcomes, 2) spend fewer dollars than peers but get lower student outcomes, 3) spend more dollars than peers but get higher student outcomes, and 4) spend more dollars than peers but get lower student outcomes. These translation statements were found to make it easier for users to understand the data presented in the figure.

 

Short webinar trainings (like this one on SSOS) greatly enhanced usage. Users seem willing to come to short tutorials (preloaded with their data). Recording these tutorials meant that attendees could share them with their teams.

Users need guidance on how and when to use research findings. We saw usage increase when leaders were given specific suggestions on when and where to bring their data. For instance, we advised that school board members could bring NERD$ data to early stage budget workshops held in the spring. That way the data could inform spending decisions before district budgets get finalized and sent to the full board for approval in May or June.

It's worth the extra effort to make research usable and useful. These efforts to translate data and findings and make them accessible for end users have helped make the federally supported school-by-school spending dataset an indispensable resource for research, policy, and practice. NERD$ makes transparent how much money each school gets from its district. SSOS helps move the conversation beyond the “how much” into “what is the money doing” for student outcomes and equity, encouraging stakeholders to dig into how spending patterns are or are not related to performance. State education agencies are using the displays to conduct ESSA-required resource allocation reviews in districts that serve low-performing schools. The tool has more than 5,000 views, and we have trained more than 2,000 education leaders on how to use the data displays to improve schooling.

IES has made clear it wants research to be used, not to sit on a shelf. In our experience, designing visualizations and other tools around user needs can make data accessible, actionable, and impactful.


This blog was produced by Allen Ruby (Allen.Ruby@ed.gov), Associate Commissioner, NCER.

The 2023 IES PI Meeting: Building on 20 Years of IES Research to Accelerate the Education Sciences

On May 16-18, 2023, NCER and NCSER hosted our second virtual Principal Investigators (PI) Meeting. Our theme this year was Building on 20 Years of IES Research to Accelerate the Education Sciences. Because this past year marked the 20th anniversary of IES, we used this meeting as an opportunity to reflect on and celebrate the success of IES and the education research community. Another goal was to explore how IES can further advance the education sciences and improve education outcomes for all learners.

Roddy Theobald (American Institutes for Research) and Eunsoo Cho (Michigan State University) graciously agreed to be our co-chairs this year. They provided guidance on the meeting theme and session strands and also facilitated our plenary sessions on Improving Data on Teachers and Staffing Challenges to Inform the Next 20 Years of Teacher Workforce Policy and Research and the Disproportionate Impact of COVID-19 on Student Learning and Contributions of Education Sciences to Pandemic Recovery Efforts. We want to thank them for their incredible efforts in making this year’s meeting a big success!

Here are a few highlights:

The meeting kicked off with opening remarks from IES Director Mark Schneider and a welcome from the Secretary of Education, Miguel Cardona. Director Schneider spoke about the importance of timeliness of research and translation of evidence to practice. IES is thinking about how best to support innovative approaches to education research that are transformative, embrace failure, have quick turnaround, and have an applied focus. He also discussed the need for data to move the field forward, specifically big data that researchers can use to address important policy questions and improve interventions and education outcomes. Secretary Cardona acknowledged the robust and useful evidence base that IES-funded researchers have generated over the last 20 years and emphasized the need for continued research to address historic inequities and accelerate pandemic recovery for students.

This year’s meeting fostered connections and facilitated deep conversations around meaningful and relevant topic areas. Across the three-day PI Meeting, we had over 1,000 attendees engaged in virtual room discussions around four main topic areas (see the agenda for a complete list of this year’s sessions):

  • Diversity, Equity, Inclusion, and Accessibility (DEIA)—Sessions addressed DEIA in education research
  • Recovering and Learning from the COVID-19 Pandemic—Sessions discussed accelerating pandemic recovery for students and educators, lessons learned from the pandemic, and opportunities to implement overdue changes to improve education
  • Innovative Approaches to Education Research—Sessions focused on innovative, forward-looking research ideas, approaches, and methods to improve education research in both the short- and long-term
  • Making Connections Across Disciplines and Communities—Sessions highlighted connections between research and practice communities and between researchers and projects across different disciplines and methodologies

We also had several sessions focused on providing information and opportunities to engage with IES leadership, including the NCER Commissioner’s Welcome; the NCSER Acting Commissioner’s Welcome; Open Science and IES; NCEE at 20: Past Successes and Future Directions; and The IES Scientific Review Process: Overview, Common Myths, and Feedback.

Many sessions also had a strong focus on increasing the practical impacts of education research by getting research into the hands of practitioners and policymakers. For example, the session on Beyond Academia: Navigating the Broader Research-Practice Pipeline highlighted the unique challenges of navigating the pipeline of information that flows between researchers and practitioners and identified strategies that researchers could implement in designing, producing, and publishing research-based products that are relevant to a broad audience. The LEARNing to Scale: A Networked Initiative to Prepare Evidence-Based Practices & Products for Scaling and The Road to Scale Up: From Idea to Intervention sessions centered around challenges and strategies for scaling education innovations from basic research ideas to applied and effective interventions. Finally, the Transforming Knowledge into Action: An Interactive Discussion focused on identifying and capturing ways to strengthen dissemination plans and increase the uptake of evidence-based resources and practices.

We ended the three-day meeting with trivia and a celebration. Who was the first Commissioner of NCSER? Which program officer started the same day the office closed because of the pandemic? Which program officer has dreams of opening a bakery? If you want to know the answers to these questions and more, we encourage you to look at the Concluding Remarks.  

Finally, although we weren’t in person this year, we learned from last year’s meeting that a real benefit of a virtual PI meeting is our ability to record all the sessions and share them with the public. Part of IES’s mission is to widely disseminate IES-supported research. We encourage you to watch the recorded sessions and would be grateful if you shared them with your networks.

We want to thank the attendees who made this meeting so meaningful and engaging. This meeting would not have been a success without your contributions. We hope to see our grantees at the next PI Meeting, this time in-person!

If you have any comments, questions, or suggestions for how we can further advance the education sciences and improve education outcomes for all learners, please do not hesitate to contact NCER Commissioner Liz Albro (Elizabeth.Albro@ed.gov) or NCSER Acting Commissioner Jackie Buckley (Jacquelyn.Buckley@ed.gov). We look forward to hearing from you.

 

Have a Cost Analysis to Plan or Execute? We Have a Module for That

This blog is part of a guest series by the Cost Analysis in Practice (CAP) project team.

Analyzing an intervention’s costs is one of IES’s nine SEER principles. Cost analysis is not just about the dollar value of an intervention; it provides key information to education decision-makers about the personnel, materials, facilities, and other inputs needed to implement an intervention or policy with fidelity. But planning and executing any kind of economic evaluation, such as a cost analysis or cost-effectiveness analysis, involves many steps.

The IES-funded Cost Analysis in Practice Project (CAP Project) has developed a series of five free, online modules on cost analysis. Each module includes a sequence of short videos (3‑17 minutes each) and resources to facilitate each of the four main stages of a cost analysis: study design, data collection, data analysis, and reporting (register here for the CAP Project online modules).

The modules are timely for anyone submitting a grant application to the IES FY 2024 grant programs that require a cost analysis. In addition, cost studies are included in the Education Innovation and Research (EIR) Mid-phase or Expansion grants. For your grant application, you’ll likely only need parts of Modules 1 and 2, Introduction to Cost Analysis and Designing a Cost Analysis. You can save the rest for when you receive a grant.

You should review the IES Request for Applications (RFA) to determine what kind of economic evaluation, if any, is required for your IES application. You can also review the CAP Project’s RFA requirements chart, which summarizes our take on what is required and what is recommended for each IES RFA. If your grant application does not require a cost analysis but you want to include one, we created a flowchart to help you decide which type of evaluation might make sense for your situation: see Module 1 Video 2b. We also provide a brief example of each kind of economic evaluation in Module 1 Video 3. 

If cost analysis is new to you, Module 1 Video 1 explains what “costs” really are. Module 1 Video 2a introduces the ingredients method and a demonstration of why it’s important to differentiate between economic costs and expenditures. Module 1 Video 4 walks you through the four stages of a cost analysis and points out when to use specific CAP Project resources such as our Checklist for Cost Analysis Plans, Timeline of Activities for Cost Analysis, and Cost Analysis Templates (the “CAPCATs”). If you prefer reading to watching videos, our Cost Analysis Standards & Guidelines cover this ground in more depth.
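The logic behind the ingredients method can be sketched in a few lines: list every ingredient needed to deliver the intervention, price each one at its economic cost (whether or not it shows up as an expenditure), and sum quantity times price, often reported per pupil. The ingredients, quantities, and prices below are hypothetical, for illustration only:

```python
# Hypothetical ingredients for a tutoring intervention (illustration only)
ingredients = [
    # (ingredient, quantity, unit price in dollars)
    ("tutor hours", 1200, 30.0),
    ("coordinator time (hours)", 200, 45.0),
    ("materials (per-student sets)", 150, 20.0),
]

total_cost = sum(qty * price for _, qty, price in ingredients)
students_served = 150
cost_per_pupil = total_cost / students_served

print(total_cost)      # 48000.0
print(cost_per_pupil)  # 320.0
```

A full analysis would also annualize facilities and equipment and prorate shared personnel across programs, which is exactly where economic costs diverge from expenditures.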

When you’re ready to plan your cost or cost-effectiveness analysis, head to Module 2. The introductory video (Module 2 Video 1) discusses a few critical decisions you need to make early on that will affect how much of your study budget should be dedicated to the economic evaluation—no one likes surprises there. Module 2 Videos 2 and 3 walk you through the design of an economic evaluation, illustrating each design feature using Reading Recovery as an example. Module 2 Video 4 presents a few scenarios to help you think about which costs you will estimate and how the costs of the intervention you plan to study compare to the costs of business as usual. Module 2 Video 5 reviews a timeline and key activities for each stage of your economic evaluation. The content in Modules 1 and 2 should help you develop a robust plan for an economic evaluation so that you’ll be all set to begin the study as soon as you are funded.

Modules 3-5 cover data collection, analysis, and reporting. You may want to skim these now, or at least watch the brief introductory videos for an overview of what’s in store for you and your cost analyst. These modules can help you execute your cost study.


Fiona Hollands is the Founder & Managing Director of EdResearcher. She studies the effectiveness and costs of educational programs with the goal of helping education practitioners and policymakers optimize the use of resources in education to promote better student outcomes.

Jaunelle Pratt-Williams is a Senior Research Scientist at NORC at the University of Chicago. She leads economic evaluations and mixed-methods policy research studies to improve the educational opportunities for historically underserved students.

This blog was produced by Allen Ruby (Allen.Ruby@ed.gov), Associate Commissioner, NCER.