IES Blog

Institute of Education Sciences

Using Cost Analysis to Inform Replicating or Scaling Education Interventions

A key challenge when conducting cost analysis as part of an efficacy study is producing information that can be useful for addressing questions related to replicability or scale. When the study is a follow-up conducted many years after the implementation, the need to collect data retrospectively introduces additional complexities. As part of a recent follow-up efficacy study, Maya Escueta and Tyler Watts of Teachers College, Columbia University worked with the IES-funded Cost Analysis in Practice (CAP) project team to plan a cost analysis that would meet these challenges. This guest blog describes their process and lessons learned and provides resources for other researchers.

What was the intervention for which you estimated costs retrospectively?

We estimated the costs of a pre-kindergarten intervention, the Chicago School Readiness Project (CSRP), which was implemented in nine Head Start Centers in Chicago, Illinois for two cohorts of students in 2004-05 and 2005-06. CSRP was an early childhood intervention that targeted child self-regulation by attempting to overhaul teacher approaches to behavioral management. The intervention placed licensed mental health clinicians in classrooms, and these clinicians worked closely with teachers to reduce stress and improve the classroom climate. CSRP showed signs of initial efficacy on measures of preschool behavioral and cognitive outcomes, but more recent results from the follow-up study showed mainly null effects for the participants in late adolescence.

The IES research centers require a cost study for efficacy projects, so we faced the distinct challenge of conducting a cost analysis for an intervention nearly 20 years after it was implemented. Our goal was to render the cost estimates useful for education decision-makers today to help them consider whether to replicate or scale such an intervention in their own context.

What did you learn during this process?

When enumerating costs and considering how to implement an intervention in another context or at scale, we learned four distinct lessons.

1. Consider how best to scope the analysis to render the findings both credible and relevant given data limitations.

In our case, because we were conducting the analysis 20 years after the intervention was originally implemented, the limited availability of reliable data—a common issue in retrospective cost analysis—posed two challenges. We had to consider what data we could reasonably obtain and what that would mean for the type of analysis we could credibly conduct. First, because no comprehensive cost analysis was conducted at the time of the intervention’s original implementation (to our knowledge), we could not accurately collect costs for the counterfactual condition. Second, we lacked reliable measures of key outcomes over time, such as grade retention or special education placement, that would be required to calculate a complete cost-benefit analysis. This meant we were limited in both the costs and the effects we could reliably estimate. Due to these data limitations, we could only credibly conduct a cost analysis, rather than a cost-effectiveness analysis or cost-benefit analysis, which generally produce more useful evidence to aid in decisions about replication or scale.

Because of this limitation, and to provide useful information for decision-makers who are considering implementing similar interventions in their current contexts, we decided to develop a likely present-day implementation scenario informed by the historical information we collected from the original implementation. We’ll expand on how we did this and the decisions we made in the following lessons.

2. Consider how to choose prices to improve comparability and to account for availability of ingredients at scale.

We used national average prices for all ingredients in this cost analysis to make the results more comparable to other cost analyses of similar interventions that also use national average prices. This involved some careful thought about how to price ingredients that were unique to the time or context of the original implementation, specific to the intervention, or in low supply. For example, when identifying prices for personnel, we either used current prices (national average salaries plus fringe benefits) for personnel with equivalent professional experience, or we inflation-adjusted the original consulting fees charged by personnel in highly specialized roles. This approach assumes that personnel who are qualified to serve in specialized roles are available on a wider scale, which may not always be the case.
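To illustrate the inflation adjustment mentioned above, here is a minimal sketch in Python; the fee amount, index values, and years are hypothetical placeholders rather than figures from the CSRP study, and an actual adjustment would use published price index values for the relevant years.

```python
# Minimal sketch of adjusting a historical price to current dollars with a price index ratio.
# All values below are hypothetical placeholders for illustration only.

def inflation_adjust(original_cost: float, index_original_year: float, index_current_year: float) -> float:
    """Scale a historical cost to current dollars using the ratio of price index values."""
    return original_cost * (index_current_year / index_original_year)

# Example: a consulting fee paid in 2005, expressed in current dollars.
fee_2005 = 5_000.00   # hypothetical fee
cpi_2005 = 195.3      # placeholder index value for 2005
cpi_now = 292.7       # placeholder index value for the current year

print(f"Adjusted fee: ${inflation_adjust(fee_2005, cpi_2005, cpi_now):,.2f}")
```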

In the original implementation of CSRP, spaces were rented for teacher behavior management workshops, stress reduction workshops, and initial training of the mental health clinicians. For our cost analysis, we assumed that using available school facilities would be more likely and tenable when implementing CSRP at large scale. Instead of using rental prices, we valued the physical space needed to implement CSRP by using amortized construction costs of school facilities (for example, a cafeteria, gym, or classroom). We obtained these from the CAP Project’s Cost of Facilities Calculator.
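As a rough sketch of what the amortization step involves (the Cost of Facilities Calculator handles these details for you; the construction cost, interest rate, lifespan, and usage share below are hypothetical placeholders, and the formula shown is the standard annualization used in cost analysis rather than the calculator's exact internals):

```python
# Hypothetical sketch of annualizing (amortizing) a facility's construction cost.
# All figures are placeholders; the CAP Cost of Facilities Calculator applies its own defaults.

def annualized_cost(construction_cost: float, interest_rate: float, lifetime_years: int) -> float:
    """Spread a one-time construction cost over its useful life (standard annualization formula)."""
    r, n = interest_rate, lifetime_years
    return construction_cost * (r * (1 + r) ** n) / ((1 + r) ** n - 1)

# Example: a $2,000,000 classroom wing, 3% interest rate, 50-year lifespan,
# with 10% of its capacity used for the intervention.
annual = annualized_cost(2_000_000, 0.03, 50)
print(f"Annualized cost: ${annual:,.0f}; share charged to the intervention: ${annual * 0.10:,.0f}")
```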

3. Consider how to account for ingredients that may not be possible to scale.

Some resources are simply not available in similar quality at large scale. For example, the Principal Investigator (PI) for the original evaluation oversaw the implementation of the intervention, was highly invested in the fidelity of implementation, was willing to dedicate significant time, and created a culture that was supportive of the pre-K instructors to encourage buy-in for the intervention. In such cases, it is worth considering what the equivalent of the PI’s role would be in a non-research setting and how scalable this scenario would be. A potential proxy for the PI in this case may be a school principal or leader, but how much time could this person reasonably dedicate, and how similar would their skill set be?

4. Consider how implementation might work in institutional contexts required for scale.

Institutional settings might necessarily change when taking an intervention to scale. In larger-scale settings, there may be other ways of implementing the intervention that might change the quantities of personnel and other resources required. For example, a pre-K intervention such as CSRP at larger scale may need to be implemented in various types of pre-K sites, such as public schools or community-based centers in addition to Head Start centers. In such cases, the student/teacher ratio may vary across different institutional contexts, which has implications for the per-student cost. If delivered in a manner where the student/teacher ratio is higher than in the original implementation, the intervention may be less costly, but may also be less impactful. This highlights the importance of the institutional setting in which implementation is occurring, and how this might affect the use and costs of resources.
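To make the ratio point concrete, here is a small back-of-the-envelope sketch; the classroom-level cost and ratios are hypothetical, not CSRP figures:

```python
# Hypothetical illustration: a fixed classroom-level cost (e.g., clinician time) spread
# over more students lowers the per-student cost even though the total cost is unchanged.

classroom_level_cost = 30_000.00  # placeholder annual cost per classroom

for students_per_teacher in (10, 15, 20):
    per_student = classroom_level_cost / students_per_teacher
    print(f"{students_per_teacher} students per teacher -> ${per_student:,.0f} per student")
```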

How can other researchers get assistance in conducting a cost analysis?

In conducting this analysis, we found the following CAP Project tools to be especially helpful (found on the CAP Resources page and the CAP Project homepage):

  • The Cost of Facilities Calculator: A tool that helps estimate the cost of physical spaces (facilities).
  • Cost Analysis Templates: Semi-automated Excel templates that support cost analysis calculations.
  • CAP Project Help Desk: Real-time guidance from a member of the CAP Project team. You will receive help troubleshooting challenging issues from experts who can share specific resources. Submit a help desk request by visiting this page.

Maya Escueta is a Postdoctoral Associate in the Center for Child and Family Policy at Duke University where she researches the effects of poverty alleviation policies and parenting interventions on the early childhood home environment.

Tyler Watts is an Assistant Professor in the Department of Human Development at Teachers College, Columbia University. His research focuses on the connections between early childhood education and long-term outcomes.

For questions about the CSRP project, please contact the NCER program officer, Corinne.Alfeld@ed.gov. For questions about the CAP project, contact Allen.Ruby@ed.gov.

 

Promoting Equitable and Sustainable Behavioral Interventions in Early Childhood

The Postdoctoral Research Training Program in Special Education and Early Intervention is designed to prepare scientists to conduct rigorous, practice-relevant research to advance the fields of special education and early intervention. Dr. Jun Ai recently completed an IES postdoctoral fellowship at the University of Kansas and is currently an assistant research professor at the University of Northern Iowa. Her research focuses on the implementation of early childhood behavioral interventions, particularly for young learners with disabilities and those from minoritized communities. We recently caught up with Dr. Ai to learn more about her career, the experiences that have shaped it, and how her work addresses equity and inclusion in early intervention. This is what she shared with us.

How did you begin your career journey as an education researcher?

My research focuses on the equitable and sustainable implementation of early childhood positive behavioral interventions and supports (EC-PBIS) to promote the social-emotional and behavioral health of all children, especially those with disabilities and/or from minoritized groups. Before starting my PhD program, I was a special education teacher working with students with autism spectrum disorders in China. That’s when I learned about applied behavioral science and PBIS. I decided to become a board-certified behavior analyst (BCBA) during my doctoral studies at the University of Kansas. Through my BCBA practicum, I worked with young children with disabilities and challenging behaviors in self-contained settings.

Meanwhile, I was also supervising pre-service teachers and behavioral analysts working in inclusive early care and education settings where behavior issues were addressed through multi-tiered EC-PBIS. These experiences deepened my interest in EC-PBIS and led me to research how to prepare professionals to use multi-tiered EC-PBIS to promote foundational social-emotional competence and prevent challenging behaviors for all children, regardless of their abilities or forms of diversity. Most importantly, I study how equitable and sustainable implementation of EC-PBIS can reduce racial disciplinary disparities to eventually eliminate suspension and expulsion in early care and education. Through my dissertation and NCSER-funded postdoctoral fellowship at Juniper Gardens Children’s Project at the University of Kansas, I led multiple independent research projects in these areas. With support from my mentors, Judith Carta, Kathryn Bigelow, and Jay Buzhardt, I also had the opportunity to work on several NCSER-funded projects that address issues in EC-PBIS and the implementation of evidence-based practices.

What is the most rewarding part of your research?

Currently, I serve on the Iowa state leadership team of EC-PBIS and continue to expand my scholarship on EC-PBIS implementation through my research and teaching capacities. The most rewarding part of my work has been gaining expertise in a variety of research methodologies, especially mixed-methods research. Mixed-methods research allows me to carry out rigorous quantitative intervention studies and test hypotheses while also hearing the voices of participants and various stakeholders using trustworthy qualitative methodology, with data from each method informing the other. As a result, I can tackle complex issues related to implementing interventions in real-world settings and improve the design of interventions.

In your area of research, what do you see as the most critical areas of need to address diversity and equity and improve the relevance of education research for diverse communities of students and families?

One of the greatest needs is around diversifying the researcher leadership workforce. Higher education institutions need to prioritize recruitment, retention, and tailored support for educational researchers from historically and currently marginalized groups based on their race, ethnicity, language, sexual orientation, disabilities, and more.

Equally important is the need to increase funding resources for minority researchers whose scholarship aims to dismantle systemic racism and racial inequities in our educational systems. Researchers of color need more seats at the table to disturb the power imbalance within the research community, advocate for students and families in their own communities, and improve the relevance of education research for diverse groups.

Last but not least, the education research community at large needs to question the status quo of how to conduct research for, with, and by diverse communities.

What advice would you give to emerging scholars from underrepresented, minoritized groups who are pursuing a career in education research?

Find the research topic that gives you goosebumps. It might be hard at the beginning when research interests are highly directed by the existing research agenda of advisors or funding sources. But don’t let that feeling of butterflies go. Try to start small. It might mean stepping out of your normal circle to find mentors, allies, or funding agencies that are also excited about your mission and your research interests.

Remember that you need to be so good that nobody can ignore you. Researchers of color, especially minoritized early career scholars, still need to work multiple times harder to be seen and heard. Unfortunately, this will still be true in the foreseeable future. Find and join minority education researcher communities through professional organizations or organize your own. You are not in this alone.

While continuing to hone your craft, speak up for yourself and your community when you can. Recognize your own burdens and privileges and stand with the most oppressed. Learn about and practice how to have a voice at the table even though your culture or your lived experience told you otherwise. The work you care about can change students' and families’ lives. Your work matters. Your voice matters.

This year, Inside IES Research is publishing a series of interviews (see here, here, and here) showcasing a diverse group of IES-funded education researchers and fellows who are making significant contributions to education research, policy, and practice.

This blog was produced by Bennett Lunn (Bennett.Lunn@ed.gov), Truman-Albright Fellow, and Katie Taylor (Katherine.Taylor@ed.gov), postdoctoral training program officer at the National Center for Special Education Research.

From Disproportionate Discipline to Thriving Students: An IES Postdoc’s Mission

This year, Inside IES Research is publishing a series of blogs showcasing a diverse group of IES-funded education researchers and fellows who are making significant contributions to education research, policy, and practice. This week, Dr. Courtney Zulauf-McCurdy, an IES postdoctoral fellow at the University of Washington School Mental Health Assessment Research and Training (SMART) Center, shares her experiences and discusses her path forward.

 

My interests in child development began early on. I moved frequently for my parents’ work, so I was often seen as an outsider by the other children at the schools I attended. One school in particular had a group of “popular students” who bullied others and were particularly aggressive to peers. Often, teachers and parents would turn a blind eye to this behavior, and I became curious about how parents and educators respond to and shape child behavior.

Understanding Disparities in Early Childhood

I pursued a PhD in clinical psychology at the University of Illinois at Chicago out of a desire to advocate for children in both research and clinical practice. As a graduate student in the Social Emotional Teaching and Learning (SETL) Lab, I worked directly with parents, educators, and young children to understand how the school and home environment shape child behavior. Much of our research aimed to support teachers in improving children’s social-emotional development, but what I learned was that teachers weren’t providing equal opportunities and experiences to all children.

In particular, I became focused on an alarming disparity: disproportionate discipline. Not only are preschoolers being expelled at rates three times higher than students in K-12, but there are large discipline disparities by gender and race. In AY 2013-14, the U.S. Department of Education reported that Black children composed 19% of enrollment but 47% of those expelled. A report citing data from the 2016 U.S. Census Bureau found that children with social-emotional difficulties are 14.5 times more likely to be expelled.

During graduate school, I explored the reasons why Black boys are being disproportionately expelled and found that it was at least in part related to teachers’ biased perceptions of parents. Because of this, I became interested in developing evidence-based interventions for parents and educators to protect children from being expelled.

For my clinical internship, I specialized in integrated behavioral health at the Children’s Hospital of Philadelphia, where I provided evidence-based practices to children and families in underserved community settings. Here, I learned about behavioral interventions that improve child behavior, which work best when parents and teachers work together across home and school. However, I noticed that children of color were less likely to receive evidence-based interventions (such as classroom-based behavioral interventions or parent management training), and even when they did, parents and teachers experienced barriers to working together to implement these interventions. As a result, I shifted my focus from designing new interventions to understanding how to improve the implementation of interventions in community settings that serve young children from under-represented backgrounds.

Moving from Intervention Development to Implementation Science

As a second-year IES postdoctoral fellow at the University of Washington (UW) SMART Center, I am combining my research interests with implementation science. I am partnering with educators and parents to understand how teacher perceptions of parents and parent engagement act as implementation determinants—that is, barriers or facilitators. Together, we are learning how to reduce disparities in preschool by improving the implementation of interventions that allow for early, easy, and acceptable access to families who face the highest levels of barriers.

I have been using stakeholder-engaged processes consisting of focus groups, community advisory boards, and rapid tryouts of strategies to ensure equity by engaging the perspectives of families from under-represented minority backgrounds. Such community engagement aims to ensure that our interventions are culturally responsive and unimpeded by bias.

Through my work, I have learned that educators and parents want the best outcomes for their children but face a multitude of barriers that hinder their ability to engage. For example, preschool teachers have limited resources, face stress and burnout, and are often under-prepared and underpaid, all of which creates considerable barriers to addressing the mental health needs of young children. Likewise, parents face obstacles such as perceived bias from their child’s school and logistical barriers such as time and childcare.

Moving Forward

I will continue working directly with parents and educators to understand how we can place all young children (and their families) in the best position to thrive. I will continue to use research methods, such as community advisory boards and qualitative methods, that seek to elevate the voices of parents and educators to promote equitable child outcomes. Through continued collaboration with community partners, dissemination of my findings to parents, educators, and practitioners, and connection of research with culturally responsive early childhood practice and policies, I hope to dismantle disparities in preschool outcomes.


Produced by Meredith Larson (Meredith.Larson@ed.gov), a program officer for IES Postdoctoral Training grants, and Bennett Lunn (Bennett.Lunn@ed.gov), Truman-Albright Fellow for the National Center for Education Research and the National Center for Special Education Research.

NCES Releases New Edition of the Digest of Education Statistics

NCES recently released the 2020 edition of the Digest of Education Statistics, the 56th in a series of publications initiated in 1962. The Digest—which provides a centralized location for a wide range of statistical information covering early childhood through adult education—tells the story of American education through data. Digest tables are the foundation of many NCES reports, including the congressionally mandated Condition of Education, which contains key indicators that describe and visualize important developments and trends.

The Digest includes data tables from many sources, both government and private, and draws especially on the results of surveys and activities carried out by NCES. In addition, the Digest serves as one of the only NCES reports where data from across nearly 200 sources—including other statistical agencies like the Bureau of Labor Statistics and the Census Bureau—are compiled. The publication contains data on a variety of subjects in the field of education statistics, including the number of schools and colleges, teachers, enrollments, and graduates, in addition to data on educational attainment, finances, federal funds for education, libraries, and international comparisons. A helpful feature of the Digest is its ability to provide long-term trend data. Several tables include data that were collected more than 50—or even 100—years ago:

  • Poverty status of all persons, persons in families, and related children under age 18, by race/ethnicity: Selected years, 1960 through 2019 (table 102.50)
  • Percentage of the population 3 to 34 years old enrolled in school, by age group: Selected years, 1940 through 2019 (table 103.20)
  • Rates of high school completion and bachelor's degree attainment among persons age 25 and over, by race/ethnicity and sex: Selected years, 1910 through 2020 (table 104.10)
  • Historical summary of faculty, enrollment, degrees conferred, and finances in degree-granting postsecondary institutions: Selected years, 1869-70 through 2018-19 (table 301.20)
  • Federal support and estimated federal tax expenditures for education, by category: Selected fiscal years, 1965 through 2019 (table 401.10)

The Digest is organized into seven chapters: All Levels of Education, Elementary and Secondary Education, Postsecondary Education, Federal Funds for Education and Related Activities, Outcomes of Education, International Comparisons of Education, and Libraries and Use of Technology. Each chapter is divided into a number of topical subsections. The Digest also includes a Guide to Sources and a Definitions section to provide supplemental information to readers. To learn more about how the Digest is structured and how best to navigate it—including how to access the most current tables or tables from a specific year and how to search for key terms—check out the blog post “Tips for Navigating the Digest of Education Statistics.”

In addition to providing updated versions of many statistics that have appeared in previous years, this edition also includes several new tables, many of which highlight data related to the coronavirus pandemic:

  • Percentage of adults with children in the household who reported their child’s classes were moved to a distance learning format using online resources in selected periods during April through December 2020, by selected adult and household characteristics (table 218.80)
  • Percentage of adults with children in the household who reported that computers and internet access were always or usually available for educational purposes in their household in selected periods during April through December 2020, by selected adult and household characteristics (table 218.85)
  • Percentage of adults with children in the household who reported that computers or digital devices and internet access were provided by their child’s schools or districts in selected periods during April through December 2020, by selected adult and household characteristics (table 218.90)
  • Number of school shootings at public and private elementary and secondary schools between 2000-01 and 2019-20, by location and time period (table 228.14)
  • Percentage of adults who reported changes to household members’ fall postsecondary plans in August 2020, by level of postsecondary education planned and selected respondent characteristics (table 302.80)
  • Percentage of adults with at least one household member’s fall attendance plans cancelled who reported on reasons for changes in plans in August 2020, by level of postsecondary education planned and selected respondent characteristics (table 302.85)

Also new this year is the release of more than 200 machine-readable Digest tables, with more to come at a later date. These tables allow the data to be read in a standard format, making them easier for developers and researchers to use. To learn more about machine-readable tables, check out the blog post “Machine-Readable Tables for the Digest of Education Statistics.”
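As a simple example of how a machine-readable table might be used, the sketch below loads a downloaded table into pandas; the filename is a hypothetical placeholder for whichever table file you download from NCES.

```python
# Hypothetical sketch: load a downloaded machine-readable Digest table for analysis.
# "digest_table.xlsx" is a placeholder; substitute the file you actually download.
import pandas as pd

table = pd.read_excel("digest_table.xlsx")  # or pd.read_csv(...) if you download a CSV
print(table.head())                         # inspect the first rows and column names
```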

Learn more about the Digest in the Foreword to the publication and explore the tables in this edition.

 

By Megan Barnett, AIR

Research Roundup: NCES Celebrates Native American Heritage Month

Looking at data by race and ethnicity can provide a better understanding of education performance and outcomes than examining statistics that describe all students. In observance of Native American Heritage Month, this blog presents NCES findings on the learning experiences of American Indian/Alaska Native (AI/AN) students throughout their education careers.

Early Childhood Education

  • In 2019, 45 percent of AI/AN 3- to 4-year-olds and 83 percent of AI/AN 5-year-olds were enrolled in school.
     

K-12 Education

  • The 2019 National Indian Education Study (NIES) surveyed students, teachers, and school principals about the experiences of AI/AN students in 4th and 8th grades.
     
    • How much do AI/AN students know about their culture?
      • Most 4th-grade AI/AN students reported having at least “a little” knowledge of their AI/AN tribe or group, with 17 percent reporting knowing “nothing.” About 19 to 23 percent reported having “a lot” of cultural knowledge across school types. (For more information, see NIES 2019, p. 11.)
         
    • Where do AI/AN students learn about their culture?
      • Family members were identified as the people who taught students the most about AI/AN history, with 45 percent of 4th-grade students and 60 percent of 8th-grade students so reporting. Teachers were the second most commonly identified group of people important for educating students on AI/AN cultural topics. (For more information, see NIES 2019, p. 12.)
         
    • How do teachers contribute to AI/AN student cultural knowledge?
      • A majority of AI/AN students had teachers who integrated AI/AN culture or history into reading lessons: overall, 89 percent of 4th-grade students and 76 percent of 8th-grade students had teachers who reported using these concepts in reading lessons “at least once a year.” (For more information, see NIES 2019, p. 16.)
         
    • What are AI/AN student trends on assessments in mathematics and reading?
      • Nationally, mathematics scores for AI/AN students from 2015 to 2019 remained unchanged for 4th-graders and declined for 8th-graders. Most states saw no change. (For more information, see NIES 2019, p. 46.)
         
  • In 2019, 52 percent of AI/AN 4th-grade students had access to a computer at home. (For more information, see NIES 2019, p. 45.)
     
  • There were 505,000 AI/AN students enrolled in public schools in 1995, compared with 490,000 AI/AN students in fall 2018 (the most recent year of data available).
     
  • In fall 2018, less than half of AI/AN students (40 percent) attended schools where minority students comprised at least 75 percent of the student population.
     
  • There are approximately 45,000 American Indian/Alaska Native students served by approximately 180 Bureau of Indian Education (BIE) schools located on 64 reservations in 23 states.
     
  • In school year 2018–19, the adjusted cohort graduation rate (ACGR) was 74 percent for AI/AN public school students. The ACGRs for AI/AN students ranged from 51 percent in Minnesota to 94 percent in Alabama and were higher than the U.S. average in eight states (Texas, Virginia, Louisiana, Tennessee, Connecticut, New Jersey, Alabama, and Kentucky).
     
  • In 2020, 95 percent of 25- to 29-year-olds who were AI/AN had completed at least high school.

 

Postsecondary Education

  • In academic year 2018–19, 14 percent of bachelor’s degrees conferred to AI/AN graduates were in a STEM field.
     
  • About 41 percent of AI/AN students who began seeking a bachelor’s degree full-time at a 4-year institution in fall 2013 completed that degree at the same institution within 6 years.


By Mandy Dean, AIR