IES Blog

Institute of Education Sciences

Unexpected Value from Conducting Value-Added Analysis

This is the second of a two-part blog series from an IES-funded partnership project. The first part described how the process of cost-effectiveness analysis (CEA) provided useful information that led to changes in practice for a school nurse program and restorative practices at Jefferson County Public Schools (JCPS) in Louisville, KY. In this guest blog, the team discusses how the process of conducting value-added analysis provided useful program information over and above the information they obtained via CEA or academic return on investment (AROI).

Since we know you loved the last one, it’s time for another fun thought experiment! Imagine that you have just spent more than a year gathering, cleaning, assembling, and analyzing a dataset of school investments for what you hope will be an innovative approach to program evaluation. Now imagine the only thing your results tell you is that your proposed new application of value-added analysis (VAA) is not well-suited for these particular data. What would you do? Well, sit back and enjoy another round of schadenfreude at our expense. Once again, our team of practitioners from JCPS and researchers from Teachers College, Columbia University and American University found itself in a very unenviable position.

We had initially planned to use the rigorous VAA (and CEA) to evaluate the validity of a practical measure of academic return on investment for improving school budget decisions on existing school- and district-level investments. Although the three methods—VAA, CEA, and AROI—vary in rigor and address slightly different research questions, we expected that their results would be both complementary and comparable for informing decisions to reinvest, discontinue, expand/contract, or make other implementation changes to an investment. To that end, we set out to test our hypothesis by comparing results from each method across a broad spectrum of investments. Fortunately, as with CEA, the process of conducting VAA provided additional, useful program information that we would not have otherwise obtained via CEA or AROI. This unexpected information, combined with what we’d learned about implementation from our CEAs, led to even more changes in practice at JCPS.

Data Collection for VAA Unearthed Inadequate Record-keeping, Mission Drift, and More

Our AROI approach uses existing student and budget data from JCPS’s online Investment Tracking System (ITS) to compute comparative metrics for informing budget decisions. Budget request proposals submitted by JCPS administrators through ITS include information on target populations, goals, measures, and the budget cycle (1-5 years) needed to achieve the goals. For VAA, we needed similar, but more precise, data to estimate the relative effects of specific interventions on student outcomes, which required us to contact schools and district departments to gather the necessary information. Our colleagues provided us with sufficient data to conduct VAA. However, during this process, we discovered instances of missing or inadequate participant rosters; mission drift in how requested funds were actually spent; and mismatches between goals, activities, and budget cycles. We suspect that JCPS is not alone in this challenge, so we hope that what follows might be helpful to other districts facing similar scenarios.
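For readers unfamiliar with the method, one common form of value-added analysis regresses current outcomes on prior achievement plus a program-participation indicator, so that the indicator's coefficient estimates the program's effect conditional on where students started. Here is a minimal sketch using synthetic data (the variables and numbers are illustrative, not JCPS data):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic example: prior-year scores, a program-participation indicator,
# and current-year scores built with a known "true" program effect of 5 points.
n = 500
prior = rng.normal(70, 10, n)       # prior-year test scores
program = rng.integers(0, 2, n)     # 1 = participated in the program
current = 0.8 * prior + 5.0 * program + rng.normal(0, 4, n)

# Value-added-style estimate: regress current scores on prior scores and
# the program indicator. The indicator's coefficient is the estimated
# program effect, conditional on prior achievement.
X = np.column_stack([np.ones(n), prior, program])
coef, *_ = np.linalg.lstsq(X, current, rcond=None)
intercept, prior_coef, program_effect = coef
print(f"Estimated program effect: {program_effect:.1f} points")
```

In practice, value-added models add many more covariates and use multilevel structure, but the sketch shows why the method needs precise participant rosters: without knowing who received the intervention, the indicator column cannot be constructed.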

More Changes in Practice 

The lessons learned during the school nursing and restorative practice CEAs discussed in the first blog, and the data gaps identified through the VAA process, informed two key developments at JCPS. First, we formalized our existing end-of-cycle investment review process by including summary cards for each end-of-cycle investment item (each program or personnel position in which district funds were invested) indicating where insufficient data (for example, incomplete budget requests or unavailable participation rosters) precluded AROI calculations. We asked specific questions about missing data to elicit additional information and to encourage more diligent documentation in future budget requests. 

Second, we created the Investment Tracking System 2.0 (ITS 2.0), which now requires budget requesters to complete a basic logic model. The resources (inputs) and outcomes in the logic model are auto-populated from information entered earlier in the request process, but requesters must manually enter activities and progress monitoring (outputs). Our goal is to encourage and facilitate development of an explicit theory of change at the outset and continuous evidence-based adjustments throughout the implementation. Mandatory entry fields now prevent requesters from submitting incomplete budget requests. The new system was immediately put into action to track all school-level Elementary and Secondary School Emergency Relief (ESSER)-related budget requests.
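The mandatory-field requirement described above can be sketched in a few lines (the field names and validation logic here are hypothetical illustrations, not ITS 2.0's actual schema): a budget request is submittable only when every logic-model component is present.

```python
# Hypothetical logic-model components a budget request must include.
REQUIRED_FIELDS = ("inputs", "activities", "outputs", "outcomes")

def missing_fields(request: dict) -> list:
    """Return the logic-model fields that are absent or left blank."""
    return [f for f in REQUIRED_FIELDS if not request.get(f)]

draft = {
    "inputs": "2.0 FTE school nurses",   # auto-populated from the request
    "outcomes": "Reduced absenteeism",   # auto-populated from the request
    "activities": "",                    # requester has not entered these yet
}
print(missing_fields(draft))  # ['activities', 'outputs']
```

A request like `draft` would be blocked from submission until the requester supplies the missing activities and progress-monitoring outputs.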

Process and Partnership, Redux

Although we agree with the IES Director’s insistence that partnerships between researchers and practitioners should be a means to (eventually) improving student outcomes, our experience shows that change happens slowly in a large district. Yet, we have seen substantial changes as a direct result of our partnership. Perhaps the most important change is the drastic increase in the number of programs, investments, and other initiatives that will be evaluable as a result of formalizing the end-of-cycle review process and creating ITS 2.0. We firmly believe these changes could not have happened apart from our partnership and the freedom our funding afforded us to experiment with new approaches to addressing the challenges we face.   


Stephen M. Leach is a Program Analysis Coordinator at JCPS and PhD Candidate in Educational Psychology Measurement and Evaluation at the University of Louisville.

Dr. Robert Shand is an Assistant Professor at American University.

Dr. Bo Yan is a Research and Evaluation Specialist at JCPS.

Dr. Fiona Hollands is a Senior Researcher at Teachers College, Columbia University.

If you have any questions, please contact Corinne Alfeld (Corinne.Alfeld@ed.gov), IES-NCER Grant Program Officer.

 

NCES Releases New Edition of the Digest of Education Statistics

NCES recently released the 2020 edition of the Digest of Education Statistics, the 56th in a series of publications initiated in 1962. The Digest—which provides a centralized location for a wide range of statistical information covering early childhood through adult education—tells the story of American education through data. Digest tables are the foundation of many NCES reports, including the congressionally mandated Condition of Education, which contains key indicators that describe and visualize important developments and trends.

The Digest includes data tables from many sources, both government and private, and draws especially on the results of surveys and activities carried out by NCES. In addition, the Digest serves as one of the only NCES reports where data from across nearly 200 sources—including other statistical agencies like the Bureau of Labor Statistics and the Census Bureau—are compiled. The publication contains data on a variety of subjects in the field of education statistics, including the number of schools and colleges, teachers, enrollments, and graduates, in addition to data on educational attainment, finances, federal funds for education, libraries, and international comparisons. A helpful feature of the Digest is its ability to provide long-term trend data. Several tables include data that were collected more than 50—or even 100—years ago:

  • Poverty status of all persons, persons in families, and related children under age 18, by race/ethnicity: Selected years, 1960 through 2019 (table 102.50)
  • Percentage of the population 3 to 34 years old enrolled in school, by age group: Selected years, 1940 through 2019 (table 103.20)
  • Rates of high school completion and bachelor's degree attainment among persons age 25 and over, by race/ethnicity and sex: Selected years, 1910 through 2020 (table 104.10)
  • Historical summary of faculty, enrollment, degrees conferred, and finances in degree-granting postsecondary institutions: Selected years, 1869-70 through 2018-19 (table 301.20)
  • Federal support and estimated federal tax expenditures for education, by category: Selected fiscal years, 1965 through 2019 (table 401.10)

The Digest is organized into seven chapters: All Levels of Education, Elementary and Secondary Education, Postsecondary Education, Federal Funds for Education and Related Activities, Outcomes of Education, International Comparisons of Education, and Libraries and Use of Technology. Each chapter is divided into a number of topical subsections. The Digest also includes a Guide to Sources and a Definitions section to provide supplemental information to readers. To learn more about how the Digest is structured and how best to navigate it—including how to access the most current tables or tables from a specific year and how to search for key terms—check out the blog post “Tips for Navigating the Digest of Education Statistics.”

In addition to providing updated versions of many statistics that have appeared in previous years, this edition also includes several new tables, many of which highlight data related to the coronavirus pandemic:

  • Percentage of adults with children in the household who reported their child’s classes were moved to a distance learning format using online resources in selected periods during April through December 2020, by selected adult and household characteristics (table 218.80)
  • Percentage of adults with children in the household who reported that computers and internet access were always or usually available for educational purposes in their household in selected periods during April through December 2020, by selected adult and household characteristics (table 218.85)
  • Percentage of adults with children in the household who reported that computers or digital devices and internet access were provided by their child’s schools or districts in selected periods during April through December 2020, by selected adult and household characteristics (table 218.90)
  • Number of school shootings at public and private elementary and secondary schools between 2000-01 and 2019-20, by location and time period (table 228.14)
  • Percentage of adults who reported changes to household members’ fall postsecondary plans in August 2020, by level of postsecondary education planned and selected respondent characteristics (table 302.80)
  • Percentage of adults with at least one household member’s fall attendance plans cancelled who reported on reasons for changes in plans in August 2020, by level of postsecondary education planned and selected respondent characteristics (table 302.85)

Also new this year is the release of more than 200 machine-readable Digest tables, with more to come at a later date. These tables allow the data to be read in a standard format, making them easier for developers and researchers to use. To learn more about machine-readable tables, check out the blog post “Machine-Readable Tables for the Digest of Education Statistics.”
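As a rough illustration of why machine-readable tables matter: a long-format table can be loaded and queried with a few lines of standard tooling, with no manual cleanup of presentation formatting. The column names and values below are placeholders, not the Digest's actual schema or data.

```python
import csv
import io

# Hypothetical fragment shaped like a machine-readable table: one header
# row, then one observation per row.
raw = """table_id,row_label,year,value
103.20,Percent enrolled ages 5 to 6,1990,96.5
103.20,Percent enrolled ages 5 to 6,2019,94.9
"""

rows = list(csv.DictReader(io.StringIO(raw)))
by_year = {int(r["year"]): float(r["value"]) for r in rows}
print(by_year[2019])  # 94.9
```

Because each row is a self-describing observation, trend comparisons across years reduce to dictionary lookups rather than scraping values out of a formatted layout.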

Learn more about the Digest in the Foreword to the publication and explore the tables in this edition.

 

By Megan Barnett, AIR

Research Roundup: NCES Celebrates Black History Month

Looking at data by race and ethnicity can provide a better understanding of education performance and outcomes than examining statistics that describe all students. In observance of Black History Month, this blog presents NCES findings on the learning experiences of Black students throughout their education careers as well as the characteristics of Black teachers and faculty.

K–12 Education

  • Students
    • Of the 49.4 million students enrolled in public preK–12 schools in fall 2020, 7.4 million were Black students.
    • In 2019–2020, some 9 percent of private school students were Black non-Hispanic.
    • In 2019, some 51 percent of Black 8th-grade students were in a school that reported offering a programming class. Eighty-four percent of Black 8th-grade students were in a school that offered algebra classes that were equivalent to high school algebra classes.
  • Teachers
    • In 2017–18, about 7 percent of all public school teachers self-identified as Black, compared with 3 percent of all private school teachers.
    • Twelve percent of all female career or technical education (CTE) public school teachers were Black women in 2017–18.
    • In 2017–18, about half of Black or African American teachers (51 percent) taught in city schools, compared with 31 percent of all teachers.
    • Black or African American teachers had a higher rate of post-master’s degree education (13 percent) than did all teachers (9 percent) in 2017–18.
    • In 2017–18, about two-thirds (66 percent) of Black or African American teachers taught in the South, compared with 39 percent of all teachers.

 

Postsecondary Education

  • Students
    • Female enrollment at Historically Black Colleges and Universities (HBCUs) has been higher than male enrollment in every year since 1976.
    • In fall 2019, nearly 2.5 million Black students were enrolled in a degree-granting postsecondary institution, compared with the 1.0 million who were enrolled in fall 1976.
    • In 2019–20, postsecondary institutions awarded 55,642 STEM degrees/certificates to Black students.
  • Faculty and Institutions
    • In fall 2019, there were 27,323 full-time Black female faculty members at degree-granting postsecondary institutions, compared with 19,874 Black male faculty members.
    • In fall 2020, there were 101 degree-granting HBCUs located in the 50 states, D.C., and the U.S. Virgin Islands—52 public institutions and 49 private nonprofit institutions.

By Kyle Argueta, AIR

 

Unexpected Benefits of Conducting Cost-Effectiveness Analysis

This is the first of a two-part guest blog series from an IES-funded partnership project between Teachers College, Columbia University, American University, and Jefferson County Public Schools in Kentucky. The purpose of the project is to explore academic return on investment (AROI) as a metric for improving decision-making around education programs that lead to improvements in student education outcomes. In this guest blog entry, the team showcases cost analysis as an integral part of education program evaluation.

Here’s a fun thought experiment (well, at least fun for researcher-types). Imagine you just discovered that two of your district partner’s firmly entrenched initiatives are not cost-effective. What would you do? 

Now, would your answer change if we told you that the findings came amidst a global pandemic and widespread social unrest over justice reform, and that those two key initiatives were a school nurse program and restorative practices? That’s the exact situation we faced last year in Jefferson County Public Schools (JCPS) in Louisville, KY. Fortunately, the process of conducting rigorous cost analyses of these programs unearthed critical evidence to help explain mostly null impact findings and inform very real changes in practice at JCPS.

Cost-Effectiveness Analysis Revealed Missing Program Components

Our team of researchers from Teachers College, Columbia University and American University, and practitioners from JCPS had originally planned to use cost-effectiveness analysis (CEA) to evaluate the validity of a practical measure of academic return on investment for improving school budget decisions. With the gracious support of JCPS program personnel in executing our CEAs, we obtained a treasure trove of additional quantitative and qualitative cost and implementation data, which proved to be invaluable.
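At its core, CEA compares programs on cost per unit of effect, so that a lower ratio means more impact per dollar. A minimal sketch of the computation follows; all figures are hypothetical illustrations, not JCPS program data.

```python
# Illustrative cost-effectiveness comparison (all figures hypothetical).
# The cost-effectiveness ratio is cost per unit of effect: lower is better.

def cost_effectiveness_ratio(total_cost, effect_per_student, n_students):
    """Cost per student per unit of effect (e.g., per SD of test-score gain)."""
    return (total_cost / n_students) / effect_per_student

# Hypothetical program A: $300,000 serving 500 students, 0.10 SD average gain
ratio_a = cost_effectiveness_ratio(300_000, 0.10, 500)
# Hypothetical program B: $120,000 serving 300 students, 0.05 SD average gain
ratio_b = cost_effectiveness_ratio(120_000, 0.05, 300)

print(f"Program A: ${ratio_a:,.0f} per student per SD of gain")
print(f"Program B: ${ratio_b:,.0f} per student per SD of gain")
```

In this sketch, program B has a smaller budget but is the less cost-effective choice; the hard part of a real CEA is not the arithmetic but gathering credible cost and effect estimates, which is where the implementation insights described below emerged.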

Specifically, for the district’s school nurse program, the lack of an explicit theory of change, of standardized evidence-based practices across schools, and of a monitoring plan was identified as a potential explanation for our null impact results. In one of our restorative practices cost interviews, we discovered that a key element of the program, restorative conferences, was not being implemented at all due to time constraints and staffing challenges, which may help explain the disappointing impact results.

Changes in Practice

Taken at face value, our CEA findings indicated that JCPS should find more cost-effective alternatives to school nursing and restorative practices. In reality, both programs were greatly expanded: school nursing in response to COVID-19, and restorative practices because JCPS leadership has committed to moving away from traditional disciplinary practices. Our implementation findings, however, lead us to believe that key changes can improve student outcomes for both programs.

In response to recommendations from the team, JCPS is developing a training manual for new nurses, a logic model illustrating how specific nursing activities can lead to better outcomes, and a monitoring plan. For restorative practices, while we still have a ways to go, the JCPS team is continuing to work with program personnel to improve implementation.

One encouraging finding from our CEA was that, despite imperfect implementation, suspension rates for Black students were lower in schools that had implemented restorative practices for two years than in schools that had implemented the program for only one year. Our hope is that further research will identify the aspects of restorative practices most critical for equitably improving school discipline and climate.

Process and Partnership

Our experience highlights unexpected benefits that can result when researchers and practitioners collaborate on all aspects of cost-effectiveness analysis, from collecting data to applying findings to practice. In fact, we are convinced that the ongoing improvements discussed here would not have been possible apart from the synergistic nature of our partnership. While the JCPS team included seasoned evaluators and brought front-line knowledge of program implementation, information systems, data availability, and district priorities, our research partners brought additional research capacity, methodological expertise, and a critical outsider’s perspective.

Together, we discovered that the process of conducting cost-effectiveness analysis can provide valuable information normally associated with fidelity of implementation studies. Knowledge gained during the cost analysis process helped to explain our less-than-stellar impact results and led to key changes in practice. In the second blog of this series, we’ll share how the process of conducting CEA and value-added analysis led to changes in practice extending well beyond the specific programs we investigated.


Stephen M. Leach is a Program Analysis Coordinator at JCPS and PhD Candidate in Educational Psychology Measurement and Evaluation at the University of Louisville.

Dr. Fiona Hollands is a Senior Researcher at Teachers College, Columbia University.

Dr. Bo Yan is a Research and Evaluation Specialist at JCPS.

Dr. Robert Shand is an Assistant Professor at American University.

If you have any questions, please contact Corinne Alfeld (Corinne.Alfeld@ed.gov), IES-NCER Grant Program Officer.

Challenging Implicit Bias in Schools

School environments are places in which students, particularly students of color, are exposed to implicit bias and discrimination that can negatively impact their academic outcomes. In this interview blog, we asked prevention scientist Dr. Chynna McCall to discuss how her career journey and her experiences working with children and families from diverse populations inspired her research on creating equitable school environments.   

 

How did you begin your career journey as a prevention scientist?

Perhaps my most valued professional experience is serving as a licensed school psychologist in public schools in Colorado, working with children and families from racially, culturally, and linguistically diverse populations. This experience inspired me to join the Missouri Prevention Science Institute in 2018 as an Institute of Education Sciences postdoctoral fellow, where I studied how to use research to solve real-world problems. More specifically, I learned how to use prevention science to develop and evaluate evidence-based practices and interventions that prevent negative social and emotional impacts before they happen. After my fellowship, I was hired and promoted to a senior research associate position at the Missouri Prevention Science Institute. In this role, I have operational responsibilities for various federally funded grants and conduct my own grant-funded research. Presently, I am working on the development and testing of an equity-focused social-emotional learning curriculum for 3rd through 5th grade students.

What challenges did you observe as a school psychologist?

As a school psychologist, I worked in two vastly different school districts. In one, most students came from low-income families and spoke English as a second language, and the school's performance on standardized tests was significantly below average. Most of the challenges I tackled during my time there could be categorized as social-emotional: most students had unstable home lives, and many had suffered emotional or physical trauma. Because the school district pressured teachers to improve test scores, the focus on behavior and classroom management shifted almost entirely toward academics. The unfortunate outcome was a failure to acknowledge the role that student behavior, and the root causes of that behavior, plays in academic outcomes. The second district I worked for was high-performing and generally high in socioeconomic status, but I chose to work at the school designated for children in the district with serious emotional disabilities.

Even though the two districts differed starkly, I consistently encountered the same needs: for students to develop better relationships with their teachers, peers, and parents and a stronger sense of self, and for teachers, other school personnel, students, and parents to better understand how their practices and interactions affect students' social-emotional and academic outcomes.

How does your background as a school psychologist influence your research?

My experience as a school psychologist has reinforced my understanding of what is needed to improve public education and which research questions matter most. That time also helped me define the goals of my research, which include 1) understanding the influence of prejudice and discrimination on students' internal and external behaviors and outcomes, 2) understanding how school personnel's expressions of prejudice and discrimination influence those behaviors and outcomes, and 3) determining how to most effectively develop an equitable school environment that positively influences outcomes for marginalized and minoritized youth.

My research examines how school environment—including the prejudicial and discriminatory thoughts and behaviors of school staff, students, and guardians—influences identity development, identity expression (for example, racial identity, gender identity, sexuality, and intersectionality), and internal and external behaviors. The objective is to use this knowledge to create school environments that facilitate prosocial student identity development. My research draws on my observations and experiences as a practicing school psychologist to focus on how to shift the differential outcomes observed in public education that stem from experiences of discrimination both in and out of the school setting.

In your area of research, what do you see as the greatest research needs or recommendations to address diversity and equity and improve the relevance of education research for diverse communities of students and families?

I believe schools at every level of education are microcosms of the greater society. How students traverse the school system dictates how they will navigate the macrocosm of society. How students navigate the school system can be improved if school systems are equipped with tools that allow staff to better prepare students academically, socially, and emotionally. These tools are essential for students who are having a difficult time because of cultural, linguistic, psychological, or physical differences from their peers. It is crucial for the research community to continually advocate for positive change in our education system, work toward better understanding student needs, and develop effective and efficient tools that better promote student growth and outcomes.

I also believe that researchers who study school environments must explicitly study bias. We have to look at whether and how school professionals are becoming aware of and challenging their implicit biases, as well as how students are becoming aware of bias and how they deal with it—either by internalizing it or challenging it. We also must look into how challenging or accepting bias affects students emotionally, behaviorally, and academically.

What advice would you give to emerging scholars from underrepresented and minoritized groups who are pursuing a career in education research?

See your perspective and experience as assets. Your perspective is underrepresented and is needed in making necessary changes to education and education outcomes. When you view your perspective as something of value, you are better able to determine what unaddressed research questions need to be asked and to move education research in a direction that is more inclusive of every student.


This year, Inside IES Research is publishing a series of interviews (see here, here, and here) showcasing a diverse group of IES-funded education researchers and fellows who are making significant contributions to education research, policy, and practice. As part of our Black History Month blog series, we are focusing on African American/Black researchers and fellows as well as researchers who focus on the education of Black students.

Dr. Chynna McCall is a Senior Research Associate with the Missouri Prevention Science Institute at the University of Missouri. Prior to this position, she was an IES postdoctoral fellow in the Missouri Interdisciplinary Postdoctoral Research and Training Program.

Produced by Corinne Alfeld (Corinne.Alfeld@ed.gov), postdoctoral training program officer, and Katina Stapleton (Katina.Stapleton@ed.gov), co-Chair of the IES Diversity and Inclusion Council and predoctoral training program officer.