Inside IES Research

Notes from NCER & NCSER

Unexpected Value from Conducting Value-Added Analysis

This is the second of a two-part blog series from an IES-funded partnership project. The first part described how the process of cost-effectiveness analysis (CEA) provided useful information that led to changes in practice for a school nurse program and restorative practices at Jefferson County Public Schools (JCPS) in Louisville, KY. In this guest blog, the team discusses how the process of conducting value-added analysis provided useful program information over and above the information they obtained via CEA or academic return on investment (AROI).

Since we know you loved the last one, it’s time for another fun thought experiment! Imagine that you have just spent more than a year gathering, cleaning, assembling, and analyzing a dataset of school investments for what you hope will be an innovative approach to program evaluation. Now imagine the only thing your results tell you is that your proposed new application of value-added analysis (VAA) is not well-suited for these particular data. What would you do? Well, sit back and enjoy another round of schadenfreude at our expense. Once again, our team of practitioners from JCPS and researchers from Teachers College, Columbia University and American University found itself in a very unenviable position.

We had initially planned to use rigorous VAA (and CEA) to evaluate the validity of a practical measure of academic return on investment for improving school budget decisions on existing school- and district-level investments. Although the three methods—VAA, CEA, and AROI—vary in rigor and address slightly different research questions, we expected that their results would be both complementary and comparable for informing decisions to reinvest, discontinue, expand/contract, or make other implementation changes to an investment. To that end, we set out to test our hypothesis by comparing results from each method across a broad spectrum of investments. Fortunately, as with CEA, the process of conducting VAA provided additional, useful program information that we would not have otherwise obtained via CEA or AROI. This unexpected information, combined with what we’d learned about implementation from our CEAs, led to even more changes in practice at JCPS.
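For readers unfamiliar with the mechanics, value-added analysis typically regresses a post-test outcome on prior achievement (and other covariates) and treats a program's average residual—how far its students land above or below prediction—as its estimated effect. A minimal sketch of that idea in pure Python; the program names and scores below are invented for illustration, and a real VAA would include many more covariates:

```python
from collections import defaultdict

# Hypothetical records: (program, prior_score, post_score). Invented data.
records = [
    ("tutoring", 60, 68), ("tutoring", 72, 80), ("tutoring", 55, 61),
    ("mentoring", 64, 66), ("mentoring", 70, 73), ("mentoring", 58, 59),
]

# Fit post = a + b * prior by ordinary least squares (closed form, one predictor).
n = len(records)
mean_x = sum(r[1] for r in records) / n
mean_y = sum(r[2] for r in records) / n
b = sum((r[1] - mean_x) * (r[2] - mean_y) for r in records) / \
    sum((r[1] - mean_x) ** 2 for r in records)
a = mean_y - b * mean_x

# A program's "value added" is its students' average residual:
# actual post-score minus the score predicted from prior achievement.
residuals = defaultdict(list)
for program, prior, post in records:
    residuals[program].append(post - (a + b * prior))

value_added = {p: sum(rs) / len(rs) for p, rs in residuals.items()}
for program, va in sorted(value_added.items()):
    print(f"{program}: {va:+.2f}")
```

The sketch also hints at why data quality mattered so much for us: without accurate participant rosters linking students to programs, the grouping step at the end is impossible.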

Data Collection for VAA Unearthed Inadequate Record-keeping, Mission Drift, and More

Our AROI approach uses existing student and budget data from JCPS’s online Investment Tracking System (ITS) to compute comparative metrics for informing budget decisions. Budget request proposals submitted by JCPS administrators through ITS include information on target populations, goals, measures, and the budget cycle (1-5 years) needed to achieve the goals. For VAA, we needed similar, but more precise, data to estimate the relative effects of specific interventions on student outcomes, which required us to contact schools and district departments to gather the necessary information. Our colleagues provided us with sufficient data to conduct VAA. However, during this process, we discovered instances of missing or inadequate participant rosters; mission drift in how requested funds were actually spent; and mismatches between goals, activities, and budget cycles. We suspect that JCPS is not alone in this challenge, so we hope that what follows might be helpful to other districts facing similar scenarios.
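JCPS's actual AROI formula is not spelled out here, but the general idea—a practical, comparable ratio of outcome gain to dollars spent—can be sketched as follows. All figures, item names, and the specific scaling are invented assumptions for illustration:

```python
# Hypothetical end-of-cycle investment items: invented names and figures.
investments = [
    {"item": "reading_program", "cost": 120_000, "students": 300,
     "outcome_gain": 4.5},  # e.g., scale-score points gained per student
    {"item": "math_coach", "cost": 90_000, "students": 150,
     "outcome_gain": 2.0},
]

def aroi(item):
    """Outcome gain per $1,000 spent per participating student.

    A simple comparative metric in the spirit of AROI; the district's
    actual formula may differ.
    """
    cost_per_student = item["cost"] / item["students"]
    return item["outcome_gain"] / (cost_per_student / 1000)

# Rank investments from highest to lowest return for budget review.
for inv in sorted(investments, key=aroi, reverse=True):
    print(f'{inv["item"]}: {aroi(inv):.2f} points per $1k/student')
```

Note that every field in each record corresponds to something a budget request must capture—cost, roster counts, and a measured outcome—which is exactly why the missing rosters and mission drift described above were disqualifying.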

More Changes in Practice 

The lessons learned during the school nursing and restorative practice CEAs discussed in the first blog, and the data gaps identified through the VAA process, informed two key developments at JCPS. First, we formalized our existing end-of-cycle investment review process by including summary cards for each end-of-cycle investment item (each program or personnel position in which district funds were invested) indicating where insufficient data (for example, incomplete budget requests or unavailable participation rosters) precluded AROI calculations. We asked specific questions about missing data to elicit additional information and to encourage more diligent documentation in future budget requests. 

Second, we created the Investment Tracking System 2.0 (ITS 2.0), which now requires budget requesters to complete a basic logic model. The resources (inputs) and outcomes in the logic model are auto-populated from information entered earlier in the request process, but requesters must manually enter activities and progress monitoring (outputs). Our goal is to encourage and facilitate development of an explicit theory of change at the outset and continuous evidence-based adjustments throughout the implementation. Mandatory entry fields now prevent requesters from submitting incomplete budget requests. The new system was immediately put into action to track all school-level Elementary and Secondary School Emergency Relief (ESSER)-related budget requests.

Process and Partnership, Redux

Although we agree with the IES Director’s insistence that partnerships between researchers and practitioners should be a means to (eventually) improving student outcomes, our experience shows that change happens slowly in a large district. Yet, we have seen substantial changes as a direct result of our partnership. Perhaps the most important change is the drastic increase in the number of programs, investments, and other initiatives that will be evaluable as a result of formalizing the end-of-cycle review process and creating ITS 2.0. We firmly believe these changes could not have happened apart from our partnership and the freedom our funding afforded us to experiment with new approaches to addressing the challenges we face.   


Stephen M. Leach is a Program Analysis Coordinator at JCPS and PhD Candidate in Educational Psychology Measurement and Evaluation at the University of Louisville.

Dr. Robert Shand is an Assistant Professor at American University.

Dr. Bo Yan is a Research and Evaluation Specialist at JCPS.

Dr. Fiona Hollands is a Senior Researcher at Teachers College, Columbia University.

If you have any questions, please contact Corinne Alfeld (Corinne.Alfeld@ed.gov), IES-NCER Grant Program Officer.

 

Unexpected Benefits of Conducting Cost-Effectiveness Analysis

This is the first of a two-part guest blog series from an IES-funded partnership project between Teachers College, Columbia University, American University, and Jefferson County Public Schools in Kentucky. The purpose of the project is to explore academic return on investment (AROI) as a metric for improving decision-making around education programs that lead to improvements in student education outcomes. In this guest blog entry, the team showcases cost analysis as an integral part of education program evaluation.

Here’s a fun thought experiment (well, at least fun for researcher-types). Imagine you just discovered that two of your district partner’s firmly entrenched initiatives are not cost-effective. What would you do? 

Now, would your answer change if we told you that the findings came amidst a global pandemic and widespread social unrest over justice reform, and that those two key initiatives were a school nurse program and restorative practices? That’s the exact situation we faced last year in Jefferson County Public Schools (JCPS) in Louisville, KY. Fortunately, the process of conducting rigorous cost analyses of these programs unearthed critical evidence to help explain mostly null impact findings and inform very real changes in practice at JCPS.

Cost-Effectiveness Analysis Revealed Missing Program Components

Our team of researchers from Teachers College, Columbia University and American University, and practitioners from JCPS had originally planned to use cost-effectiveness analysis (CEA) to evaluate the validity of a practical measure of academic return on investment for improving school budget decisions. With the gracious support of JCPS program personnel in executing our CEAs, we obtained a treasure trove of additional quantitative and qualitative cost and implementation data, which proved to be invaluable.
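For context, a cost-effectiveness analysis pairs a program's estimated effect with its full ("ingredients") cost and compares programs on cost per unit of effect. A minimal sketch of that comparison; the costs and effect sizes below are invented placeholders, not our actual study estimates:

```python
# Hypothetical programs with invented per-student costs and effect sizes
# (in standard deviations of the outcome).
programs = {
    "school_nursing": {"cost_per_student": 250.0, "effect": 0.02},
    "restorative_practices": {"cost_per_student": 180.0, "effect": 0.05},
}

def ce_ratio(p):
    """Cost per one standard deviation of outcome improvement.

    Lower is better. The ratio blows up as the effect approaches zero,
    which is roughly the predicament a null impact finding creates.
    """
    effect = p["effect"]
    return float("inf") if abs(effect) < 1e-9 else p["cost_per_student"] / effect

# Rank from most to least cost-effective.
for name, p in sorted(programs.items(), key=lambda kv: ce_ratio(kv[1])):
    print(f"{name}: ${ce_ratio(p):,.0f} per SD of improvement")
```

The arithmetic is simple; the hard, and ultimately most informative, part of CEA is assembling the cost and implementation data behind each number.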

Specifically, for the district’s school nurse program, the lack of an explicit theory of change, of standardized evidence-based practices across schools, and of a monitoring plan was identified as a potential explanation for our null impact results. In one of our restorative practices cost interviews, we discovered that a key element of the program, restorative conferences, was not being implemented at all due to time constraints and staffing challenges, which may help explain the disappointing impact results.

Changes in Practice

In theory, our CEA findings indicated that JCPS should find more cost-effective alternatives to school nursing and restorative practices. In reality, however, both programs were greatly expanded: school nursing in response to COVID, and restorative practices because JCPS leadership has committed to moving away from traditional disciplinary practices. Our implementation findings nonetheless lead us to believe that key changes can improve student outcomes for both programs.

In response to recommendations from the team, JCPS is developing a training manual for new nurses, a logic model illustrating how specific nursing activities can lead to better outcomes, and a monitoring plan. For restorative practices, while we still have a ways to go, the JCPS team is continuing to work with program personnel to improve implementation.

One encouraging finding from our CEA was that, despite imperfect implementation, suspension rates for Black students were lower in schools that had implemented restorative practices for two years compared to Black students in schools implementing the program for one year. Our hope is that further research will identify the aspects of restorative practices most critical for equitably improving school discipline and climate.

Process and Partnership

Our experience highlights unexpected benefits that can result when researchers and practitioners collaborate on all aspects of cost-effectiveness analysis, from collecting data to applying findings to practice. In fact, we are convinced that the ongoing improvements discussed here would not have been possible apart from the synergistic nature of our partnership. While the JCPS team included seasoned evaluators and brought front-line knowledge of program implementation, information systems, data availability, and district priorities, our research partners brought additional research capacity, methodological expertise, and a critical outsider’s perspective.

Together, we discovered that the process of conducting cost-effectiveness analysis can provide valuable information normally associated with fidelity of implementation studies. Knowledge gained during the cost analysis process helped to explain our less-than-stellar impact results and led to key changes in practice. In the second blog of this series, we’ll share how the process of conducting CEA and value-added analysis led to changes in practice extending well beyond the specific programs we investigated.


Stephen M. Leach is a Program Analysis Coordinator at JCPS and PhD Candidate in Educational Psychology Measurement and Evaluation at the University of Louisville.

Dr. Fiona Hollands is a Senior Researcher at Teachers College, Columbia University.

Dr. Bo Yan is a Research and Evaluation Specialist at JCPS.

Dr. Robert Shand is an Assistant Professor at American University.

If you have any questions, please contact Corinne Alfeld (Corinne.Alfeld@ed.gov), IES-NCER Grant Program Officer.

Challenging Implicit Bias in Schools

School environments are places in which students, particularly students of color, are exposed to implicit bias and discrimination that can negatively impact their academic outcomes. In this interview blog, we asked prevention scientist Dr. Chynna McCall to discuss how her career journey and her experiences working with children and families from diverse populations inspired her research on creating equitable school environments.   

 

How did you begin your career journey as a prevention scientist?

Perhaps my most valued professional experience is serving as a licensed school psychologist in public schools in Colorado, working with children and families from racially, culturally, and linguistically diverse populations. This experience inspired me to join the Missouri Prevention Science Institute in 2018 as an Institute of Education Sciences postdoctoral fellow, where I studied how to use research to solve real-world problems. More specifically, I learned how to use prevention science to develop and evaluate evidence-based practices and interventions that prevent negative social and emotional impacts before they happen. After my fellowship, I was hired and promoted to a senior research associate position at the Missouri Prevention Science Institute. In this role, I have operational responsibilities for various federally funded grants and conduct my own grant-funded research. Presently, I am working on the development and testing of an equity-focused social-emotional learning curriculum for 3rd through 5th grade students.

What challenges did you observe as a school psychologist?

As a school psychologist, I worked in two vastly different school districts. In one, most students came from low-income families and spoke English as a second language, and the school's performance on standardized tests was significantly below average. Most of the challenges I tackled during my time there could be categorized as social-emotional; most students had unbalanced home lives, and many suffered emotional or physical trauma. Because the school district pressured teachers to improve test scores, the focus on behavior and classroom management shifted almost entirely toward academics. The unfortunate outcome was a failure to acknowledge the role that student behavior, and the root causes of those behaviors, plays in academic outcomes. The second district I worked for was a high-performing one with generally high socioeconomic status, but I chose to work at the school designated for children in the district with serious emotional disabilities.

Even though the two districts differ starkly, I consistently encountered the same needs: students needed to develop better relationships with their teachers, peers, and parents and a better sense of self, and teachers, other school personnel, students, and parents needed a better understanding of how their practices and interactions affect student social-emotional and academic outcomes.

How does your background as a school psychologist influence your research?

My experience as a school psychologist has reinforced my understanding of what is needed to improve public education and which research questions are of utmost importance. It also helped me define the goals of my research, which include 1) understanding the influence of prejudice and discrimination on student internal and external behaviors and outcomes, 2) understanding how school personnel's expressions of prejudice and discrimination influence student internal and external behaviors and outcomes, and 3) determining how to most effectively develop an equitable school environment that positively influences marginalized and minoritized youth outcomes.

My research examines how school environment—including the prejudicial and discriminatory thoughts and behaviors of school staff, students, and guardians—influences identity development, identity expression (for example, racial identity, gender identity, sexuality, and intersectionality), and internal and external behaviors. The objective is to use this knowledge to create a school environment that facilitates prosocial student identity development. My research draws on my observations and experiences as a practicing school psychologist to focus on how to shift the differential outcomes observed in public education that stem from experiences of discrimination both in and out of the school setting.

In your area of research, what do you see as the greatest research needs or recommendations to address diversity and equity and improve the relevance of education research for diverse communities of students and families?

I believe schools at every level of education are microcosms of the greater society. How students traverse the school system dictates how they will navigate the macrocosm of society. How students navigate the school system can be improved if school systems are equipped with tools that allow staff to better prepare students academically, socially, and emotionally. These tools are essential for students who are having a difficult time because of cultural, linguistic, psychological, or physical differences from their peers. It is crucial for the research community to continually advocate for positive change in our education system, work toward a better understanding of student needs, and develop effective and efficient tools that better promote student growth and outcomes.

I also believe that researchers who study school environments must explicitly study bias. We have to look at whether and how school professionals are becoming aware of and challenging their implicit biases, as well as how students are becoming aware of bias and how they deal with it—either by internalizing it or challenging it. We also must look into how challenging or accepting bias affects students emotionally, behaviorally, and academically.

What advice would you give to emerging scholars from underrepresented, minoritized groups who are pursuing a career in education research?

See your perspective and experience as assets. Your perspective is underrepresented and is needed in making necessary changes to education and education outcomes. When you view your perspective as something of value, you are better able to determine what unaddressed research questions need to be asked and to move education research in a direction that is more inclusive of every student.


This year, Inside IES Research is publishing a series of interviews showcasing a diverse group of IES-funded education researchers and fellows who are making significant contributions to education research, policy, and practice. As part of our Black History Month blog series, we are focusing on African American/Black researchers and fellows as well as researchers who focus on the education of Black students.

Dr. Chynna McCall is a Senior Research Associate with the Missouri Prevention Science Institute at the University of Missouri. Prior to this position, she was an IES postdoctoral fellow in the Missouri Interdisciplinary Postdoctoral Research and Training Program.

Produced by Corinne Alfeld (Corinne.Alfeld@ed.gov), postdoctoral training program officer, and Katina Stapleton (Katina.Stapleton@ed.gov), co-Chair of the IES Diversity and Inclusion Council and predoctoral training program officer.

 

Lessons Learned as the Virginia Education Science Training (VEST) Program Creates Pathways for Diverse Students into Education Science

Since 2004, the Institute of Education Sciences has funded predoctoral training programs to increase the number of well-trained PhD students who are prepared to conduct rigorous and relevant education research. In addition to providing training to doctoral students, all IES-funded predoctoral programs are encouraged to help broaden participation in the education sciences as part of their leadership activities. In this guest blog post, the leadership team of the University of Virginia predoctoral training program discusses their continuing efforts to create diverse pathways for students interested in education research.

In 2008, the IES-funded Virginia Education Science Training (VEST) Program began the Summer Undergraduate Research Program (SURP) with the goal of recruiting more students from traditionally marginalized groups into education science research. Each year, 8–10 students from around the United States traveled to the University of Virginia to receive faculty mentorship in independent research. Along the way, they had facilitated opportunities to develop new research skills and to reflect on their own identities as scholars and as students of color, first-generation college students, and/or students from low-income families. They became active members of research groups, visited IES program officers in Washington, DC, and presented their own research at the Leadership Alliance National Symposium.

Quite fortuitously, at an IES principal investigator meeting, we connected with the leadership of the IES-funded Research Institute for Scholars of Equity (RISE) program at North Carolina Central University (NCCU). As a result, for four years, we collaborated with RISE leadership to host two-day RISE fellow visits to UVA. During these visits, RISE fellows shared their projects and ideas with VEST fellows and faculty. The RISE and SURP fellows also mingled and attended workshops on graduate school admissions.

We had three goals for these efforts:

  • Provide IES pre-doctoral fellows with the opportunity to apply leadership skills to working with undergraduates
  • Increase the diversity of education scientists
  • Increase the diversity of our IES-sponsored PhD program

Enter COVID. In 2020, bringing students to UVA for the summer wasn’t feasible or wise. Instead, we reflected on our past successful experiences with NCCU and realized we could improve the quality of student experiences if we also worked closely with faculty at other universities. To start, we engaged with Virginia State University (VSU) and Norfolk State University (NSU), two Virginia HBCUs, to create the Open Doors Program.

Initially, eight faculty and administrators from NSU and VSU met with the UVA team, which included a post-doctoral fellow and a PhD student who coordinated discussions, helped design the curriculum, and built an Open Doors handbook. The design team built a program in which 12 rising juniors at NSU and VSU would:

  • Engage in the research and writing process that will lead to a research product and presentation that reflects their strengths, interests, and goals
  • Gain a deeper understanding of the opportunities available to them in graduate school
  • Have the opportunity to examine the complexities and multiple layers of their intersectional identities, identify assets and cultural wealth, and identify academic strengths and areas of growth
  • Build relationships with faculty and graduate student mentors

Due to the pandemic, the program was offered virtually over four weeks with a combination of seminars and mentoring sessions. The program exceeded our expectations. The students all indicated that Open Doors was a useful learning experience for them and provided them with a better understanding of the opportunities available in graduate school. The faculty valued the opportunity to work with each other. We will be offering Open Doors 2.0 next June with another cohort of 12 students from NSU and VSU. We learned a lot from our first year and have planned several modifications to the program. For example, this year, we anticipate that students and some NSU and VSU faculty will be on campus at UVA for two of the four weeks; the other two weeks will be virtual.

These efforts have been true learning experiences for UVA faculty and VEST fellows. We have several recommendations for other programs eager to create pathways programs.

  • Clarify your goals and organize the program around the key outcomes that you are trying to achieve. For SURP and Open Doors, we focused on four outcomes: preparation to conduct education research, preparation for graduate school, expansion of networks, and access to new mentoring relationships.
  • Teach skills as well as knowledge. Our evaluation of SURP points to the importance of teaching skills so students can formulate research questions, recognize research designs, analyze and interpret data, and write about research. Students reported gaining skills in these areas which are critical to success in graduate school in education research.
  • Identify ways to enhance cultural capital. Students benefit from knowledge, familiarity, and comfort with university life. In Open Doors, we wanted to build an authentic collaboration that allowed faculty, graduate students, and undergraduate students at the HBCUs and UVA to learn from each other, extending the cultural capital of all participants.

Our efforts have been exciting yet humbling. Above all, we enjoy listening to and learning from the SURP and Open Doors students. In Open Doors, we also enjoyed building relationships with faculty at other institutions. We have become increasingly aware of the challenges we face in efforts to increase the diversity of our programs. Recruitment is just a first step. Creating graduate school experiences that are conducive to learning and engagement for students from diverse groups is an important second step. And a third critical step is to transform life at our universities so that students (and faculty) from traditionally marginalized groups can thrive and flourish. In doing so, we expect that universities will be better able to meet the full range of new challenges that lie ahead in education science.

 


Sara Rimm-Kaufman is the Commonwealth Professor of Education in the Educational Psychology-Applied Developmental Science program at the University of Virginia School of Education and Human Development.

Jim Wyckoff is the Memorial Professor of Education and Public Policy in the Education Policy program and directs the Center on Education Policy and Workforce Competitiveness at the University of Virginia.

Jamie Inlow is the Coordinator for the VEST Predoctoral Training Program in the University of Virginia School of Education and Human Development.

This blog post is part of an ongoing series featuring IES training programs as well as our blog series on diversity, equity, inclusion, and accessibility (DEIA) within IES grant programs.

Produced by Katina Stapleton (Katina.Stapleton@ed.gov), co-Chair of the IES Diversity and Inclusion Council and predoctoral training program officer.

DE21: A Researcher-Practitioner-Policymaker Conference on Dual Enrollment

Dual enrollment improves students’ college-going and postsecondary success, but practitioners need help in understanding the impact of dual enrollment and in learning strategies associated with effective and equitable implementation. Under the auspices of the IES-funded Evaluation of Career and College Promise (CCP) project, the North Carolina Community College System suggested hosting a conference to build knowledge and capacity in the field about dual enrollment. The Evaluation of CCP is a partnership among the SERVE Center at the University of North Carolina at Greensboro, the North Carolina Department of Public Instruction, the North Carolina Community College System, and the RAND Corporation. In addition to the research goals—which involve looking at the implementation, impact, and cost of North Carolina’s dual enrollment program—the project also has a goal of capacity development for the agencies and for practitioners. As part of meeting this last goal, the project recently hosted a conference, Dual Enrollment: Accelerating Educational Attainment (DE21), with over 1,000 registrants from North Carolina and around the country.

Julie Edmunds, the project’s principal investigator, discusses the DE21 conference.

Why host a conference on dual enrollment?

This was the brainchild of our partners at the North Carolina Community College System. They wanted to create an opportunity for researchers and practitioners to gather and share lessons learned from their respective work. The NC Community College System expected that we would be learning a lot from our project that we would want to share; they also knew that the people in the trenches had many valuable insights to help bridge the gap between research and practice. Because existing research shows that not all groups of students have the same access to dual enrollment, the project team decided collectively that the conference should have a strong focus on equity and should serve as a way to communicate and discuss strategies to support equity.

What happened at the conference?

We had a total of 40 sessions across two full days. There were dynamic keynote speakers, including Karen Stout from Achieving the Dream, and panels that discussed dual enrollment from the policy, research, student and parent perspectives. Although there was a strong North Carolina focus, there were sessions from other states such as Massachusetts, Texas, Indiana, and Ohio.

Conference presentations were organized into five themes: expanding access and equity, fostering college attainment, ensuring a successful transition to college and careers, preparing students for dual enrollment, and supporting success in dual enrollment courses.

The CCP study team presented findings from our evaluation of North Carolina’s dual enrollment pathways. We looked at individual and school-level factors associated with dual enrollment participation, such as student demographics, school size, locale, percentage of students from underrepresented minority groups, academic achievement, and students’ workforce orientation. Student socioeconomic level did not affect participation in dual enrollment. We also presented preliminary impacts of North Carolina’s three dual enrollment pathways (college transfer, Career and Technical Education, and Cooperative Innovative High Schools or early colleges). Results showed that CCP participants had better high school outcomes, such as higher high school graduation rates, and were more likely to enroll in postsecondary education. In addition, there were multiple sessions sharing research results from other states.

There were many presentations from practitioners that focused on topics like rigorous instruction, advising, participation of students with disabilities, creating strong secondary-postsecondary partnerships, using high school teachers as college instructors, among others. I need to give a huge shoutout to Katie Bao from the NC Community College System, who shepherded us all through the conference planning and implementation process.

What was the impact of the pandemic?

When we originally planned for the conference, we thought it would be in person. After the pandemic hit, we decided (as many other organizations did) to host it virtually. This made the conference much more accessible to a national audience, and we had participants and presenters from around the country.

What if someone missed the conference?

Another benefit of a virtual conference is that we are able to share all the sessions from the meeting. Please visit our site on YouTube to watch the conference sessions.

What comes next?

Our study work continues, and we will share the results in a variety of ways, including through briefs and journal articles. We are also planning to host a second conference in 2023 and expect that it will have a virtual component so that it can continue to be available to a national audience.


Dr. Julie Edmunds is a Program Director at the SERVE Center at the University of North Carolina at Greensboro. In addition to being the PI on the Evaluation of Career and College Promise, she is one of the leading researchers on early college, a model that combines high school and college.