Inside IES Research

Notes from NCER & NCSER

Unexpected Value from Conducting Value-Added Analysis

This is the second of a two-part blog series from an IES-funded partnership project. The first part described how the process of cost-effectiveness analysis (CEA) provided useful information that led to changes in practice for a school nurse program and restorative practices at Jefferson County Public Schools (JCPS) in Louisville, KY. In this guest blog, the team discusses how the process of conducting value-added analysis provided useful program information over and above the information they obtained via CEA or academic return on investment (AROI).

Since we know you loved the last one, it’s time for another fun thought experiment! Imagine that you have just spent more than a year gathering, cleaning, assembling, and analyzing a dataset of school investments for what you hope will be an innovative approach to program evaluation. Now imagine the only thing your results tell you is that your proposed new application of value-added analysis (VAA) is not well-suited for these particular data. What would you do? Well, sit back and enjoy another round of schadenfreude at our expense. Once again, our team of practitioners from JCPS and researchers from Teachers College, Columbia University and American University found itself in a very unenviable position.

We had initially planned to use the more rigorous VAA (and CEA) to evaluate the validity of AROI, a practical measure for improving school budget decisions on existing school- and district-level investments. Although the three methods (VAA, CEA, and AROI) vary in rigor and address slightly different research questions, we expected that their results would be both complementary and comparable for informing decisions to reinvest, discontinue, expand/contract, or make other implementation changes to an investment. To that end, we set out to test our hypothesis by comparing results from each method across a broad spectrum of investments. Fortunately, as with CEA, the process of conducting VAA provided additional, useful program information that we would not have otherwise obtained via CEA or AROI. This unexpected information, combined with what we’d learned about implementation from our CEAs, led to even more changes in practice at JCPS.

Data Collection for VAA Unearthed Inadequate Record-keeping, Mission Drift, and More

Our AROI approach uses existing student and budget data from JCPS’s online Investment Tracking System (ITS) to compute comparative metrics for informing budget decisions. Budget request proposals submitted by JCPS administrators through ITS include information on target populations, goals, measures, and the budget cycle (1-5 years) needed to achieve the goals. For VAA, we needed similar, but more precise, data to estimate the relative effects of specific interventions on student outcomes, which required us to contact schools and district departments to gather the necessary information. Our colleagues provided us with sufficient data to conduct VAA. However, during this process, we discovered instances of missing or inadequate participant rosters; mission drift in how requested funds were actually spent; and mismatches between goals, activities, and budget cycles. We suspect that JCPS is not alone in this challenge, so we hope that what follows might be helpful to other districts facing similar scenarios.
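To make the AROI idea concrete, below is a minimal sketch of the kind of comparative metric such an approach might compute. The column names, sample figures, and gain-per-dollar formula are illustrative assumptions for this post, not the actual ITS schema or the exact JCPS metric.

import pandas as pd

# Hypothetical end-of-cycle investment records; the columns are
# illustrative assumptions, not the actual ITS schema.
investments = pd.DataFrame({
    "investment": ["Reading Tutors", "Math Software", "Mentoring"],
    "total_cost": [120_000, 45_000, 60_000],   # dollars over the budget cycle
    "n_students": [300, 450, 150],             # students on the participant roster
    "pre_score":  [41.0, 48.5, 44.0],          # mean outcome at start of cycle
    "post_score": [47.5, 49.0, 49.5],          # mean outcome at end of cycle
})

# One simple comparative metric: mean outcome gain per $1,000 spent per student.
investments["cost_per_student"] = investments["total_cost"] / investments["n_students"]
investments["gain"] = investments["post_score"] - investments["pre_score"]
investments["gain_per_1k"] = investments["gain"] / (investments["cost_per_student"] / 1_000)

# Rank investments to inform reinvest/discontinue/expand discussions.
print(investments.sort_values("gain_per_1k", ascending=False)
      [["investment", "gain", "cost_per_student", "gain_per_1k"]])

A ranking like this ignores selection and maturation effects, which is exactly why we wanted to check AROI results against the more rigorous VAA and CEA.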

More Changes in Practice 

The lessons learned during the school nursing and restorative practice CEAs discussed in the first blog, and the data gaps identified through the VAA process, informed two key developments at JCPS. First, we formalized our existing end-of-cycle investment review process by adding summary cards for each end-of-cycle investment item (each program or personnel position in which district funds were invested) that indicate where insufficient data (for example, incomplete budget requests or unavailable participant rosters) precluded AROI calculations. We asked specific questions about missing data to elicit additional information and to encourage more diligent documentation in future budget requests.

Second, we created the Investment Tracking System 2.0 (ITS 2.0), which now requires budget requesters to complete a basic logic model. The resources (inputs) and outcomes in the logic model are auto-populated from information entered earlier in the request process, but requesters must manually enter activities and progress monitoring (outputs). Our goal is to encourage and facilitate development of an explicit theory of change at the outset and continuous evidence-based adjustments throughout the implementation. Mandatory entry fields now prevent requesters from submitting incomplete budget requests. The new system was immediately put into action to track all school-level Elementary and Secondary School Emergency Relief (ESSER)-related budget requests.
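As a rough illustration of how mandatory entry fields can stop an incomplete budget request, here is a minimal sketch; the field names are hypothetical stand-ins, not the actual ITS 2.0 data model.

from dataclasses import dataclass, fields

# Hypothetical logic-model fields for a budget request; the names are
# illustrative, not the actual ITS 2.0 schema.
@dataclass
class BudgetRequest:
    resources: str            # inputs, auto-populated from earlier entries
    outcomes: str             # auto-populated from earlier entries
    activities: str           # entered manually by the requester
    progress_monitoring: str  # outputs, entered manually by the requester

def missing_fields(request: BudgetRequest) -> list[str]:
    """Return the names of any mandatory fields left empty."""
    return [f.name for f in fields(request)
            if not getattr(request, f.name).strip()]

request = BudgetRequest(
    resources="2.0 FTE reading tutors",
    outcomes="Raise grade 3 reading proficiency by 5 points",
    activities="",  # left blank, so the submission should be rejected
    progress_monitoring="Monthly tutoring-session logs",
)

problems = missing_fields(request)
if problems:
    print(f"Request rejected; complete these fields first: {problems}")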

Process and Partnership, Redux

Although we agree with the IES Director’s insistence that partnerships between researchers and practitioners should be a means to (eventually) improving student outcomes, our experience shows that change happens slowly in a large district. Yet, we have seen substantial changes as a direct result of our partnership. Perhaps the most important change is the drastic increase in the number of programs, investments, and other initiatives that will be evaluable as a result of formalizing the end-of-cycle review process and creating ITS 2.0. We firmly believe these changes could not have happened apart from our partnership and the freedom our funding afforded us to experiment with new approaches to addressing the challenges we face.   


Stephen M. Leach is a Program Analysis Coordinator at JCPS and PhD Candidate in Educational Psychology Measurement and Evaluation at the University of Louisville.

Dr. Robert Shand is an Assistant Professor at American University.

Dr. Bo Yan is a Research and Evaluation Specialist at JCPS.

Dr. Fiona Hollands is a Senior Researcher at Teachers College, Columbia University.

If you have any questions, please contact Corinne Alfeld (Corinne.Alfeld@ed.gov), IES-NCER Grant Program Officer.


Unexpected Benefits of Conducting Cost-Effectiveness Analysis

This is the first of a two-part guest blog series from an IES-funded partnership project between Teachers College, Columbia University; American University; and Jefferson County Public Schools in Kentucky. The purpose of the project is to explore academic return on investment (AROI) as a metric for improving decision-making around education programs in ways that lead to better student outcomes. In this guest blog entry, the team showcases cost analysis as an integral part of education program evaluation.

Here’s a fun thought experiment (well, at least fun for researcher-types). Imagine you just discovered that two of your district partner’s firmly entrenched initiatives are not cost-effective. What would you do? 

Now, would your answer change if we told you that the findings came amidst a global pandemic and widespread social unrest over justice reform, and that those two key initiatives were a school nurse program and restorative practices? That’s the exact situation we faced last year in Jefferson County Public Schools (JCPS) in Louisville, KY. Fortunately, the process of conducting rigorous cost analyses of these programs unearthed critical evidence to help explain mostly null impact findings and inform very real changes in practice at JCPS.

Cost-Effectiveness Analysis Revealed Missing Program Components

Our team of researchers from Teachers College, Columbia University and American University, and practitioners from JCPS had originally planned to use cost-effectiveness analysis (CEA) to evaluate the validity of a practical measure of academic return on investment for improving school budget decisions. With the gracious support of JCPS program personnel in executing our CEAs, we obtained a treasure trove of additional quantitative and qualitative cost and implementation data, which proved to be invaluable.

Specifically, for the district’s school nurse program, we identified the lack of an explicit theory of change, of standardized evidence-based practices across schools, and of a monitoring plan as potential explanations for our null impact results. In one of our restorative practices cost interviews, we discovered that a key element of the program, restorative conferences, was not being implemented at all due to time constraints and staffing challenges, which may help explain the disappointing impact results.

Changes in Practice

In theory, our CEA findings indicated that JCPS should find more cost-effective alternatives to school nursing and restorative practices. In reality, both programs were greatly expanded: school nursing in response to COVID, and restorative practices because JCPS leadership has committed to moving away from traditional disciplinary practices. Our findings regarding implementation, however, lead us to believe that key changes can lead to improved student outcomes for both programs.

In response to recommendations from the team, JCPS is developing a training manual for new nurses, a logic model illustrating how specific nursing activities can lead to better outcomes, and a monitoring plan. For restorative practices, while we still have a ways to go, the JCPS team is continuing to work with program personnel to improve implementation.

One encouraging finding from our CEA was that, despite imperfect implementation, suspension rates for Black students were lower in schools that had implemented restorative practices for two years than in schools that had implemented the program for only one year. Our hope is that further research will identify the aspects of restorative practices most critical for equitably improving school discipline and climate.

Process and Partnership

Our experience highlights the unexpected benefits that can result when researchers and practitioners collaborate on all aspects of cost-effectiveness analysis, from collecting data to applying findings to practice. In fact, we are convinced that the ongoing improvements discussed here would not have been possible apart from the synergistic nature of our partnership. The JCPS team included seasoned evaluators and brought front-line knowledge of program implementation, information systems, data availability, and district priorities, while our research partners brought additional research capacity, methodological expertise, and a critical outsider’s perspective.

Together, we discovered that the process of conducting cost-effectiveness analysis can provide valuable information normally associated with fidelity of implementation studies. Knowledge gained during the cost analysis process helped to explain our less-than-stellar impact results and led to key changes in practice. In the second blog of this series, we’ll share how the process of conducting CEA and value-added analysis led to changes in practice extending well beyond the specific programs we investigated.


Stephen M. Leach is a Program Analysis Coordinator at JCPS and PhD Candidate in Educational Psychology Measurement and Evaluation at the University of Louisville.

Dr. Fiona Hollands is a Senior Researcher at Teachers College, Columbia University.

Dr. Bo Yan is a Research and Evaluation Specialist at JCPS.

Dr. Robert Shand is an Assistant Professor at American University.

If you have any questions, please contact Corinne Alfeld (Corinne.Alfeld@ed.gov), IES-NCER Grant Program Officer.

Student-Led Action Research as a School Climate Intervention and Core Content Pedagogy

Improving the social and emotional climate of schools has become a growing priority for educators and policymakers in the past decade. The prevailing strategies for improving school climate include social and emotional learning, positive behavioral supports, and trauma-informed approaches. Many of these strategies foreground the importance of student voice in intervention, because students are uniquely expert in their own social and emotional milieus.

Parallel to this trend has been a push toward student-centered pedagogical approaches in high schools that are responsive to students’ cultural backgrounds and that promote skills aligned with the demands of the modern workplace, like critical thinking, problem-solving, and collaboration. Culturally responsive and restorative teaching and problem- and project-based learning are prominent movements. In this guest blog, Dr. Adam Voight at Cleveland State University discusses an ongoing IES-funded Development and Innovation project in Cleveland, Ohio that aims to develop a school-based youth participatory action research intervention and document its feasibility.


Our project is exploring how youth participatory action research (YPAR) may help to realize two objectives: school climate improvement and culturally restorative, engaged learning. YPAR involves young people leading a cycle of problem identification, data collection and analysis, and evidence-informed action. It has long been used in out-of-school and extracurricular spaces to promote youth development and effect social change. We are field-testing its potential to fit within more formal school spaces.

Project HighKEY

The engine for our project, which we call Project HighKEY (High-school Knowledge and Education through YPAR), is a design team composed of high school teachers and students, district officials, and university researchers. It grew out of the Cleveland Alliance for Education Research, a research-practice partnership between the Cleveland Metropolitan School District, Cleveland State University, and the American Institutes for Research. The design team meets monthly to discuss YPAR theory and its fit with high school curriculum and standards, and to plan YPAR field tests in schools. We have created a crosswalk between the documented competencies students derive from YPAR and Ohio’s high school standards in English language arts (ELA), mathematics, science, and social studies. For example, one state ELA standard is “Write arguments to support claims in an analysis of substantive topics or texts, using valid reasoning and relevant and sufficient evidence”; through YPAR, students collect and analyze survey and interview data and use their findings to advocate for change related to their chosen topic. A state math standard is “Interpret the slope and the intercept of a linear model in the context of data”; this can be applied to the survey data students collect through YPAR, making an otherwise abstract activity more meaningful to students.
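As an illustration of how that math standard might play out with YPAR data, here is a minimal sketch; the survey variables and values are invented for the example.

import numpy as np

# Invented YPAR survey data: weekly hours of extracurricular participation (x)
# and a 0-100 school-belonging score (y) for eight students.
hours = np.array([0, 1, 2, 2, 3, 4, 5, 6], dtype=float)
belonging = np.array([40, 48, 55, 50, 62, 66, 70, 78], dtype=float)

# Fit y = slope * x + intercept by least squares.
slope, intercept = np.polyfit(hours, belonging, deg=1)

# Interpreting the model in context, per the standard: the intercept is the
# predicted belonging score at zero extracurricular hours, and the slope is
# the predicted change in score per additional weekly hour.
print(f"intercept = {intercept:.1f} (predicted score at 0 hours)")
print(f"slope = {slope:.1f} points per additional hour")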

Assessing the Effectiveness of YPAR

Remaining open-minded about the various ways in which YPAR may or may not fit in different high school courses, we are currently testing its implementation in a pre-calculus course, a government course, an English course, and a life-skills course. For example, a math teacher on our design team has built her statistics unit around YPAR. Students in three separate sections of the course have worked in groups of two or three to identify an issue and create a survey that is being administered to the broader student body. These issues include the lack of extracurricular activities, poor school culture, and unhealthy breakfast and lunch options. After the winter holiday, their survey data will be used as the basis for learning about representing data with plots, distributions, measures of center, frequencies, and correlation. Our theory is that students will be more engaged when using their own data, on topics of their choosing, toward the goal of making real change. Across all of our project schools, we are monitoring administrative data, student and teacher survey data, and interview data to assess the feasibility, usability, and student and school outcomes of YPAR.
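As a minimal sketch of the kind of analysis those student groups might run on their own survey responses (all values below are invented), consider:

import statistics as stats
from collections import Counter

# Invented responses to one survey item on a 1-5 agreement scale.
responses = [1, 2, 2, 3, 3, 3, 4, 4, 5, 5, 5, 5]

# Measures of center.
print("mean:", round(stats.mean(responses), 2))
print("median:", stats.median(responses))
print("mode:", stats.mode(responses))

# Frequencies for each response option.
print("frequencies:", dict(sorted(Counter(responses).items())))

# Correlation between two items, e.g., climate rating vs. days attended
# (statistics.correlation requires Python 3.10+).
climate = [2, 3, 4, 4, 5, 3, 2, 5, 4, 3, 5, 4]
attendance = [150, 160, 171, 168, 175, 158, 149, 178, 170, 161, 176, 169]
print("correlation:", round(stats.correlation(climate, attendance), 2))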

Impact of COVID-19 and How We Adapted

We received notification of our grant award in March 2020, the same week that COVID-19 shut down K-12 schools across the nation. When our project formally began in July 2020, our partner schools were planning for a wholly remote school year, and we pivoted to hold design team meetings virtually and loosen expectations for teacher implementation. Despite these challenges, several successful YPAR projects during that first year—all of which were conducted entirely remotely—taught all of us much about how YPAR can happen in online spaces. This school year, students and staff are back to in-person learning, but, in addition to the ongoing pandemic, the crushing teacher shortage has forced us to continue to adapt. Whereas we once planned our design team meeting during the school day, we now meet after school due to a lack of substitute teachers, and we use creative technology to allow for mixed virtual and in-person attendance. Our leadership team is also spending a great deal of time in classrooms with teachers to assist those implementing for the first time. Our goal is to create a resource that teachers anywhere can use to incorporate YPAR into their courses. The product will be strengthened by the lessons we have learned from doing this work during these extraordinary times and the resulting considerations for how to deal with obstacles to implementation.


Adam Voight is the Director of the Center for Urban Education at Cleveland State University.

For questions about this grant, please contact Corinne Alfeld, NCER Program Officer, at Corinne.Alfeld@ed.gov.

How Do Education Leaders Access and Use Research Evidence?

In 2014, IES funded one of two R&D Centers on Knowledge Utilization (or Knowledge/Evidence Use) to explore how 1) education researchers can make their work more relevant and useful to practitioners located in state and local education agencies and in individual schools, 2) the work of practitioners can inform research efforts, and 3) practitioners can make decisions based on research evidence. The National Center for Research in Policy and Practice (NCRPP) at the University of Colorado at Boulder recently completed its grant. Corinne Alfeld, a Program Officer in IES NCER, talked with Principal Investigator Bill Penuel about the Center’s findings and recommendations. The full list of Center staff and collaborators can be found on NCRPP’s website.

What were the outcomes of your IES-funded KU R&D Center? 

Our center studied how school and district leaders accessed and used research through a nationally representative survey and through case studies of school district decision making, research-practice partnerships, and a professional association of state leaders in science. Across all studies, we found that leaders highly valued research. At the same time, we found some things that might be surprising to many researchers:

  • Most leaders accessed research through their relationships and networks rather than through websites, journal articles, or resources like the What Works Clearinghouse. The most common ways leaders accessed research were through their own professional associations, conferences, and colleagues in education settings. In some cases, these networks provided leaders with access to high-quality research. In our study of the professional association of science leaders, leaders cited National Academies of Sciences, Engineering, and Medicine consensus study reports as the most shared and used among members.
  • Leaders used research for a variety of purposes and for a range of decision-making activities, not only to make decisions about which programs to adopt. Many leaders, for example, design professional development for educators, for which there may be no program to adopt. In our case studies of district research use, we found that leaders did rely on research, conceptually, to inform their design activities. Research use was embedded within the ongoing decision-making routines for designing, implementing, and evaluating professional development activities.
  • Leaders did not turn to impact studies of individual interventions when they looked for research. Instead, they more commonly turned to books and other kinds of publications that provided syntheses or summaries of research. A common thread was that these sources of research provided frameworks for action, broken down into clear steps they could follow.

How can researchers use your findings to improve their dissemination efforts?

Despite the value leaders placed on research, there is clearly room for improvement with respect to dissemination. More than half of respondents to our survey said that by the time research was published, it was no longer valuable to them. Here are two strategies that researchers might find useful.

  1. Long-term research-practice partnerships (RPPs) were sites where leaders found research to be both timely and relevant to them. These partnerships come in different shapes and sizes and have different goals. However, all engage educators in helping define the very questions that will be addressed in the research, and some also engage in co-design and testing of interventions to address persistent problems of practice and to work toward visions of more equitable systems of education.
  2. Embedding researchers within leaders’ professional organizations can help disseminate research in a timely manner. These embedded researchers present regularly at association meetings and conferences. They also participate in committees, where they develop tools to inform ongoing leadership activities.

What these strategies have in common is that dissemination is not an afterthought. In fact, dissemination is not a good word for what these researchers are doing. A better word is engagement. Researchers are engaging educators throughout the process of research and development, not just at the end.

How can your findings be used to improve practitioners’ access to, and use or consideration of, research findings?

Many of the strategies for engagement involve educators engaging with various aspects of research directly. Years ago, Weiss and colleagues described this as its own form of research use, called process use. Engaging educators in the actual research process does something that is important for supporting research use, namely giving them time to make sense of research and its implications for their work.

We found evidence that involvement in RPPs was helpful to educators’ own policymaking and practice. More than three-quarters, for example, said that their external partners shaped the design of professional development, and many also said that the partnership helped them integrate newly developed practices into their work.

What are your plans for future work in this area?

At present, NCRPP is involved in two exciting new projects. The first is a project funded by the William T. Grant Foundation, which focuses on developing and validating measures to assess the effectiveness of RPPs. We’re using a framework developed by Erin Henrick and colleagues to evaluate RPPs, and we’ve gathered survey and interview data from more than 60 RPPs. Our goal is to develop formative measures to help RPPs evaluate progress on each of the five dimensions of the framework.

The second project is funded by the Wallace Foundation to study and support equity-centered leadership and districts in forming partnerships with researchers as they develop and test strategies for creating equity-centered leadership pipelines. Both projects are being undertaken in collaboration with the National Network of Education Research-Practice Partnerships.

What do you see as the next steps for this field?

While it is tempting to suggest “more research is needed,” what is needed is an “evidence-informed” approach to evidence use—the application of what we already know about evidence use when it comes to policy and practice. That requires us to shift focus away from imagining that better, plain-language research briefs will help us improve research use. Instead, we need to encourage researchers to engage in more substantive ways with practice throughout the research process, to improve its relevance and timeliness.

We also need to embrace a broader conception of the kinds of evidence and information that can inform decision making, one that reflects the range of information that leaders currently use and could turn to. Of particular importance is considering the experiences of those students, families, and communities to whom we owe a great education debt as important sources for decision making. If we take a broader view of evidence, a new question emerges: How can we consider and integrate different sources of evidence into decision making in a way that is informed by values such as equity and justice? This is the sort of question I hope the field can pursue in the future.


Findings from the 2015 IES-funded Center for Research Use in Education (CRUE) at the University of Delaware will be highlighted in a blog in 2022. Stay tuned! If you have further questions, please contact Corinne.Alfeld@ed.gov.


Partnering with Practitioners to Address Mental Health in Rural Communities

IES values and encourages collaborations between researchers and practitioners to ensure that research findings are relevant, accessible, feasible, and useful. In 2017, Dr. Wendy Reinke, University of Missouri, received IES funding to formalize the Boone County Schools Mental Health Coalition by strengthening its partnership and validating the Early Identification System (EIS) to screen for social, emotional, behavioral, and academic risk among K-12 students in rural schools. Building on these successes, Dr. Reinke now leads the National Center for Rural School Mental Health (NCRSMH), a consortium of researchers leading efforts to advance mental health screening and support in rural communities.

Bennett Lunn, a Truman-Albright Fellow at IES, asked Dr. Reinke about the work of the original partnership and how it has informed her efforts to build new partnerships with other rural schools around the country. Below are her responses.


What was the purpose of the Boone County Schools Mental Health Coalition and what inspired you to do this work?

In 2015, our county passed an ordinance setting aside a small percentage of our sales tax to support youth mental health in our community. As a result, the schools had visits from many of the local mental health agencies seeking to set up services in school buildings. The superintendents quickly realized that it would be wise to have a more coordinated effort across school districts. They formed a coalition and partnered with researchers at the University of Missouri to develop a comprehensive model to prevent and intervene in youth mental health problems. The enthusiasm of our school partners and their willingness to consider research evidence to inform the model were so energizing! We were able to build a multi-tiered prevention and intervention framework that uses universal screening data to inform supports. In addition, we were awarded an IES partnership grant to help validate the screener, conduct focus groups and surveys of stakeholders to understand the feasibility and social validity of the model, and determine how fidelity to the model is related to student outcomes. The EIS is now being used in 54 school buildings across six school districts as part of their daily practice.


Were there advantages to operating in a partnership to validate the screener?  

The main benefit of working in partnership with school personnel is that you learn what works under what circumstances from those directly involved in supporting students. We meet every month with the superintendents and other school personnel to ensure that if things are not working, we can find solutions before the problems become too big. We vote on any process or procedure that is seen as needing to change. The meetings include school personnel sharing the types of activities they are doing in their buildings so that others can replicate those best practices, and we meet with students to get their perspectives on what is working. In addition, the university faculty bring calls for external funding of research to the group to get ideas for what types of research would be appropriate and beneficial to the group. Schools are constantly changing and encountering new challenges. Being close to those who are working in the buildings allows us to work together to form and implement feasible solutions over time.


What advice do you have for researchers trying to make research useful and accessible to practitioners? 

Be collaborative and authentic. Demonstrate that you are truly there to create meaningful and important changes that will benefit students. Show that your priority is improving outcomes for schools and students, not simply collecting data for a study. These actions are vital to building trust in a partnership. By sharing the process of reviewing data, researchers can show how the research is directly impacting schools, and practitioners have an opportunity to share how their experience relates to the data. A good way to do this is by presenting with practitioners at conferences or collaboratively writing manuscripts for peer-reviewed journals. For example, we wrote a manuscript (currently under review) with one of our school counselor partners describing how he used EIS data in practice. Through collaboration like this, we find that the purpose and process of research becomes less mysterious, and schools can more easily identify and use practices that are shown to work. In this way, long-term collaboration between partners can ultimately benefit students!


How does the work of the original partnership inform your current work with the National Center for Rural School Mental Health? 

We are bringing what we have learned, both about being effective partners and about improving the model itself, to the National Center for Rural School Mental Health. For instance, we are developing an intervention hub on our Rural Center website that will allow schools to directly link evidence-based interventions to their data. We learned that having readily available ideas for intervening using the data is an important aspect of success. We have also learned that schools with problem-solving teams can implement the model with higher fidelity, so we are developing training modules that teach schools how to use the data in problem-solving teams. We will take the comprehensive model for preventing and intervening in youth mental health problems and use it in rural schools. And we will continue to partner with our rural schools to continuously improve the work so that it is feasible, socially valid, and important to rural schools and the youth they serve.



Dr. Wendy Reinke is an Associate Vice Chancellor for Research at the University of Missouri College of Education. Her research focuses on school-based prevention interventions for children and youth with social emotional and behavioral challenges.

Written by Bennett Lunn (Bennett.lunn@ed.gov), Truman-Albright Fellow, National Center for Education Research and National Center for Special Education Research