Inside IES Research

Notes from NCER & NCSER

Student-Led Action Research as a School Climate Intervention and Core Content Pedagogy

Improving the social and emotional climate of schools has become a growing priority for educators and policymakers over the past decade. Prevailing strategies for improving school climate include social and emotional learning, positive behavioral supports, and trauma-informed approaches. Many of these strategies foreground student voice in intervention, because students are uniquely positioned experts on their own social and emotional environments.

Parallel to this trend has been a push toward student-centered pedagogical approaches in high schools that are responsive to students' cultural backgrounds and that promote skills aligned with the demands of the modern workplace, such as critical thinking, problem-solving, and collaboration. Culturally responsive and restorative teaching and problem- and project-based learning are prominent examples. In this guest blog, Dr. Adam Voight of Cleveland State University discusses an ongoing IES-funded Development and Innovation project in Cleveland, Ohio, that aims to develop a school-based youth participatory action research intervention and document its feasibility.

 

Our project explores how youth participatory action research (YPAR) may help realize two objectives—school climate improvement and culturally restorative, engaged learning. YPAR involves young people leading a cycle of problem identification, data collection and analysis, and evidence-informed action. It has long been used in out-of-school and extracurricular spaces to promote youth development and effect social change. We are field testing its potential to fit within more formal school spaces.

Project HighKEY

The engine of our project, which we call Project HighKEY (High-school Knowledge and Education through YPAR), is a design team composed of high school teachers and students, district officials, and university researchers. It grew out of the Cleveland Alliance for Education Research, a research-practice partnership among the Cleveland Metropolitan School District, Cleveland State University, and the American Institutes for Research. The design team meets monthly to discuss YPAR theory, its fit with high school curriculum and standards, and plans for YPAR field tests in schools. We have created a crosswalk between the documented competencies students derive from YPAR and Ohio's high school standards in English language arts (ELA), mathematics, science, and social studies. For example, one state ELA standard is "Write arguments to support claims in an analysis of substantive topics or texts, using valid reasoning and relevant and sufficient evidence"; through YPAR, students collect and analyze survey and interview data and use their findings to advocate for change related to their chosen topic. A state math standard is "Interpret the slope and the intercept of a linear model in the context of data"; this process can be applied to survey data students collect through YPAR, making an otherwise abstract activity more meaningful to students.
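To make the math-standard example concrete, here is a minimal sketch of fitting a linear model to survey data and interpreting its slope and intercept in context. The survey items and numbers are invented for illustration and are not drawn from the project.

```python
# Illustrative only: interpret the slope and intercept of a linear model
# fit to (hypothetical) student-collected survey data. All values invented.
import numpy as np

# Hypothetical survey items: hours per week in extracurriculars and
# self-reported school belonging on a 1-10 scale.
hours = np.array([0, 1, 2, 3, 5, 6, 8, 10])
belonging = np.array([4, 5, 5, 6, 7, 7, 8, 9])

# Least-squares fit: belonging ~= slope * hours + intercept
slope, intercept = np.polyfit(hours, belonging, 1)

print(f"slope = {slope:.2f}")          # predicted change in belonging per extra hour
print(f"intercept = {intercept:.2f}")  # predicted belonging at zero hours
```

The standard's focus is the final step: stating, in context, that the intercept is the predicted belonging score for a student with no extracurricular hours and the slope is the predicted gain per additional hour.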

Assessing the Effectiveness of YPAR

Remaining open-minded about the ways YPAR may or may not fit different high school courses, we are currently testing its implementation in a pre-calculus course, a government course, an English course, and a life-skills course. For example, a math teacher on our design team has built her statistics unit around YPAR. Students in three sections of the course have worked in groups of two or three to identify an issue and create a survey that is being administered to the broader student body. These issues include the lack of extracurricular activities, poor school culture, and unhealthy breakfast and lunch options. After the winter holiday, their survey data will serve as the basis for learning about representing data with plots, distributions, measures of center, frequencies, and correlation. Our theory is that students will be more engaged when they use their own data, on topics of their choosing, toward the goal of making real change. Across all of our project schools, we are monitoring administrative data, student and teacher survey data, and interview data to assess the feasibility, usability, and student and school outcomes of YPAR.
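As a sketch of the kinds of computations such a unit might involve (the survey items and responses below are invented, not the students' actual data), measures of center, frequencies, and a correlation can be produced in a few lines:

```python
# Illustrative only: summary statistics for hypothetical survey responses.
# Requires Python 3.10+ for statistics.correlation.
import statistics
from collections import Counter

lunch_ratings = [1, 2, 2, 3, 3, 3, 4, 2, 1, 3, 5, 2]  # 1-5 satisfaction scale
clubs_joined = [0, 0, 1, 1, 2, 1, 3, 0, 0, 2, 3, 1]   # activities per respondent

print("mean:", statistics.mean(lunch_ratings))      # measure of center
print("median:", statistics.median(lunch_ratings))  # measure of center
print("frequencies:", Counter(lunch_ratings))       # frequency table

# Pearson correlation between the two survey items
print("r =", statistics.correlation(lunch_ratings, clubs_joined))
```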

Impact of COVID-19 and How We Adapted

We received notification of our grant award in March 2020, the same week that COVID-19 shut down K-12 schools across the nation. When our project formally began in July 2020, our partner schools were planning for a wholly remote school year, so we pivoted to virtual design team meetings and loosened expectations for teacher implementation. Despite these challenges, several successful YPAR projects during that first year—all conducted entirely remotely—taught us much about how YPAR can happen in online spaces. This school year, students and staff are back to in-person learning, but the ongoing pandemic and a crushing teacher shortage have forced us to keep adapting. Whereas we once planned design team meetings during the school day, we now meet after school because of the lack of substitute teachers, and we use technology that allows mixed virtual and in-person attendance. Our leadership team is also spending a great deal of time in classrooms to assist teachers implementing YPAR for the first time. Our goal is to create a resource that teachers anywhere can use to incorporate YPAR into their courses. The product will be strengthened by the lessons we have learned doing this work during these extraordinary times and the resulting considerations for how to deal with obstacles to implementation.


Adam Voight is the Director of the Center for Urban Education at Cleveland State University.

For questions about this grant, please contact Corinne Alfeld, NCER Program Officer, at Corinne.Alfeld@ed.gov.

Working to Understand the Policy Process in the Development of Michigan’s Read by Grade Three Law

In recent decades, there has been an emphasis on quantitative, causal research in education policy. These methods are best suited to answering questions about the effects of a policy and whether it achieved its intended outcomes. While the question "Did it work?" remains critical, there is also a need for research that asks, "Why did it work? For whom? In what contexts?" To answer these questions, researchers must incorporate rigorous qualitative methods into their quantitative studies. Education research organizations like the Association for Education Finance and Policy have explicitly invited proposals using qualitative and mixed methodologies in an effort to elevate research addressing a range of critical education policy questions. Funding organizations, including IES, encourage applicants to incorporate qualitative methods into their research process. In this guest blog, Amy Cummings, Craig De Voto, and Katharine Strunk discuss how they are using qualitative methods in their evaluation of a state education policy.

 

In our IES-funded study, we use qualitative, survey, and administrative data to understand the implementation and impact of Michigan's early literacy law—the Read by Grade Three Law. Like policies passed in 19 other states, the Read by Grade Three Law aims to improve K-3 students' literacy skills and mandates retention for those who do not meet a predetermined benchmark on the state's third-grade English language arts assessment. Although the retention component of these policies remains controversial, similar laws are under consideration in several other states, including Alaska, Kentucky, and New Mexico. Below are some of the ways we have integrated qualitative methods into our evaluation to better understand the policy process behind the Read by Grade Three Law.

Collecting qualitative data helped us understand how the policy came to be, which in turn helped us structure our data collection for examining the law's implementation and subsequent effects. For our first working paper from this study, we interviewed 24 state-level stakeholders (policymakers, state department of education officials, early literacy leaders) involved in the development of the law and coded state policy documents related to early literacy to assess the similarity between Michigan's policy and those of other states. Understanding the law's various components and how they ended up in the policy ensured that our educator surveys, which are also part of the evaluation, asked about perceptions and implementation of those components. For example, because our interviews made clear how controversial the retention component was during the law's development, we included survey questions assessing educators' perceptions and intended implementation of that component. This understanding also confirmed the importance of our plan to use state administrative retention and assessment data to evaluate the effect of retention on student literacy outcomes.

To trace the Read by Grade Three Law's conception, development, and passage, we analyzed these qualitative data using two theories of the policy process: the Multiple Streams Framework (MSF) and policy transfer. MSF posits that policy issues emerge on government agendas through three streams: problem, policy, and political. When these streams converge, a policy window opens during which there is a greater opportunity to pass legislation. Policy transfer, meanwhile, highlights how policies enacted in one place are often used in the development of policies in another.

We found that events in the problem and political streams created conditions ripe for the passage of an early literacy policy in Michigan:

  • A national sentiment around improving early literacy, including a retention-based third-grade literacy policy model that had been deemed successful in Florida
  • A pressing problem, evidenced by the state's consistently below-average fourth-grade reading scores on the National Assessment of Educational Progress
  • A court case addressing persistently low test scores in a Detroit-area district
  • Previous attempts by the state to improve early literacy

As a result of these events, policy entrepreneurs—those willing to invest resources to get their preferred policy passed—took advantage of political conditions in the state and worked with policymakers to advance a retention-based third-grade literacy policy model. The figure below illustrates interviewee accounts of the Read by Grade Three Law's development. Our policy document analysis further reveals that Michigan's and Florida's policies are very similar, diverging on only nine of the 50 elements we coded.

[Figure: Interviewee accounts of the development of Michigan's Read by Grade Three Law]

Although this study focuses on the development and passage of Michigan's early literacy law, our findings highlight both practical and theoretical elements of the policy process that can be useful to researchers and policymakers. We show how particular conditions, coupled with the efforts of policy entrepreneurs, spurred Michigan's consideration of such a policy. It is conceivable that many state education policies beyond early literacy have taken shape under similar circumstances: a national sentiment combined with influential brokers outside government. In this way, our mixed-methods study provides a practical model of how the conditions for policy change can come together more broadly.

From a theoretical standpoint, this research also extends our understanding of the policy process by showing that MSF and the theory of policy transfer can work together. We learned that policy entrepreneurs can play a vital role in transferring policy from one place to another by capitalizing on conditions in a target location and arriving with a specific policy proposal at the ready.

There is, of course, more to be learned about the intersection of different theories of the policy process, as well as about how external organizations, as opposed to individuals, operate as policy entrepreneurs. As the number of education advocacy organizations continues to grow and these groups become increasingly active in shaping policy, this will be an exciting avenue for researchers to continue to explore.

This study is just one example of how qualitative research can be used in education policy research and shows how engaging in such work can be both practically and theoretically valuable. The most comprehensive evaluations will use different methodologies in concert with one another to understand education policies, because ultimately, how policies are conceptualized and developed has important implications for their effectiveness.


Amy Cummings is an education policy PhD student and graduate research assistant at the Education Policy Innovation Collaborative (EPIC) at Michigan State University (MSU).

Craig De Voto is a visiting research assistant professor in the Learning Sciences Research Institute at the University of Illinois at Chicago and an EPIC affiliated researcher.

Katharine O. Strunk is the faculty director of EPIC, the Clifford E. Erickson Distinguished Chair in Education, and a professor of education policy and by courtesy economics at MSU.

How Remote Data Collection Enhanced One Grantee’s Classroom Research During COVID-19

Under an IES grant, Michigan State University, in collaboration with the Michigan Department of Education, the Michigan Center for Educational Performance and Information, and the University of Michigan, is assessing the implementation, impact, and cost of Michigan's "Read by Grade 3" law, which is intended to improve early literacy outcomes for Michigan students. In this guest blog, Dr. Tanya Wright and Lori Bruner discuss how they quickly pivoted to a remote data collection plan when COVID-19 disrupted their initial research plan.

The COVID-19 pandemic began while we were planning a study of early literacy coaching for the 2020-2021 academic year. It soon became clear that restrictions on in-person research would pose a major hurdle for our research team, as we had planned to enter classrooms and record videos of literacy instruction in the fall. We thus faced a difficult choice: we could pause the study until it became safer to visit classrooms, missing the opportunity to learn about literacy coaching and in-person classroom instruction during the pandemic, or we could quickly pivot to a remote data collection plan.

Our team chose the second option. Multiple technologies are available for remote data collection; we chose the Swivl, a robotic mount that holds a tablet or smartphone to record video, rotates 360 degrees to follow a handheld or wearable tracker, and pairs with an app that instantly uploads videos to cloud-based storage for easy access.

Over the course of the school year, we captured over 100 hours of elementary literacy instruction in 26 classrooms throughout our state. While remote data collection looks and feels very different from visiting a classroom to record video, we learned that it offers many benefits to researchers and educators alike. We also learned a few important lessons along the way.

First, we learned that remote data collection provides greater flexibility for both researchers and educators. In our original study design, we planned to hire data collectors to visit classrooms, which restricted our recruitment of schools to within a reasonable driving distance of Michigan State University (MSU). Recording devices, however, allow us to capture video anywhere, including rural areas of our state that are often excluded from classroom research because of their remote locations. Furthermore, purchasing and shipping equipment to schools costs significantly less than paying for travel and for people's time to visit classrooms. In addition, using devices in place of data collectors allowed us to adapt easily to last-minute schedule changes and to offer teachers the option of recording video over multiple days to accommodate shifts in instruction due to COVID-19.

Second, we discovered that we could capture more classroom talk than with a typical video camera. After some trial and error, we settled on a configuration with three external wireless microphones: one for the teacher and two placed around the classroom. The extra microphones not only recorded audio beyond what the teacher was saying; we could also isolate each microphone during data analysis to hear what was happening in specific areas of the classroom (even when the teacher and children were wearing masks). We also purchased a wide-angle lens that clipped over the camera on our tablet and allowed us to capture a wider field of view.

Third, we found remote data collection to be less intrusive than sending a research team into schools. The device is compact and can be placed on any flat surface in the classroom or mounted on a basic tripod. The teacher can wear the microphone on a lanyard, where it doubles as a hands-free tracker that signals the device to rotate and follow the teacher's movements automatically. At the end of the lesson, the video uploads to a password-protected storage cloud at the touch of a button, making it easy for teachers to share videos with our research team. We then download the videos to the MSU server and delete them from our cloud account. This setup allowed us to collect data with minimal disruption, especially compared to sending a person with a video camera to spend time in the classroom.

As with most remote work this year, we ran into a few unexpected hurdles during our first round of data collection. After gathering feedback from teachers and members of our research team, we made adjustments that led to a better experience during the second round of data collection this spring. We hope the following suggestions help others who are considering such a device for collecting classroom data in the future:

  1. Consider providing teachers with a brief informational video or offering after-school training sessions to answer questions and address concerns ahead of your data collection period. We initially provided teachers with a detailed user guide, but we found that this extra support was key to ensuring teachers had a positive experience with the device. You might also appoint a member of your research team as a contact person to field questions during data collection periods.
  2. Remember that research team members will not be the ones collecting the data, so it is critical to give teachers clear directions ahead of time: what exactly do you want them to record? Our team found it helpful to send teachers a two-minute video outlining our goals and then follow up with a printable checklist they could use on the day they recorded instruction.
  3. Finally, we found it beneficial to scan the videos for content at the end of each day. By doing so, we were able to spot a few problems, such as missing audio or a device that stopped rotating during a lesson. While these instances were rare, it was helpful to catch them right away, while teachers still had the device in their schools and could record missing parts the next day. (A minimal sketch of such an end-of-day check follows this list.)
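As one illustration of an end-of-day check (not the team's actual tooling), the sketch below uses ffprobe, a standard companion tool to FFmpeg, to flag downloaded videos that lack an audio stream. The directory path is hypothetical.

```python
# Sketch: flag downloaded lesson videos that have no audio stream.
# Requires ffprobe (ships with FFmpeg). The folder path is hypothetical.
import pathlib
import subprocess

VIDEO_DIR = pathlib.Path("downloads/2021-05-03")  # hypothetical location

def has_audio(video: pathlib.Path) -> bool:
    """Return True if ffprobe reports at least one audio stream."""
    result = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "a",
         "-show_entries", "stream=codec_type", "-of", "csv=p=0", str(video)],
        capture_output=True, text=True,
    )
    return "audio" in result.stdout

for video in sorted(VIDEO_DIR.glob("*.mp4")):
    if not has_audio(video):
        print(f"WARNING: no audio detected in {video.name}")
```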

Although restrictions to in-person research are beginning to lift, we plan to continue using remote data collection for the remaining three years of our project. Conducting classroom research during the COVID-19 pandemic has proven challenging at every turn, but as we adapted to remote video data collection, we were pleased to find unanticipated benefits for our research team and for our study participants.


This blog is part of a series focusing on conducting education research during COVID-19. For other blog posts related to this topic, please see here.

Tanya S. Wright is an Associate Professor of Language and Literacy in the Department of Teacher Education at Michigan State University.

Lori Bruner is a doctoral candidate in the Curriculum, Instruction, and Teacher Education program at Michigan State University.

Cost Analysis in Practice: Resources for Cost Analysis Studies

IES supports rigorous research that can provide scientific evidence on how best to address our nation's most pressing education needs. As part of the Standards for Excellence in Education Research (SEER) principles, IES-funded researchers are encouraged, and in some cases required, to conduct a cost analysis for their projects, with the goal of supporting education agencies' decision-making around the adoption of programs, policies, or practices.

 

The Cost Analysis in Practice (CAP) Project is a 3-year initiative funded by IES to support researchers and practitioners who are planning or conducting a cost analysis of educational programs and practices. This support includes the following freely available resources.

  • Resources developed by the CAP Project
    • Introductory resources on cost analysis, including Standards and Guidelines 1.1, an infographic, a video lecture, and FAQs.
    • Tools for planning your cost analysis, collecting and analyzing cost data, and reporting your results (a minimal worked sketch of the core per-student cost calculation appears after this list).
    • A Help Desk where you can submit inquiries about conducting a cost analysis; a member of the CAP Project Team responds within two business days.
  • Other resources recommended by the CAP Project
    • Background materials on cost analysis
    • Guidance on carrying out a cost analysis
    • Standards for the Economic Evaluation of Educational and Social Programs
    • Cost analysis software
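For readers new to cost analysis, the sketch below illustrates the core arithmetic that such planning and analysis tools support: cost out each resource ("ingredient") a program uses, sum the costs, and divide by the number of students served. Every ingredient, price, and count here is invented for illustration and is not drawn from any CAP Project resource.

```python
# Minimal illustration of a per-student program cost. All numbers invented.
ingredients = {
    # ingredient: (quantity, unit cost in dollars)
    "teacher time (hours)": (120, 45.0),   # instruction/prep hours x loaded hourly rate
    "coach time (hours)":   (40, 55.0),
    "student workbooks":    (200, 12.5),
    "training sessions":    (3, 800.0),
}

total_cost = sum(qty * unit for qty, unit in ingredients.values())
students_served = 200  # hypothetical

print(f"total cost: ${total_cost:,.2f}")                       # $12,500.00
print(f"cost per student: ${total_cost / students_served:,.2f}")  # $62.50
```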

 

The CAP Project is also involved in longer-term collaborations with IES-funded evaluation projects to better understand their cost analysis needs. As part of this work, the CAP Project will produce a set of three blog posts discussing practical details of cost studies, based on its collaboration with a replication project evaluating an intervention that integrates literacy instruction into the teaching of American history. These posts will discuss the following:

  • Common cost analysis challenges that researchers encounter and recommendations to address them
  • The development of a timeline resource for planning a cost study
  • Data collection for a cost study

 

The CAP Project welcomes your feedback on any of its resources, as well as suggestions for additional resources to support cost analysis. If you have feedback, please fill out the suggestion form at the bottom of the Resources web page.

Partnering with Practitioners to Address Mental Health in Rural Communities

IES values and encourages collaborations between researchers and practitioners to ensure that research findings are relevant, accessible, feasible, and useful. In 2017, Dr. Wendy Reinke of the University of Missouri received IES funding to formalize the Boone County Schools Mental Health Coalition by strengthening the partnership and validating the Early Identification System (EIS) to screen for social, emotional, behavioral, and academic risk among K-12 students in rural schools. Building on these successes, Dr. Reinke now leads the National Center for Rural School Mental Health (NCRSMH), a consortium of researchers working to advance mental health screening and support in rural communities.

Bennett Lunn, a Truman-Albright Fellow at IES, asked Dr. Reinke about the work of the original partnership and how it has informed her efforts to build new partnerships with other rural schools around the country. Below are her responses.

 

What was the purpose of the Boone County Schools Mental Health Coalition and what inspired you to do this work?

In 2015, our county passed an ordinance setting aside a small percentage of sales tax revenue to support youth mental health in our community. As a result, many local mental health agencies visited the schools to set up services in school buildings. The superintendents quickly realized that a more coordinated effort across school districts would be wise. They formed a coalition and partnered with researchers at the University of Missouri to develop a comprehensive model for preventing and intervening in youth mental health problems. The enthusiasm of our school partners and their willingness to consider research evidence to inform the model was so energizing! We were able to build a multi-tiered prevention and intervention framework that uses universal screening data to inform supports. In addition, we were awarded an IES partnership grant to help validate the screener, conduct focus groups and surveys of stakeholders to understand the feasibility and social validity of the model, and determine how fidelity to the model relates to student outcomes. The EIS is now used in 54 school buildings across six school districts as part of their daily practice.
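To illustrate the general idea of using universal screening data to inform tiered supports, here is a minimal hypothetical sketch. The cutoffs, scale, and scores are invented for the example and are not the EIS's actual scoring rules.

```python
# Hypothetical sketch of tiering students by a universal screening score.
# Cutoffs and scores are invented and are NOT the EIS's actual rules.
from dataclasses import dataclass

@dataclass
class ScreeningResult:
    student_id: str
    risk_score: float  # higher = more risk, on an invented 0-100 scale

def assign_tier(result: ScreeningResult) -> int:
    """Map a screening score to a support tier (illustrative cutoffs)."""
    if result.risk_score >= 80:
        return 3  # individualized, intensive support
    if result.risk_score >= 60:
        return 2  # small-group, targeted support
    return 1      # universal supports only

results = [
    ScreeningResult("S001", 42.0),
    ScreeningResult("S002", 65.5),
    ScreeningResult("S003", 88.0),
]
for r in results:
    print(r.student_id, "-> Tier", assign_tier(r))
```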

 

Were there advantages to operating in a partnership to validate the screener?  

The main benefit of working in partnership with school personnel is that you learn what works, and under what circumstances, from those directly involved in supporting students. We meet every month with the superintendents and other school personnel to ensure that if things are not working, we can find solutions before problems become too big. We vote on any process or procedure that needs to change. At these meetings, school personnel share the types of activities they are doing in their buildings so that others can replicate best practices, and we meet with students to get their perspectives on what is working. In addition, the university faculty bring calls for external research funding to the group to gather ideas about what types of research would be appropriate and beneficial. Schools are constantly changing and encountering new challenges. Being close to those working in the buildings allows us to form and implement feasible solutions together over time.

 

What advice do you have for researchers trying to make research useful and accessible to practitioners? 

Be collaborative and authentic. Demonstrate that you are truly there to create meaningful and important changes that will benefit students. Show that your priority is improving outcomes for schools and students, not simply collecting data for a study. These actions are vital to building trust in a partnership. By sharing the process of reviewing data, researchers can show how the research directly affects schools, and practitioners have an opportunity to share how their experience relates to the data. A good way to do this is by presenting with practitioners at conferences or collaboratively writing manuscripts for peer-reviewed journals. For example, we wrote a manuscript (currently under review) with one of our school counselor partners describing how he used EIS data in practice. Through collaboration like this, we find that the purpose and process of research become less mysterious, and schools can more easily identify and use practices that are shown to work. In this way, long-term collaboration between partners can ultimately benefit students!

 

How does the work of the original partnership inform your current work with the National Center for Rural School Mental Health? 

We are bringing what we have learned, both about being effective partners and about improving the model, to the National Center for Rural School Mental Health. For instance, we are developing an intervention hub on the Rural Center website that will allow schools to link evidence-based interventions directly to their data. We learned that having readily available ideas for intervening on the data is an important aspect of success. We have also learned that schools with problem-solving teams implement the model with higher fidelity, so we are developing training modules that teach schools how to use the data in problem-solving teams. We will take the comprehensive model for preventing and intervening in youth mental health problems and apply it in rural schools, and we will continue to partner with our rural schools to improve the work continuously so that it is feasible, socially valid, and important to rural schools and the youth they serve.


 

Dr. Wendy Reinke is an Associate Vice Chancellor for Research at the University of Missouri College of Education. Her research focuses on school-based prevention interventions for children and youth with social emotional and behavioral challenges.

Written by Bennett Lunn (Bennett.lunn@ed.gov), Truman-Albright Fellow, National Center for Education Research and National Center for Special Education Research