Researchers must be able to clearly communicate research findings to non-technical audiences, including decision makers who may have limited time and varying levels of tolerance for data-rich reports. We and our colleagues recently honed these skills while preparing research briefs for the Virginia Department of Education (VDOE) as part of an IES-funded partnership project between VDOE and the University of Virginia exploring the impacts of the COVID-19 pandemic on students and teachers. These briefs are a key mechanism through which our project serves the purpose of IES’s Using Longitudinal Data to Support State Education Policymaking grant program: to generate useful findings that inform the decision making of policy makers and education leaders at the state, district, and school levels.
In their initial feedback, VDOE described the briefs as “too technical.” When we led with the numbers, our intended audience quickly became overwhelmed by the need to interpret the findings on their own. Our conversations with VDOE provided helpful direction on how we could revise the briefs to better reach non-technical, decision-making audiences in Virginia and beyond. Below, we share six strategies we have since applied to all our research briefs.
- Yes, briefs need a summary too: The draft briefs were short (4-7 pages), inclusive of figures and endnotes, and they began with a list of key findings. Based on the feedback, we reworked this list into a proper summary of the brief. Many of the decision makers we want to reach only have time to read a one-page summary, and that summary needs to be self-contained. Without additional context, the initial list of key findings would have had minimal impact.
- Lead with the headline: Numbers are a powerful tool for storytelling; however, too many numbers can be hard for many people—researchers and non-researchers alike—to consume. We therefore edited each paragraph to lead with a numbers-free sentence that conveys the main takeaway from the analysis, followed by the supporting evidence (the numbers).
- Answer the question: Our initial groundwork to develop solid relationships with agency staff allowed us to identify priority questions on which to focus the briefs. While several tangential but interesting findings also resulted from our analysis, the briefs we developed only focused on answering the priority research questions. Tangential findings can be explored in more depth in future research projects.
- Accurate but not over-caveated: All research makes some assumptions and has some limitations. The average non-technical audience member is unlikely to want a thorough detailing of each of these; however, some are too important to exclude. We chose to include those that were most vital to helping the reader make the correct interpretation.
- A picture is worth a thousand words: This was something at which our initial drafts succeeded. Rather than providing tables of statistics, we included simple, well-labeled figures that presented the key findings graphically and told the story visually.
- Conclude by summarizing, not extrapolating: The purpose of these briefs was to describe the changes that the pandemic wrought on Virginia’s public schools and convey that knowledge to decision makers charged with plotting a course forward. The briefs were not intended to provide explicit guidance or recommendations to those decision makers.
These strategies, of course, are also useful when writing for technical audiences. While their training and experience may equip them to consume research that doesn’t follow these approaches, applying them will enhance the impact of your research findings with even the most technical of readers.
Luke C. Miller is a Research Associate Professor at the University of Virginia’s School of Education and Human Development. He is the lead researcher and co-Principal Investigator on the IES-funded project led by VDOE in partnership with UVA.
Jennifer Piver-Renna is the Director of the Office of Research in the Department of Data, Research and Technology at the Virginia Department of Education. She is the state education agency (SEA) co-Investigator on the IES-funded project.
This blog was produced by Allen Ruby (Allen.Ruby@ed.gov), Associate Commissioner of the Policy and Systems Division, NCER.