IES Blog

Institute of Education Sciences

Recipe for High-Impact Research

The Edunomics Lab at Georgetown University’s McCourt School of Public Policy has developed NERD$, a national school-by-school spending data archive of per-pupil expenditures using the financial data states publish as required by the Every Student Succeeds Act (ESSA). In this guest blog post, Laura Anderson, Ash Dhammani, Katie Silberstein, Jessica Swanson, and Marguerite Roza of the Edunomics Lab discuss what they have learned about making their research more usable by practitioners.

 

When it comes to getting research and data used, it’s not just a case of “build it and they will come” (apologies to the movie “Field of Dreams”). In our experience, we’ve found that state, district, and school leaders want—and need—help translating data and research findings to inform decision making.

Researchers frequently use our IES-funded school-by-school spending archive called NERD$: National Education Resource Database on Schools. But we knew the data could have immediate, effective, and practical use for education leaders as well, to help them make spending decisions that advance equity and leverage funds to maximize student outcomes. Funding from the U.S. Department of Education's National Comprehensive Center enabled us to expand on the IES-funded NERD$ by building the School Spending & Outcomes Snapshot (SSOS), a customizable, research-tested data visualization tool. We published a related guide on leading productive conversations on resource equity and outcomes and conducted numerous trainings for federal, state, and local leaders on using SSOS. The data visualizations we created drew on more than two years of pilot efforts with 26 school districts to find what works best to drive strategic conversations.

 

 

We see this task of translating research to practice as an essential element of our research efforts. Here, we share lessons learned from designing data tools with end users in mind, to help other researchers maximize the impact of their own work.

Users want findings converted into user-friendly data visualizations. Before seeing the bar chart below, leaders of Elgin Area School District U-46 in Illinois did not realize that they were not systematically allocating more money per student to schools with higher shares of economically disadvantaged students. It was only when they saw their schools lined up from lowest to highest by per pupil spending and color coded by the share of economically disadvantaged students (with green low and red high) that they realized their allocations were all over the map.

 

Note. This figure provides two pieces of information on schools in Elgin Area School District U-46: each school's per-pupil spending in dollars, and the share of its students categorized as economically disadvantaged (0 to 100%, shaded from green to red). Schools are ordered from lowest to highest per-pupil spending. Lined up this way, schools with more economically disadvantaged students show no pattern; they fall across the whole spectrum of low- to high-spending schools. The display made it easy for users to see that there is little correlation between a school's per-pupil spending and the share of economically disadvantaged students it serves.

 

Users want research converted into locally relevant numbers. Embedding district-by-district figures into visualizations takes effort but pays off. Practitioners and decisionmakers can identify what the research means for their own context, making the data more immediately actionable in their local community.

That means merging lots of data for users. We merged demographic, spending, and outcomes data for easy one-stop access in the SSOS tool. In doing so, users could then do things like compare peer schools with similar demographics and similar per-student spending levels, surfacing schools that have been able to do more for students with the same amount of money. Sharing with lower-performing schools what those standout schools are doing can open the door for peer learning toward improving schooling.
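As a rough sketch of this kind of merge-and-compare workflow, the snippet below joins three hypothetical school-level tables on a school identifier and then finds a school's demographic and spending peers. All field names, tolerances, and values here are illustrative, not the actual NERD$/SSOS schema:

```python
import pandas as pd

# Hypothetical school-level tables; real NERD$/SSOS field names differ.
demographics = pd.DataFrame({
    "school_id": ["A", "B", "C", "D"],
    "pct_econ_disadv": [80, 78, 30, 32],
})
spending = pd.DataFrame({
    "school_id": ["A", "B", "C", "D"],
    "ppe": [11000, 11200, 9500, 9600],  # per-pupil expenditure, in dollars
})
outcomes = pd.DataFrame({
    "school_id": ["A", "B", "C", "D"],
    "achievement": [0.42, 0.61, 0.55, 0.54],  # composite math/reading index
})

# One-stop table: join the three sources on the school identifier.
merged = demographics.merge(spending, on="school_id").merge(outcomes, on="school_id")

def peers_of(df, school_id, demo_tol=5, ppe_tol=500):
    """Return schools within the given tolerances of the target school's
    demographics and per-pupil spending (tolerances are illustrative)."""
    row = df.loc[df["school_id"] == school_id].iloc[0]
    mask = (
        (df["school_id"] != school_id)
        & (df["pct_econ_disadv"].sub(row["pct_econ_disadv"]).abs() <= demo_tol)
        & (df["ppe"].sub(row["ppe"]).abs() <= ppe_tol)
    )
    return df[mask]

# School B is school A's peer here, with higher achievement at similar
# spending, the kind of standout school that can anchor peer learning.
print(peers_of(merged, "A")[["school_id", "achievement"]])
```

In this toy data, school B spends about what school A does and serves a similar population, yet posts a higher achievement index, which is the comparison SSOS is designed to surface.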

Data displays need translations to enable interpretation. In our pilot effort, we learned that at first glance, the SSOS-generated scatterplot below could be overwhelming or confusing. In focus groups, we found that by including translation statements, such as labeling the four quadrants clearly, the information became more quickly digestible.

 

Note. This figure provides three pieces of information on schools in Elgin Area School District U-46: each school's per-pupil spending in dollars, the share of its students categorized as economically disadvantaged (0 to 100%, shaded from green to red), and its achievement level based on a composite of its students' math and reading scores. Each school falls into one of four categories on the figure, and a translation statement labels each category to make clear what it represents: 1) spend fewer dollars than peers but get higher student outcomes, 2) spend fewer dollars than peers but get lower student outcomes, 3) spend more dollars than peers but get higher student outcomes, and 4) spend more dollars than peers but get lower student outcomes. These translation statements made it easier for users to interpret the data presented in the figure.
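The four-quadrant translation described above can be generated programmatically by comparing each school to the district's midpoints. The sketch below uses hypothetical schools and district medians as the cut points, an assumption on our part; SSOS's actual peer-comparison logic may differ:

```python
import pandas as pd

# Hypothetical school-level data; actual SSOS fields and values differ.
schools = pd.DataFrame({
    "school": ["Oak", "Pine", "Elm", "Ash"],
    "ppe": [9000, 12000, 9500, 12500],    # per-pupil expenditure, in dollars
    "achievement": [0.7, 0.4, 0.3, 0.8],  # composite math/reading index
})

# Use district medians as the quadrant cut points (an illustrative choice).
ppe_mid = schools["ppe"].median()
ach_mid = schools["achievement"].median()

def quadrant(row):
    """Translate a school's position into one of four plain-language statements."""
    spend = "fewer" if row["ppe"] <= ppe_mid else "more"
    outcome = "higher" if row["achievement"] > ach_mid else "lower"
    return f"spends {spend} dollars than peers, gets {outcome} student outcomes"

schools["translation"] = schools.apply(quadrant, axis=1)
print(schools[["school", "translation"]])
```

Labeling each point with its plain-language quadrant, rather than leaving readers to decode axes, is exactly the kind of translation statement the focus groups found made the scatterplot digestible.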

 

Short webinar trainings (like this one on SSOS) greatly enhanced usage. Users seem willing to come to short tutorials (preloaded with their data). Recording these tutorials meant that attendees could share them with their teams.

Users need guidance on how and when to use research findings. We saw usage increase when leaders were given specific suggestions on when and where to bring their data. For instance, we advised that school board members could bring NERD$ data to early stage budget workshops held in the spring. That way the data could inform spending decisions before district budgets get finalized and sent to the full board for approval in May or June.

It's worth the extra effort to make research usable and useful. These efforts to translate data and findings for end users have helped make the federally supported school-by-school spending dataset an indispensable resource for research, policy, and practice. NERD$ makes transparent how much money each school gets from its district. SSOS helps move the conversation beyond the “how much” into “what is the money doing” for student outcomes and equity, encouraging stakeholders to dig into how spending patterns are or are not related to performance. State education agencies are using the displays to conduct ESSA-required resource allocation reviews in districts that serve low-performing schools. The tool has more than 5,000 views, and we have trained more than 2,000 education leaders on how to use the data displays to improve schooling.

IES has made clear it wants research to be used, not to sit on a shelf. In our experience, designing visualizations and other tools around user needs can make data accessible, actionable, and impactful.


This blog was produced by Allen Ruby (Allen.Ruby@ed.gov), Associate Commissioner, NCER.

The 2023 IES PI Meeting: Building on 20 Years of IES Research to Accelerate the Education Sciences

On May 16-18, 2023, NCER and NCSER hosted our second virtual Principal Investigators (PI) Meeting. Our theme this year was Building on 20 Years of IES Research to Accelerate the Education Sciences. Because IES marked its 20th anniversary this past year, we used this meeting as an opportunity to reflect on and celebrate the success of IES and the education research community. Another goal was to explore how IES can further advance the education sciences and improve education outcomes for all learners.

Roddy Theobald (American Institutes for Research) and Eunsoo Cho (Michigan State University) graciously agreed to be our co-chairs this year. They provided guidance on the meeting theme and session strands and also facilitated our plenary sessions on Improving Data on Teachers and Staffing Challenges to Inform the Next 20 Years of Teacher Workforce Policy and Research and the Disproportionate Impact of COVID-19 on Student Learning and Contributions of Education Sciences to Pandemic Recovery Efforts. We want to thank them for their incredible efforts in making this year’s meeting a big success!

Here are a few highlights:

The meeting kicked off with opening remarks from IES Director Mark Schneider and a welcome from Secretary of Education Miguel Cardona. Director Schneider spoke about the importance of timely research and of translating evidence to practice. IES is thinking about how best to support innovative approaches to education research that are transformative, embrace failure, deliver quick turnaround, and have an applied focus. He also discussed the need for data to move the field forward, specifically big data that researchers can use to address important policy questions and improve interventions and education outcomes. Secretary Cardona acknowledged the robust and useful evidence base that IES-funded researchers have generated over the last 20 years and emphasized the need for continued research to address historic inequities and accelerate pandemic recovery for students.

This year’s meeting fostered connections and facilitated deep conversations around meaningful and relevant topic areas. Across the three-day PI Meeting, we had over 1,000 attendees engaged in virtual room discussions around four main topic areas (see the agenda for a complete list of this year’s sessions):

  • Diversity, Equity, Inclusion, and Accessibility (DEIA)—Sessions addressed DEIA in education research
  • Recovering and Learning from the COVID-19 Pandemic—Sessions discussed accelerating pandemic recovery for students and educators, lessons learned from the pandemic, and opportunities to implement overdue changes to improve education
  • Innovative Approaches to Education Research—Sessions focused on innovative, forward-looking research ideas, approaches, and methods to improve education research in both the short- and long-term
  • Making Connections Across Disciplines and Communities—Sessions highlighted connections between research and practice communities and between researchers and projects across different disciplines and methodologies

We also had several sessions focused on providing information and opportunities to engage with IES leadership, including NCER Commissioner’s Welcome; NCSER Acting Commissioner’s Welcome; Open Science and IES, NCEE at 20: Past Successes and Future Directions; and The IES Scientific Review Process: Overview, Common Myths, and Feedback.

Many sessions also had a strong focus on increasing the practical impact of education research by getting research into the hands of practitioners and policymakers. For example, the session on Beyond Academia: Navigating the Broader Research-Practice Pipeline highlighted the unique challenges of navigating the pipeline of information that flows between researchers and practitioners and identified strategies that researchers could implement in designing, producing, and publishing research-based products that are relevant to a broad audience. The LEARNing to Scale: A Networked Initiative to Prepare Evidence-Based Practices & Products for Scaling and The Road to Scale Up: From Idea to Intervention sessions centered on challenges and strategies for scaling education innovations from basic research ideas to applied and effective interventions. Finally, the Transforming Knowledge into Action: An Interactive Discussion session focused on identifying and capturing ways to strengthen dissemination plans and increase the uptake of evidence-based resources and practices.

We ended the three-day meeting with trivia and a celebration. Who was the first Commissioner of NCSER? Which program officer started the same day the office closed because of the pandemic? Which program officer has dreams of opening a bakery? If you want to know the answers to these questions and more, we encourage you to look at the Concluding Remarks.  

Finally, although we weren’t in person this year, we learned from last year’s meeting that a real benefit of having a virtual PI meeting is our ability to record all the sessions and share them with the public. A part of IES’s mission is to widely disseminate IES-supported research. We encourage you to watch the recorded sessions and would be grateful if you shared them with your networks.

We want to thank the attendees who made this meeting so meaningful and engaging. This meeting would not have been a success without your contributions. We hope to see our grantees at the next PI Meeting, this time in-person!

If you have any comments, questions, or suggestions for how we can further advance the education sciences and improve education outcomes for all learners, please do not hesitate to contact NCER Commissioner Liz Albro (Elizabeth.Albro@ed.gov) or NCSER Acting Commissioner Jackie Buckley (Jacquelyn.Buckley@ed.gov). We look forward to hearing from you.

 

Have a Cost Analysis to Plan or Execute? We Have a Module for That

This blog is part of a guest series by the Cost Analysis in Practice (CAP) project team.

Analyzing an intervention’s costs is one of IES’s nine SEER principles. Cost analysis is not just about the dollar value of an intervention; it provides key information to education decision-makers about the personnel, materials, facilities, and other inputs needed to implement an intervention or policy with fidelity. But planning and executing any kind of economic evaluation, such as a cost analysis or cost-effectiveness analysis, involves many steps.

The IES-funded Cost Analysis in Practice Project (CAP Project) has developed a series of five free, online modules on cost analysis. Each module includes a sequence of short videos (3-17 minutes each) and resources to facilitate each of the four main stages of a cost analysis: study design, data collection, data analysis, and reporting (register here for the CAP Project online modules).

The modules are timely for anyone submitting a grant application to the IES FY 2024 grant programs that require a cost analysis. In addition, cost studies are included in the Education Innovation and Research (EIR) Mid-phase or Expansion grants. For your grant application, you’ll likely only need parts of Modules 1 and 2, Introduction to Cost Analysis and Designing a Cost Analysis. You can save the rest for when you receive a grant.

You should review the IES Request for Applications (RFA) to determine what kind of economic evaluation, if any, is required for your IES application. You can also review the CAP Project’s RFA requirements chart, which summarizes our take on what is required and what is recommended for each IES RFA. If your grant application does not require a cost analysis but you want to include one, we created a flowchart to help you decide which type of evaluation might make sense for your situation: see Module 1 Video 2b. We also provide a brief example of each kind of economic evaluation in Module 1 Video 3. 

If cost analysis is new to you, Module 1 Video 1 explains what “costs” really are. Module 1 Video 2a introduces the ingredients method and a demonstration of why it’s important to differentiate between economic costs and expenditures. Module 1 Video 4 walks you through the four stages of a cost analysis and points out when to use specific CAP Project resources such as our Checklist for Cost Analysis Plans, Timeline of Activities for Cost Analysis, and Cost Analysis Templates (the “CAPCATs”). If you prefer reading to watching videos, our Cost Analysis Standards & Guidelines cover this ground in more depth.

When you’re ready to plan your cost or cost-effectiveness analysis, head to Module 2. The introductory video (Module 2 Video 1) discusses a few critical decisions you need to make early on that will affect how much of your study budget should be dedicated to the economic evaluation—no one likes surprises there. Module 2 Videos 2 and 3 walk you through the design of an economic evaluation, illustrating each design feature using Reading Recovery as an example. Module 2 Video 4 presents a few scenarios to help you think about which costs you will estimate and how the costs of the intervention you plan to study compare to the costs of business as usual. Module 2 Video 5 reviews a timeline and key activities for each stage of your economic evaluation. The content in Modules 1 and 2 should help you develop a robust plan for an economic evaluation so that you’ll be all set to begin the study as soon as you are funded.

Modules 3-5 cover data collection, analysis, and reporting. You may want to skim these now, or at least watch the brief introductory videos for an overview of what’s in store for you and your cost analyst. These modules can help you execute your cost study.


Fiona Hollands is the Founder & Managing Director of EdResearcher. She studies the effectiveness and costs of educational programs with the goal of helping education practitioners and policymakers optimize the use of resources in education to promote better student outcomes.

Jaunelle Pratt-Williams is a Senior Research Scientist at NORC at the University of Chicago. She leads economic evaluations and mixed-methods policy research studies to improve the educational opportunities for historically underserved students.

This blog was produced by Allen Ruby (Allen.Ruby@ed.gov), Associate Commissioner, NCER.

New Standards to Advance Equity in Education Research

One year ago, IES introduced a new equity standard and associated recommendations to its Standards for Excellence in Education Research (SEER). The intent of this standard, as well as the other eight SEER standards, is to complement IES’s focus on rigorous evidence building with guidance and supports for practices that have the potential to make research transformational. The addition of equity to SEER is part of IES’s ongoing mission to improve academic achievement and access to educational opportunities for all learners (see IES Diversity Statement). IES is mindful, however, that to authentically and rigorously integrate equity into research, education researchers may need additional resources and tools. To that end, IES hosted a Technical Working Group (TWG) meeting of experts to gather input on the existing tools and resources the education community could use to implement the new SEER equity standard, and to identify notable gaps where new tools and resources are needed. A summary of the TWG panel discussion and recommendations is now available.

The TWG panel recommended several relevant resources and provided concrete suggestions for ways IES can support education researchers’ learning and growth, including training centers, coaching sessions, webinars, checklists, and new resource development, acknowledging that different researchers may need different kinds of supports. The meeting summary includes recommendations for tools and resources, along with important considerations (including best practices) for researchers as they work to embed equity in their research.

The new SEER equity standard and accompanying recommendations have been integrated throughout the current FY 2024 Request for Applications. By underscoring the importance of equity, the research IES supports will be both rigorous and relevant to the needs of all learners.


This blog was written by NCER program officer Christina Chhin. If you have questions or feedback regarding the equity TWG, please contact Christina Chhin (Christina.Chhin@ed.gov) or Katina Stapleton (Katina.Stapleton@ed.gov), co-chair of the IES Diversity Council. If you have any questions or feedback regarding the equity standard or associated recommendations, please email NCEE.Feedback@ed.gov.

Encouraging the Use of LGBTQI+ Education Research Data

Until recently, limited data existed in education research on the Lesbian, Gay, Bisexual, Transgender, Queer, and Intersex (LGBTQI+) community and their experiences. As this area of interest continues to grow, education researchers are learning how to effectively collect these data, interpret their implications, and use them to help improve the educational outcomes of LGBTQI+-identifying students. In this blog post, we review current federal recommendations for data collection and encourage researchers to submit FY 2024 applications focused on the educational experiences and outcomes of LGBTQI+-identifying students.

Collecting Data on Sexual Orientation and Gender Identities

In January 2023, the Office of the Chief Statistician of the United States released a report with recommendations on how to effectively design federal statistics surveys to account for sexual orientation and gender identities (SOGI). While this report is for a federal audience, the recommendations are relevant and useful for education researchers who wish to measure the identities and experiences of those in the LGBTQI+ community. Some suggestions include—

  • Provide multiple options for sexual orientation identification (for example, gay/lesbian, straight, bisexual, use other term)
  • Provide a two-question set to measure gender identity: one question asking for sex assigned at birth, and one for current self-identification
  • Provide write-in response and multiple-response options for SOGI-related questions
  • Allow respondents to proceed through the survey if they choose not to answer, unless answers to these items are critical for the data collection
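As a rough sketch, the recommendations above could be encoded as a simple survey schema. Everything below, field names, options, and question wording, is an illustrative paraphrase on our part, not the federal report's exact text:

```python
# Illustrative survey schema reflecting the SOGI recommendations:
# a two-question gender identity set, write-in support, and skippable items.
# Wording and option lists are paraphrased, not the report's exact text.
gender_identity_items = [
    {
        "id": "sex_assigned_at_birth",
        "text": "What sex were you assigned at birth?",
        "options": ["Male", "Female"],
        "allow_write_in": False,
        "allow_skip": True,  # respondents may decline and continue
    },
    {
        "id": "current_gender",
        "text": "What is your current gender?",
        "options": ["Male", "Female", "Transgender", "Use a different term"],
        "allow_write_in": True,  # write-in response supported
        "allow_skip": True,
    },
]

sexual_orientation_item = {
    "id": "sexual_orientation",
    "text": "Which of the following best represents how you think of yourself?",
    "options": ["Gay or lesbian", "Straight", "Bisexual", "Use a different term"],
    "allow_write_in": True,
    "allow_skip": True,
}
```

Encoding the recommendations this way makes the skip and write-in behavior explicit for whoever implements the survey instrument.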

Education researchers looking to incorporate SOGI data into their studies can also use existing SOGI data collected by the National Center for Education Statistics (NCES) to support their research. A new NCES blog outlines the studies that collect SOGI information and outlines some initial findings from that data.

Funding Opportunities for Research to Improve Outcomes of LGBTQI+ students

In alignment with the SEER Equity Standard, IES encourages researchers to submit applications to the FY 2024 research grant competitions that support the academic and social-behavioral outcomes of students who identify as LGBTQI+. IES is especially interested in research proposals that involve—

  • Describing the educational experiences and outcomes of LGBTQI+ students
  • Creating safe and inclusive learning environments that support the needs of all LGBTQI+ students
  • Identifying promising practices for school-based health services and supports, especially mental health services, that are accessible to and supportive of LGBTQI+ students
  • Identifying systems-level approaches that reduce barriers to accessing and participating in high quality learning environments for LGBTQI+ students

Check out our funding opportunities page for more information about our FY 2024 requests for applications. If you have specific questions about the appropriateness of your research for a specific FY 2024 research competition, please contact the relevant program officer listed in the request for applications.


This blog is part of a 3-part Inside IES Research blog series on sexual orientation and gender identity in education research in observance of Pride month. The other posts discuss the feedback from the IES LGBTQI+ Listening and Learning session and the first ever learning game featuring a canonically nonbinary character.

This blog was produced by Virtual Student Federal Service intern Audrey Im with feedback from IES program officers Katina Stapleton (NCER - Katina.Stapleton@ed.gov) and Katherine Taylor (NCSER - Katherine.Taylor@ed.gov) and NCES project officers Elise Christopher (Elise.Christopher@ed.gov) and Maura Spiegelman (Maura.Spiegelman@ed.gov).