IES Blog

Institute of Education Sciences

Recipe for High-Impact Research

The Edunomics Lab at Georgetown University’s McCourt School of Public Policy has developed NERD$, a national school-by-school spending data archive of per-pupil expenditures using the financial data states publish as required by the Every Student Succeeds Act (ESSA). In this guest blog post, Laura Anderson, Ash Dhammani, Katie Silberstein, Jessica Swanson, and Marguerite Roza of the Edunomics Lab discuss what they have learned about making their research more usable by practitioners.

 

When it comes to getting research and data used, it’s not just a case of “build it and they will come” (apologies to the movie “Field of Dreams”). In our experience, we’ve found that state, district, and school leaders want—and need—help translating data and research findings to inform decision making.

Researchers frequently use our IES-funded school-by-school spending archive called NERD$: National Education Resource Database on Schools. But we knew the data could have immediate, effective, and practical use for education leaders as well, to help them make spending decisions that advance equity and leverage funds to maximize student outcomes. Funding from the U.S. Department of Education’s National Comprehensive Center enabled us to expand on the IES-funded NERD$ by building the School Spending & Outcomes Snapshot (SSOS), a customizable, research-tested data visualization tool. We published a related guide on leading productive conversations on resource equity and outcomes and conducted numerous trainings for federal, state, and local leaders on using SSOS. The data visualizations we created drew on more than two years of pilot efforts with 26 school districts to find what works best to drive strategic conversations.

 

 

We see this task of translating research to practice as an essential element of our research efforts. Here, we share lessons learned from designing data tools with end users in mind, to help other researchers maximize the impact of their own work.

Users want findings converted into user-friendly data visualizations. Before seeing the bar chart below, leaders of Elgin Area School District U-46 in Illinois did not realize that they were not systematically allocating more money per student to schools with higher shares of economically disadvantaged students. It was only when they saw their schools lined up from lowest to highest by per pupil spending and color coded by the share of economically disadvantaged students (with green low and red high) that they realized their allocations were all over the map.

 

Note. This figure provides two pieces of information on schools in Elgin Area School District U-46: the spending per pupil for each school in dollars, and the share of students in each school categorized as economically disadvantaged, from 0 to 100%, using colors from green (low) to red (high). The schools are lined up from lowest to highest per pupil spending. Lined up this way, schools with more economically disadvantaged students show no pattern; they fall across the full spectrum from low- to high-spending schools. The figure shows little correlation between a school’s per pupil spending and the percent of economically disadvantaged students it serves, making that lack of relationship easy for users to see.

 

Users want research converted into locally relevant numbers. Embedding district-by-district figures into visualizations takes effort but pays off. Practitioners and decisionmakers can identify what the research means for their own context, making the data more immediately actionable in their local community.

That means merging lots of data for users. We merged demographic, spending, and outcomes data for easy one-stop access in the SSOS tool. Users could then compare peer schools with similar demographics and similar per-student spending levels, surfacing schools that have been able to do more for students with the same amount of money. Sharing what those standout schools are doing with lower-performing schools can open the door to peer learning that improves schooling.

Data displays need translations to enable interpretation. In our pilot effort, we learned that at first glance, the SSOS-generated scatterplot below could be overwhelming or confusing. In focus groups, we found that by including translation statements, such as labeling the four quadrants clearly, the information became more quickly digestible.

 

Note. This figure provides three pieces of information on schools in Elgin Area School District U-46: the spending per pupil for each school in dollars, the share of students in each school categorized as economically disadvantaged (from 0 to 100%, using colors from green to red), and each school’s achievement level based on a composite of its students’ math and reading scores. Schools fall into one of four quadrants on the figure, and a translation statement placed in each quadrant makes clear what it represents: 1) spend fewer dollars than peers but get higher student outcomes, 2) spend fewer dollars than peers but get lower student outcomes, 3) spend more dollars than peers but get higher student outcomes, and 4) spend more dollars than peers but get lower student outcomes. These translation statements made it easier for users to understand the data presented in the figure.
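As a rough illustration of the quadrant logic the note describes, the labeling could be sketched as below. The function name, thresholds, and example numbers are hypothetical, not drawn from the actual SSOS tool, which compares schools against demographically similar peers.

```python
# Hypothetical sketch of the four-quadrant translation statements.
# All names and numbers are illustrative, not from the SSOS tool itself.

def quadrant_label(spend_per_pupil, outcome_score,
                   peer_median_spend, peer_median_outcome):
    """Return the translation statement for a school's quadrant."""
    spends_more = spend_per_pupil > peer_median_spend
    higher_outcomes = outcome_score > peer_median_outcome
    if not spends_more and higher_outcomes:
        return "spends fewer dollars than peers but gets higher student outcomes"
    if not spends_more and not higher_outcomes:
        return "spends fewer dollars than peers but gets lower student outcomes"
    if spends_more and higher_outcomes:
        return "spends more dollars than peers but gets higher student outcomes"
    return "spends more dollars than peers but gets lower student outcomes"

# Example: a school spending $9,500 per pupil (peers: $11,000) with an
# outcome composite of 0.72 (peers: 0.55) lands in the standout quadrant.
print(quadrant_label(9500, 0.72, 11000, 0.55))
# spends fewer dollars than peers but gets higher student outcomes
```

Pairing each quadrant with a plain-language sentence like this is what made the scatterplot quickly digestible in the focus groups.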

 

Short webinar trainings (like this one on SSOS) greatly enhanced usage. Users seem willing to come to short tutorials (preloaded with their data). Recording these tutorials meant that attendees could share them with their teams.

Users need guidance on how and when to use research findings. We saw usage increase when leaders were given specific suggestions on when and where to bring their data. For instance, we advised that school board members could bring NERD$ data to early stage budget workshops held in the spring. That way the data could inform spending decisions before district budgets get finalized and sent to the full board for approval in May or June.

It's worth the extra effort to make research usable and useful. These efforts to translate data and findings to make them accessible for end users have helped make the federally supported school-by-school spending dataset an indispensable resource for research, policy, and practice. NERD$ makes transparent how much money each school gets from its district. SSOS helps move the conversation beyond the “how much” into “what is the money doing” for student outcomes and equity, encouraging stakeholders to dig into how spending patterns are or are not related to performance. State education agencies are using the displays to conduct ESSA-required resource allocation reviews in districts that serve low-performing schools. The tool has more than 5,000 views, and we have trained more than 2,000 education leaders on how to use the data displays to improve schooling.

IES has made clear it wants research to be used, not to sit on a shelf. In our experience, designing visualizations and other tools around user needs can make data accessible, actionable, and impactful.


This blog was produced by Allen Ruby (Allen.Ruby@ed.gov), Associate Commissioner, NCER.

The 2023 IES PI Meeting: Building on 20 Years of IES Research to Accelerate the Education Sciences

On May 16-18, 2023, NCER and NCSER hosted our second virtual Principal Investigators (PI) Meeting. Our theme this year was Building on 20 Years of IES Research to Accelerate the Education Sciences. Because IES marked its 20th anniversary this past year, we used this meeting as an opportunity to reflect on and celebrate the success of IES and the education research community. Another goal was to explore how IES can further advance the education sciences and improve education outcomes for all learners.

Roddy Theobald (American Institutes for Research) and Eunsoo Cho (Michigan State University) graciously agreed to be our co-chairs this year. They provided guidance on the meeting theme and session strands and also facilitated our plenary sessions on Improving Data on Teachers and Staffing Challenges to Inform the Next 20 Years of Teacher Workforce Policy and Research and the Disproportionate Impact of COVID-19 on Student Learning and Contributions of Education Sciences to Pandemic Recovery Efforts. We want to thank them for their incredible efforts in making this year’s meeting a big success!

Here are a few highlights:

The meeting kicked off with opening remarks from IES Director Mark Schneider and a welcome from the Secretary of Education, Miguel Cardona. Director Schneider spoke about the importance of timeliness of research and translation of evidence to practice. IES is thinking about how best to support innovative approaches to education research that are transformative, embrace failure, deliver quick turnaround, and have an applied focus. He also discussed the need for data to move the field forward, specifically big data that researchers can use to address important policy questions and improve interventions and education outcomes. Secretary Cardona acknowledged the robust and useful evidence base that IES-funded researchers have generated over the last 20 years and emphasized the need for continued research to address historic inequities and accelerate pandemic recovery for students.

This year’s meeting fostered connections and facilitated deep conversations around meaningful and relevant topic areas. Across the three-day PI Meeting, we had over 1,000 attendees engaged in virtual room discussions around four main topic areas (see the agenda for a complete list of this year’s sessions):

  • Diversity, Equity, Inclusion, and Accessibility (DEIA)—Sessions addressed DEIA in education research
  • Recovering and Learning from the COVID-19 Pandemic—Sessions discussed accelerating pandemic recovery for students and educators, lessons learned from the pandemic, and opportunities to implement overdue changes to improve education
  • Innovative Approaches to Education Research—Sessions focused on innovative, forward-looking research ideas, approaches, and methods to improve education research in both the short- and long-term
  • Making Connections Across Disciplines and Communities—Sessions highlighted connections between research and practice communities and between researchers and projects across different disciplines and methodologies

We also had several sessions focused on providing information and opportunities to engage with IES leadership, including NCER Commissioner’s Welcome; NCSER Acting Commissioner’s Welcome; Open Science and IES, NCEE at 20: Past Successes and Future Directions; and The IES Scientific Review Process: Overview, Common Myths, and Feedback.

Many sessions also had a strong focus on increasing the practical impacts of education research by getting research into the hands of practitioners and policymakers. For example, the session on Beyond Academia: Navigating the Broader Research-Practice Pipeline highlighted the unique challenges of navigating the pipeline of information that flows between researchers and practitioners and identified strategies that researchers could implement in designing, producing, and publishing research-based products that are relevant to a broad audience. The LEARNing to Scale: A Networked Initiative to Prepare Evidence-Based Practices & Products for Scaling and The Road to Scale Up: From Idea to Intervention sessions centered on challenges and strategies for scaling education innovations from basic research ideas to applied and effective interventions. Finally, the Transforming Knowledge into Action: An Interactive Discussion focused on identifying and capturing ways to strengthen dissemination plans and increase the uptake of evidence-based resources and practices.

We ended the three-day meeting with trivia and a celebration. Who was the first Commissioner of NCSER? Which program officer started the same day the office closed because of the pandemic? Which program officer has dreams of opening a bakery? If you want to know the answers to these questions and more, we encourage you to look at the Concluding Remarks.  

Finally, although we weren’t in person this year, we learned from last year’s meeting that a real benefit of having a virtual PI meeting is our ability to record all the sessions and share them with the public. A part of IES’s mission is to widely disseminate IES-supported research. We encourage you to watch the recorded sessions and would be grateful if you shared them with your networks.

We want to thank the attendees who made this meeting so meaningful and engaging. This meeting would not have been a success without your contributions. We hope to see our grantees at the next PI Meeting, this time in-person!

If you have any comments, questions, or suggestions for how we can further advance the education sciences and improve education outcomes for all learners, please do not hesitate to contact NCER Commissioner Liz Albro (Elizabeth.Albro@ed.gov) or NCSER Acting Commissioner Jackie Buckley (Jacquelyn.Buckley@ed.gov). We look forward to hearing from you.

 

Have a Cost Analysis to Plan or Execute? We Have a Module for That

This blog is part of a guest series by the Cost Analysis in Practice (CAP) project team.

Analyzing an intervention’s costs is one of IES’s nine SEER principles. Cost analysis is not just about the dollar value of an intervention; it provides key information to education decision-makers about the personnel, materials, facilities, and other inputs needed to implement an intervention or policy with fidelity. But planning and executing any kind of economic evaluation, such as a cost analysis or cost-effectiveness analysis, involves many steps.

The IES-funded Cost Analysis in Practice Project (CAP Project) has developed a series of five free, online modules on cost analysis. Each module includes a sequence of short videos (3‑17 minutes each) and resources to facilitate each of the four main stages of a cost analysis: study design, data collection, data analysis, and reporting (register here for the CAP Project online modules).

The modules are timely for anyone submitting a grant application to the IES FY 2024 grant programs that require a cost analysis. In addition, cost studies are included in the Education Innovation and Research (EIR) Mid-phase or Expansion grants. For your grant application, you’ll likely only need parts of Modules 1 and 2, Introduction to Cost Analysis and Designing a Cost Analysis. You can save the rest for when you receive a grant.

You should review the IES Request for Applications (RFA) to determine what kind of economic evaluation, if any, is required for your IES application. You can also review the CAP Project’s RFA requirements chart, which summarizes our take on what is required and what is recommended for each IES RFA. If your grant application does not require a cost analysis but you want to include one, we created a flowchart to help you decide which type of evaluation might make sense for your situation: see Module 1 Video 2b. We also provide a brief example of each kind of economic evaluation in Module 1 Video 3. 

If cost analysis is new to you, Module 1 Video 1 explains what “costs” really are. Module 1 Video 2a introduces the ingredients method and a demonstration of why it’s important to differentiate between economic costs and expenditures. Module 1 Video 4 walks you through the four stages of a cost analysis and points out when to use specific CAP Project resources such as our Checklist for Cost Analysis Plans, Timeline of Activities for Cost Analysis, and Cost Analysis Templates (the “CAPCATs”). If you prefer reading to watching videos, our Cost Analysis Standards & Guidelines cover this ground in more depth.

When you’re ready to plan your cost or cost-effectiveness analysis, head to Module 2. The introductory video (Module 2 Video 1) discusses a few critical decisions you need to make early on that will affect how much of your study budget should be dedicated to the economic evaluation—no one likes surprises there. Module 2 Videos 2 and 3 walk you through the design of an economic evaluation, illustrating each design feature using Reading Recovery as an example. Module 2 Video 4 presents a few scenarios to help you think about which costs you will estimate and how the costs of the intervention you plan to study compare to the costs of business as usual. Module 2 Video 5 reviews a timeline and key activities for each stage of your economic evaluation. The content in Modules 1 and 2 should help you develop a robust plan for an economic evaluation so that you’ll be all set to begin the study as soon as you are funded.

Modules 3-5 cover data collection, analysis, and reporting. You may want to skim these now, or at least watch the brief introductory videos for an overview of what’s in store for you and your cost analyst. These modules can help you execute your cost study.


Fiona Hollands is the Founder & Managing Director of EdResearcher. She studies the effectiveness and costs of educational programs with the goal of helping education practitioners and policymakers optimize the use of resources in education to promote better student outcomes.

Jaunelle Pratt-Williams is a Senior Research Scientist at NORC at the University of Chicago. She leads economic evaluations and mixed-methods policy research studies to improve the educational opportunities for historically underserved students.

This blog was produced by Allen Ruby (Allen.Ruby@ed.gov), Associate Commissioner, NCER.

Letter from the Acting NCSER Commissioner: Providing Clarity on NCSER Fiscal Year 2023 Funding and Fiscal Year 2024 Competitions

The IES director recently posted a blog indicating that IES had to return approximately $44 million in unobligated funds of the $100 million total American Rescue Plan (ARP) funding IES received to help the nation's students recover from the learning losses of the pandemic. NCSER was hard hit by the rescission of these funds.

As transparent as we try to be, admittedly, the federal budgeting process is not always clear. Many of you have reached out with concerns about the potential impact of these ARP rescissions on your current grants and future funding opportunities. Please allow me to explain the current context of fiscal year 2023 funding and forecast for fiscal year 2024.

NCSER's Grant Funding: Where the Money Comes From and How It Is Spent

NCSER funds come from the Research in Special Education (RiSE) appropriation, which is one small part of the larger IES appropriations account. RiSE supports all of NCSER’s typical grant competitions. We also contribute money from this account to our share of other IES activities such as the grant peer review process and the PI meeting.

As those of you who have been funded by NCSER know, we provide grant funding on an annual basis. Even though we fund projects annually, once we make an award, we are committed to providing annual costs for a continuing project through the duration of the designated study period. Consequently, the amount of money available to support new research and training awards each year is contingent, in part, upon the number of current awards and their outyear costs. Any time NCSER funds a high number of new awards (and thereby commits to funding every award through the duration of the designated study period), there will be less money available for new awards the following year, unless the RiSE program appropriation receives an increase from Congress.

Deciding what new grant competitions in NCSER might look like in any given year requires that we balance many factors, including (1) the amount of funding Congress is likely to appropriate to RiSE (note that we typically have to make decisions before we know for sure how much money we will have), (2) projected continuation costs for existing awards and commitments, (3) estimates of total funding available for new awards based on 1 and 2 above, and (4) a best-guess prediction of the percent of applicants that will be successful, based on trends over time, in any single NCSER-sponsored competition. If you have been around NCSER long enough, you know our funding is typically very tight, sometimes so tight we can’t offer any competitions (FY 2014) or need to significantly limit available competitions (FY 2017).
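The arithmetic behind factors (1) through (3) can be sketched roughly as below. Every name and figure here is hypothetical for illustration only; none are actual NCSER or RiSE amounts.

```python
# Illustrative sketch of the budgeting arithmetic described above.
# All names and dollar figures are hypothetical, not actual NCSER data.

def estimate_new_awards(projected_appropriation,
                        continuation_costs,
                        shared_ies_costs,
                        avg_award_size):
    """Estimate how many new awards a year's budget could support."""
    # Money left after honoring outyear commitments and shared IES activities
    available = projected_appropriation - continuation_costs - shared_ies_costs
    if available <= 0:
        return 0  # no room for new competitions this year (as in FY 2014)
    return int(available // avg_award_size)

# Example with made-up numbers, in millions of dollars:
new_awards = estimate_new_awards(
    projected_appropriation=60.0,  # guess made before Congress finalizes
    continuation_costs=45.0,       # commitments to already-funded grants
    shared_ies_costs=5.0,          # peer review, PI meeting, etc.
    avg_award_size=2.0,            # typical total award size
)
print(new_awards)  # 5
```

The sketch shows why a rescission that forces continuation costs onto the RiSE account shrinks the pool for new awards dollar for dollar, even when the appropriation itself is unchanged.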

NCSER’s ARP-funded Research Projects

In FY 2022, once again, we found ourselves with insufficient funds to hold our typical special education research grant competitions. At the same time, IES received $100 million in ARP funding. With NCSER’s share of those funds, we chose, in part, to hold a new Research to Accelerate Pandemic Recovery in Special Education grants program to fund projects that addressed the urgent challenges faced by districts and schools in supporting learners with or at risk for disabilities, their teachers, and their families in the aftermath of the pandemic. The competition was funded solely using ARP funds. To be clear, RiSE funds were never intended to be a source of funding for these projects. 

By now you may be predicting where this blog is going…

NCSER was thrilled to be able to fund nine research grants through this ARP-funded competition, all of which have the potential to improve outcomes significantly and rapidly for students with or at risk for disabilities. A little less than two months ago, NCSER was in the process of documenting annual progress and approving continuation funding for these grantees when the ARP funds were unexpectedly rescinded (returned to the U.S. Treasury as part of the debt ceiling deal). These projects were in various stages of progress, but each was just finishing the first year of the grant and it is fair to say that, overall, a significant amount of work (and grant costs) remained at the time of this rescission.

As I mentioned, NCSER has operated from the perspective that when we make a commitment to funding your grant, we prioritize your continuation costs first before funding new awards or initiatives. In other words, if we are ever in a budget crunch, we will meet our existing commitments first before using money on new activities. Although the ARP funding source was eliminated, our commitment to those FY 2022 ARP-funded grants remained. We chose to use money from our RiSE account to pay for current and future continuation costs for these grants. I hope everyone can understand that this difficult decision honors our standard practice of prioritizing existing commitments.

NCSER’s FY 2023 Research Competition

After accounting for the cost of the continuations that would have otherwise been supported using ARP funds, NCSER’s ability to fund new awards in the FY 2023 grant competition was limited. Further exacerbating our new budget shortfall was the much higher than expected (based on past application and funding trends) number of FY 2023 applications that were rated outstanding or excellent. This is a great testament to the field and the work that you all do! Unfortunately, this success came at the same time as this unexpected, very large budget rescission. Something had to give and sadly, what gave was our ability to fund many worthy new grants. It was not a decision made lightly or without thought for those grants left unfunded. I know that many of you are disappointed in this outcome.

It takes a tremendous amount of effort to produce a grant application and we recognize your continued efforts to work with NCSER staff throughout the pre-award process. It is heartbreaking to find out a grant you submitted won’t be funded, despite having such a strong score. NCSER staff were heartbroken with you.

Outlook for FY 2024

What does this all mean for NCSER moving forward? Despite the setback this year, based on available information we have now, NCSER plans to offer research competitions in FY 2024. We are committed to offering new funding opportunities whenever possible to continue the tremendous strides we have made in improving the depth, breadth, and quality of special education research in this country.

NCSER and NCER will be notifying the field very soon regarding FY 2024 competitions, so stay tuned. If you have not done so already, please sign up for our Newsflash to stay current on IES happenings, including the release of new funding opportunities.

Although the challenges we experienced this year certainly were disappointing, I want to end on what I see as the silver lining that emerged from all of this. Namely, since NCSER’s first research competitions in 2006, the capacity in the field to conduct high-quality research and carry out excellent research training has grown tremendously. We should not forget how far we have come, and how bright NCSER’s future is. Our funding has not (yet!) kept pace with that growth, but that is a subject for another blog…

Please reach out to me at Jacquelyn.Buckley@ed.gov with questions or comments. I'm always happy to hear from you!

New Standards to Advance Equity in Education Research

One year ago, IES introduced a new equity standard and associated recommendations to its Standards for Excellence in Education Research (SEER). The intent of this standard, as well as the other eight SEER standards, is to complement IES’s focus on rigorous evidence building with guidance and supports for practices that have the potential to make research transformational. The addition of equity to SEER is part of IES’s ongoing mission to improve academic achievement and access to educational opportunities for all learners (see IES Diversity Statement). IES is mindful, however, that to authentically and rigorously integrate equity into research, education researchers may need additional resources and tools. To that end, IES hosted a Technical Working Group (TWG) meeting of experts to gather input on existing tools and resources the education community could use to implement the new SEER equity standard, and to identify notable gaps where new tools and resources are needed. A summary of the TWG panel discussion and recommendations is now available.

The TWG panel recommended several relevant resources and provided concrete suggestions for ways IES can support education researchers’ learning and growth, including training centers, coaching sessions, webinars, checklists, and new resource development, acknowledging that different researchers may need different kinds of supports. The meeting summary includes recommendations for tools and resources, along with important considerations for researchers, including recommendations for best practices, as they work to embed equity in their research.

The new SEER equity standard and accompanying recommendations have been integrated throughout the current FY 2024 Request for Applications. By underscoring the importance of equity, the research IES supports will be both rigorous and relevant to address the needs of all learners.


This blog was written by NCER program officer Christina Chhin. If you have questions or feedback regarding the equity TWG, please contact Christina Chhin (Christina.Chhin@ed.gov) or Katina Stapleton (Katina.Stapleton@ed.gov), co-chair of the IES Diversity Council. If you have any questions or feedback regarding the equity standard or associated recommendations, please email NCEE.Feedback@ed.gov.