IES Blog

Institute of Education Sciences

ED/IES SBIR: Advancing Research to Practice at Scale in Education


The Department of Education and Institute of Education Sciences Small Business Innovation Research Program (known as ED/IES SBIR) funds projects to develop and evaluate new education technology products that are ready to be widely deployed to address pressing educational needs.

In advance of IES Innovation Day at the ED Games Expo on September 21, 2023 at the Kennedy Center REACH in Washington, DC, this blog features a series of ED/IES SBIR awards funded to ready previously funded, evidence-based products for use at scale. Two of the projects highlighted below, one led by Jay Connor of Learning Ovations and the other by Clark McKown of xSEL Labs, will be featured as part of panels. This event is open to the public. Register for the Expo here.


Over its 20-year history, ED/IES SBIR has been well known for stimulating pioneering firms, such as Filament Games, Future Engineers, PocketLab, and Schell Games, to create entrepreneurial and novel education technology products. ED/IES SBIR has also established a track record for investing in a different set of projects—ones that facilitate the uptake of innovations originally developed in university or laboratory settings. This is important because even when researcher-developed innovations (for example, models, programs, and tools) are shown to have evidence for impact, many are not delivered at scale, preventing learners from fully benefiting from these innovations.

Examples of ED/IES SBIR Research to Practice Projects

Over the past two decades, ED/IES SBIR projects have provided useful models for how researchers can navigate and overcome the research-to-practice gap. ED/IES SBIR has made several awards to projects that were originally researcher-initiated, many through IES research grants. These researchers either founded a small business or partnered with an existing small business to develop and commercialize new education technology products to advance research to practice at scale in education.

The short descriptions of these projects below include links to IES website pages with additional information on the unique project models. These projects converted findings from research into scalable, technology-delivered interventions, added new components to existing research-based prototypes to enable feasible implementation and to improve the user experience, and upgraded technology systems to handle large numbers of users across numerous sites.

  • Learning Ovations: Through a series of IES- and NIH-funded research projects, Dr. Carol Connor led an academic team to develop a personalized early learning assessment, the A2i, and demonstrated its efficacy for improving literacy outcomes through multiple trials. To ready the A2i for use in larger numbers of settings and to improve data processing and reporting, Learning Ovations won an ED/IES SBIR award to upgrade the underlying data architecture and create automated supports and functionalities. In 2022, Scholastic acquired Learning Ovations, with plans for the A2i to be integrated into its suite of products. See the Learning Ovations Success Story for more information.
  • Mindset Works: Through an IES research grant in 2002 and with funding from other sources, Dr. Carol Dweck led a research team to develop the concept of the growth mindset—the understanding that ability and intelligence can develop with effort and learning. Lisa Blackwell, a member of the research team, founded Mindset Works and won a 2010 ED/IES SBIR award to develop training modules and animated lessons to deploy this instructional model through a multi-media website. A research grant funded in 2015 tested and demonstrated the efficacy of the technology-delivered Growth Mindset Intervention to improve outcomes of struggling learners. See the Mindset Works Success Story for more information.
  • Nimble Assessment Systems: Through IES and other grants, Dr. Michael Russell led a team of researchers to conduct foundational research and to develop and validate new forms of assessment. Informed by this research, Nimble Assessment Systems won an ED/IES SBIR award to develop NimbleTools, a set of universally designed accommodation tools to improve the accessibility of assessments for students with disabilities. Measured Progress acquired Nimble Assessment Systems, and the product was integrated into its suite of products for state and district assessments. See the Nimble Tools Success Story for more information.
  • Children’s Progress: Through NIH grants, Dr. Eugene Galanter led a research team to create a computer-based assessment that adapted to how a student responded to each question and delivered individualized narratives for each student. With awards from NIH SBIR and ED/IES SBIR, Children’s Progress developed a commercial version of the computer-adaptive dynamic assessment (CPAA) for early childhood literacy and math. In 2012, Northwest Evaluation Association (NWEA) acquired Children’s Progress, with the assessment technology incorporated into NWEA’s assessment platform and used at scale. See the Children’s Progress Success Story for more information.
  • Teachley: Through IES and NSF funded research, Dr. Herb Ginsburg led an academic team to develop prototype software programs for children from preschool to grade 3 to practice mathematics. In 2011, three members of the research team founded a small business, Teachley, which won ED/IES SBIR awards to extend the research model into easily playable, engaging, and widely used math game apps. See the Teachley Success Story for more information.
  • Analytic Measures: With funding from IES, Dr. Jared Bernstein led a research team to develop prototypes of automated oral reading fluency assessments that were administered to students during the NAEP and other national assessments by IES’s National Center for Education Statistics. Analytic Measures won ED/IES SBIR awards (here and here) to develop the school-ready version of these assessments. In 2022, Google acquired the intellectual property of the assessments with plans to incorporate the tools into its suite of products for education. See the Analytic Measures Success Story for more information.
  • Lightning Squad: Through awards from ED’s Office of Education Research and Improvement (now IES) and the Office of Elementary and Secondary Education, Drs. Nancy Madden and Bob Slavin led a research team to develop a model to make tutoring more cost-effective. With awards from ED/IES SBIR, Sirius Thinking partnered with Success For All to develop a mixed online and face-to-face multimedia intervention for struggling readers in grades 1 to 3. The program is now in wide-scale use in schools and in tutoring programs. See the Lightning Squad Success Story for more information.
  • Apprendis: With research grants from IES and other sources, Dr. Janice Gobert led teams at Worcester Polytechnic Institute and Rutgers University to develop and evaluate Inq-ITS (Inquiry Intelligent Tutoring System) virtual labs for students in grades 4 to 10. Apprendis was founded to commercialize Inq-ITS and won an ED/IES SBIR award to develop a teacher alert system that generates real-time insights to inform instruction. Inq-ITS is currently in wide-scale use.
  • Common Ground Publishing: Through IES and other grants, Drs. Bill Cope and Mary Kalantzis led a team of researchers to conduct research on new forms of technology-delivered formative assessment for student writing. Common Ground Publishing, a technology-based company spun out of a university tech-transfer office, won ED/IES SBIR awards (here and here) to develop CGScholar based on this research. CGScholar is an AI-based digital media learning management system designed to support student writing, learning, and formative assessment, and it has been in wide-scale use for several years. See the CGScholar Success Story for more information.
  • xSEL Labs: With funding from IES, Dr. Clark McKown led a team to develop screening assessments for social and emotional learning and conducted research to demonstrate the efficacy of the tools. xSEL Labs was founded to commercialize the assessments, and with an ED/IES SBIR award, is developing a platform to support educators and administrators using research-based SEL assessments. In 2023, 7 Mindsets acquired xSEL Labs to commercialize the platform at scale.

A New Program Area at ED/IES SBIR to Continue Advancing Research to Practice
With a history of awards to advance research to practice, ED/IES SBIR created a new program area in 2022 called Direct to Phase II to invest in more projects to develop commercially viable education technology products to ready existing evidence-based research for use at scale. The program resulted in one award (see here) in 2022. Please see the ED/IES SBIR solicitation page for information on the next opportunity for funding through its FY2024 program.


Stay tuned for updates on Twitter, Facebook, and LinkedIn as ED/IES SBIR continues to support projects to advance research to practice at scale.

Edward Metz (Edward.Metz@ed.gov) is a research scientist and the program manager for the Small Business Innovation Research Program at the US Department of Education’s Institute of Education Sciences.

 

Adult Foundational Skills Research: Reflections on PIAAC and Data on U.S. Adult Skills

In this blog, NCER program officer Dr. Meredith Larson interviews Dr. Holly Xie from NCES about the Program for the International Assessment of Adult Competencies (PIAAC), an OECD-developed international survey of adult skills in literacy, numeracy, and digital problem solving administered at least once a decade. PIAAC also collects information on adult activities (such as skill use at home or work, civic participation, etc.), demographics (such as level of education, race), and other factors (such as health outcomes). To date, NCER has funded three research grants (here, here, and here) and one training grant that relied on PIAAC data.

NCES has led the U.S. efforts in administering PIAAC and has been sharing results for over a decade. PIAAC in Cycle I (PIAAC I) included three waves of data collection in the United States, with the first data released in 2013. From PIAAC I, we learned a wealth of information about the skills of U.S. adults. For example, the 2017 wave of data collection found that the percentages of U.S. adults performing at the lowest levels were 19 percent in literacy, 29 percent in numeracy, and 24 percent in digital problem solving. As we look forward to learning from PIAAC II, Dr. Xie reflects on the products from PIAAC I and possibilities for PIAAC II (to be released in 2024).

What is your role at NCES and with PIAAC specifically?

I am the PIAAC national program manager and oversee all aspects of the PIAAC study in the United States, including development and design, data collection, analysis and reporting, and dissemination/outreach. I also represent the United States at PIAAC international meetings.

What is something you’re particularly excited about having produced during PIAAC I?

I am most excited about the U.S. PIAAC Skills Map. The Skills Map provides information on adult skills at the state and county levels. Users can explore adult skills in literacy and numeracy in their state or county and get estimates of literacy or numeracy proficiency overall and by age and education levels. Or they can compare a county to a state, a state to the nation, or compare counties (or states) to each other. The map also has demographic and socioeconomic data from the American Community Survey (ACS) to provide context for the state or county estimates. This YouTube video demonstrates what the map can do.

 

 

We also have other PIAAC web products and publications such as national and international reports, Data Points, and PIAAC publications that provide invaluable information on U.S. adult skills and interrelationships of those skills to other social, economic, and demographic factors.

Do you have examples of how information from PIAAC I has been used?

PIAAC data cover results at the national, state, and county levels, and as such, they can be useful for policymakers or decision makers who would like to know where things stand in terms of the skills of their adult population and where they need to allocate resources at these different levels of the system. In other words, PIAAC data can be useful for drafting targeted policies and programs that will benefit their population and constituencies.

For instance, at the national level, former Vice President Biden used information from PIAAC I in his report Ready to Work for the June 2014 reauthorization of the Workforce Innovation and Opportunity Act, known as WIOA. PIAAC was also cited in the discussion of extending the Second Chance Pell experiment as identified in the 2019 report titled Prisoners’ Eligibility for Pell Grants: Issues for Congress.

The Digital Equity Act of 2021 also leveraged the PIAAC. This legislation identifies particular populations that determine the funding formula. The quick guide to these populations uses PIAAC to estimate one of these populations: individuals with a language barrier, including individuals who are English learners and who have low levels of literacy.

Local governments have also used PIAAC products. For example, the Houston Mayor’s Office for Adult Literacy in collaboration with the Barbara Bush Foundation used the PIAAC Skills Map data to inform the Adult Literacy Blueprint.

And the adult education advocacy group, ProLiteracy, also used the PIAAC and the Skills Map to develop a toolkit for local program adult education and adult literacy program advocacy.

When will the results of PIAAC II be available, and how does this cycle differ from PIAAC I?

PIAAC II data collection began in 2022, and results, to be released in December 2024, will include information on the literacy, numeracy, and adaptive problem-solving skills of adults in the United States. The numeracy assessment now includes a measure of “numeracy components,” which focus on number sense, smaller/bigger number values, measurement, etc. This information will help us learn more about the skills of adults who have very low numeracy skills. The adaptive problem-solving component is a new PIAAC module and will measure the ability to achieve one’s goals in a dynamic situation in which a method for reaching a solution is not directly available.

PIAAC II will also include, for the first time, questions about financial literacy in the background questionnaire, using items on managing money and tracking spending and income, savings methods, and budgeting. These additional questions will allow people to explore relationships between foundational skills, financial literacy, and other constructs in PIAAC.

What types of research could you imagine stemming from the PIAAC II?

One of the most unique features of PIAAC (both PIAAC I and II) is the direct assessment of literacy, numeracy, and problem-solving skills (information that no other large-scale assessment of adults provides). Thirty-one countries, including the United States, participated in PIAAC II (2022/23), so researchers will be able to compare adult skills at the international level and study trends between PIAAC I and PIAAC II.

It’s worth noting that the data collection took place while we were still experiencing the effects of the COVID-19 pandemic. This may provide researchers opportunities to explore how the pandemic is related to adults’ skills, health, employment, training, and education status.

Where can the public access data from PIAAC?

Researchers can find information about the available data from the national U.S. PIAAC 2017 Household, PIAAC 2012/14 Household, and PIAAC 2014 Prison datasets, and international and trend datasets on the NCES Data Files page. PIAAC restricted-use data files contain more detailed information, such as continuous age and earnings variables, that can be used for more in-depth analysis. Accessing the restricted-use data requires a restricted-use license from NCES.

NCES also has an easy-to-use online analysis tool: the International Data Explorer (IDE). The IDE allows users to work directly with the PIAAC data and produce their own analyses, tables, regressions, and charts. An IDE tutorial video provides comprehensive, step-by-step instructions on how to use this tool. It contains detailed information about the content and capabilities of the PIAAC IDE, as well as how the PIAAC data are organized in the tool.


This blog was produced by Meredith Larson (Meredith.Larson@ed.gov), research analyst and program officer for postsecondary and adult education, NCER.

Recipe for High-Impact Research

The Edunomics Lab at Georgetown University’s McCourt School of Public Policy has developed NERD$, a national school-by-school spending data archive of per-pupil expenditures using the financial data states publish as required by the Every Student Succeeds Act (ESSA). In this guest blog post, Laura Anderson, Ash Dhammani, Katie Silberstein, Jessica Swanson, and Marguerite Roza of the Edunomics Lab discuss what they have learned about making their research more usable by practitioners.

 

When it comes to getting research and data used, it’s not just a case of “build it and they will come” (apologies to the movie “Field of Dreams”). In our experience, we’ve found that state, district, and school leaders want—and need—help translating data and research findings to inform decision making.

Researchers frequently use our IES-funded school-by-school spending archive called NERD$: National Education Resource Database on Schools. But we knew the data could have immediate, effective, and practical use for education leaders as well, to help them make spending decisions that advance equity and leverage funds to maximize student outcomes. Funding from the U.S. Department of Education’s National Comprehensive Center enabled us to expand on the IES-funded NERD$ by building the School Spending & Outcomes Snapshot (SSOS), a customizable, research-tested data visualization tool. We published a related guide on leading productive conversations on resource equity and outcomes and conducted numerous trainings for federal, state, and local leaders on using SSOS. The data visualizations we created drew on more than two years of pilot efforts with 26 school districts to find what works best to drive strategic conversations.

 

 

We see this task of translating research to practice as an essential element of our research efforts. Here, we share lessons learned from designing data tools with end users in mind, toward helping other researchers maximize the impact of their own work.

Users want findings converted into user-friendly data visualizations. Before seeing the bar chart below, leaders of Elgin Area School District U-46 in Illinois did not realize that they were not systematically allocating more money per student to schools with higher shares of economically disadvantaged students. It was only when they saw their schools lined up from lowest to highest by per pupil spending and color coded by the share of economically disadvantaged students (with green low and red high) that they realized their allocations were all over the map.

 

Note. This figure provides two pieces of information on schools in Elgin Area School District U-46: the spending per pupil for each school in dollars, and the share of students in each school categorized as economically disadvantaged, from 0 to 100% on a green-to-red color scale. With schools ordered from lowest to highest per-pupil spending, schools with more economically disadvantaged students fall across the whole spectrum, making it easy for users to see that there is little correlation between a school’s per-pupil spending and the share of economically disadvantaged students it serves.

 

Users want research converted into locally relevant numbers. Embedding district-by-district figures into visualizations takes effort but pays off. Practitioners and decisionmakers can identify what the research means for their own context, making the data more immediately actionable in their local community.

That means merging lots of data for users. We merged demographic, spending, and outcomes data for easy one-stop access in the SSOS tool. In doing so, users could then do things like compare peer schools with similar demographics and similar per-student spending levels, surfacing schools that have been able to do more for students with the same amount of money. Sharing with lower-performing schools what those standout schools are doing can open the door for peer learning toward improving schooling.
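
The one-stop merge described above can be sketched with a small, hypothetical example. The school IDs, field names, and figures below are purely illustrative (not actual SSOS or NERD$ data), as is the peer-matching rule:

```python
# Illustrative records keyed by a hypothetical school ID (not real NERD$ data).
demographics = {"S1": {"pct_econ_dis": 72}, "S2": {"pct_econ_dis": 70}, "S3": {"pct_econ_dis": 20}}
spending = {"S1": {"per_pupil": 10000}, "S2": {"per_pupil": 10200}, "S3": {"per_pupil": 10100}}
outcomes = {"S1": {"score": 48}, "S2": {"score": 61}, "S3": {"score": 74}}

# Merge the three sources into one record per school for one-stop access.
merged = {
    sid: {**demographics[sid], **spending[sid], **outcomes[sid]}
    for sid in demographics.keys() & spending.keys() & outcomes.keys()
}

def peers(sid, max_demo_gap=5, max_spend_gap=500):
    """Schools with similar demographics and similar per-student spending."""
    me = merged[sid]
    return [
        other for other, row in merged.items()
        if other != sid
        and abs(row["pct_econ_dis"] - me["pct_econ_dis"]) <= max_demo_gap
        and abs(row["per_pupil"] - me["per_pupil"]) <= max_spend_gap
    ]

# S2 is a demographic and spending peer of S1 but gets higher outcomes,
# surfacing a school that does more for students with the same money:
print(peers("S1"))  # ['S2']
```

Once records are merged this way, comparing a school's outcomes against its peers' is a simple lookup, which is the kind of peer comparison the SSOS tool surfaces for users.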

Data displays need translations to enable interpretation. In our pilot effort, we learned that at first glance, the SSOS-generated scatterplot below could be overwhelming or confusing. In focus groups, we found that by including translation statements, such as labeling the four quadrants clearly, the information became more quickly digestible.

 

Note. This figure provides three pieces of information on schools in Elgin Area School District U-46: the spending per pupil for each school in dollars, the share of students in each school categorized as economically disadvantaged (from 0 to 100%, on a green-to-red color scale), and the achievement level of each school based on a composite of its students’ math and reading scores. Each school falls into one of four quadrants on the figure, and a translation statement in each quadrant makes clear what it represents: 1) spend fewer dollars than peers but get higher student outcomes, 2) spend fewer dollars than peers and get lower student outcomes, 3) spend more dollars than peers and get higher student outcomes, and 4) spend more dollars than peers but get lower student outcomes. These translation statements were found to make it easier for users to understand the data presented in the figure.
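
The four-quadrant labeling described in the note can be sketched as a simple classification rule. This hypothetical snippet compares each school's spending and outcomes to the medians of its peer group; the school names, numbers, and median-based thresholds are illustrative assumptions, not the actual SSOS method:

```python
from statistics import median

# Illustrative per-pupil spending and outcome composites (not real district data).
schools = {
    "School A": {"per_pupil": 9500, "outcome": 75},
    "School B": {"per_pupil": 9800, "outcome": 40},
    "School C": {"per_pupil": 12000, "outcome": 80},
    "School D": {"per_pupil": 12500, "outcome": 35},
}

# Use peer medians as the quadrant boundaries (an assumption for this sketch).
spend_median = median(s["per_pupil"] for s in schools.values())
outcome_median = median(s["outcome"] for s in schools.values())

def quadrant(school):
    """Return the translation statement for a school's quadrant."""
    spend = "fewer" if schools[school]["per_pupil"] < spend_median else "more"
    result = "higher" if schools[school]["outcome"] >= outcome_median else "lower"
    return f"spends {spend} dollars than peers, gets {result} student outcomes"

print(quadrant("School A"))  # spends fewer dollars than peers, gets higher student outcomes
```

Attaching a plain-language label like this to each point is what turns an overwhelming scatterplot into the quickly digestible display the focus groups responded to.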

 

Short webinar trainings (like this one on SSOS) greatly enhanced usage. Users seem willing to come to short tutorials (preloaded with their data). Recording these tutorials meant that attendees could share them with their teams.

Users need guidance on how and when to use research findings. We saw usage increase when leaders were given specific suggestions on when and where to bring their data. For instance, we advised that school board members could bring NERD$ data to early stage budget workshops held in the spring. That way the data could inform spending decisions before district budgets get finalized and sent to the full board for approval in May or June.

It's worth the extra effort to make research usable and useful. These efforts to translate data and findings to make them accessible for end users have helped make the federally supported school-by-school spending dataset an indispensable resource for research, policy, and practice. NERD$ makes transparent how much money each school gets from its district. SSOS helps move the conversation beyond the “how much” into “what is the money doing” for student outcomes and equity, encouraging stakeholders to dig into how spending patterns are or are not related to performance. State education agencies are using the displays to conduct ESSA-required resource allocation reviews in districts that serve low-performing schools. The tool has more than 5,000 views, and we have trained more than 2,000 education leaders on how to use the data displays to improve schooling.

IES has made clear it wants research to be used, not to sit on a shelf. In our experience, designing visualizations and other tools around user needs can make data accessible, actionable, and impactful.


This blog was produced by Allen Ruby (Allen.Ruby@ed.gov), Associate Commissioner, NCER.

The 2023 IES PI Meeting: Building on 20 Years of IES Research to Accelerate the Education Sciences

On May 16-18, 2023, NCER and NCSER hosted our second virtual Principal Investigators (PI) Meeting. Our theme this year was Building on 20 Years of IES Research to Accelerate the Education Sciences. Because IES celebrated its 20th anniversary this past year, we used this meeting as an opportunity to reflect on and celebrate the success of IES and the education research community. Another goal was to explore how IES can further advance the education sciences and improve education outcomes for all learners.

Roddy Theobald (American Institutes for Research) and Eunsoo Cho (Michigan State University) graciously agreed to be our co-chairs this year. They provided guidance on the meeting theme and session strands and also facilitated our plenary sessions on Improving Data on Teachers and Staffing Challenges to Inform the Next 20 Years of Teacher Workforce Policy and Research and the Disproportionate Impact of COVID-19 on Student Learning and Contributions of Education Sciences to Pandemic Recovery Efforts. We want to thank them for their incredible efforts in making this year’s meeting a big success!

Here are a few highlights:

The meeting kicked off with opening remarks from IES Director, Mark Schneider, and a welcome from the Secretary of Education, Miguel Cardona. Director Schneider spoke about the importance of timeliness of research and translation of evidence to practice. IES is thinking about how best to support innovative approaches to education research that are transformative, embrace failure, are quick turnaround, and have an applied focus. He also discussed the need for data to move the field forward, specifically big data that researchers can use to address important policy questions and improve interventions and education outcomes. Secretary Cardona acknowledged the robust and useful evidence base that IES-funded researchers have generated over the last 20 years and emphasized the need for continued research to address historic inequities and accelerate pandemic recovery for students.

This year’s meeting fostered connections and facilitated deep conversations around meaningful and relevant topic areas. Across the three-day PI Meeting, we had over 1,000 attendees engaged in virtual room discussions around four main topic areas (see the agenda for a complete list of this year’s sessions):

  • Diversity, Equity, Inclusion, and Accessibility (DEIA)—Sessions addressed DEIA in education research
  • Recovering and Learning from the COVID-19 Pandemic—Sessions discussed accelerating pandemic recovery for students and educators, lessons learned from the pandemic, and opportunities to implement overdue changes to improve education
  • Innovative Approaches to Education Research—Sessions focused on innovative, forward-looking research ideas, approaches, and methods to improve education research in both the short- and long-term
  • Making Connections Across Disciplines and Communities—Sessions highlighted connections between research and practice communities and between researchers and projects across different disciplines and methodologies

We also had several sessions focused on providing information and opportunities to engage with IES leadership, including NCER Commissioner’s Welcome; NCSER Acting Commissioner’s Welcome; Open Science and IES; NCEE at 20: Past Successes and Future Directions; and The IES Scientific Review Process: Overview, Common Myths, and Feedback.

Many sessions also had a strong focus on increasing the practical impacts of education research by getting research into the hands of practitioners and policymakers. For example, the session on Beyond Academia: Navigating the Broader Research-Practice Pipeline highlighted the unique challenges of navigating the pipeline of information that flows between researchers and practitioners and identified strategies that researchers could implement in designing, producing, and publishing research-based products that are relevant to a broad audience. The LEARNing to Scale: A Networked Initiative to Prepare Evidence-Based Practices & Products for Scaling and The Road to Scale Up: From Idea to Intervention sessions centered on challenges and strategies for scaling education innovations from basic research ideas to applied and effective interventions. Finally, the Transforming Knowledge into Action: An Interactive Discussion focused on identifying and capturing ways to strengthen dissemination plans and increase the uptake of evidence-based resources and practices.

We ended the three-day meeting with trivia and a celebration. Who was the first Commissioner of NCSER? Which program officer started the same day the office closed because of the pandemic? Which program officer has dreams of opening a bakery? If you want to know the answers to these questions and more, we encourage you to look at the Concluding Remarks.  

Finally, although we weren’t in person this year, we learned from last year’s meeting that a real benefit of having a virtual PI meeting is our ability to record all the sessions and share them with the public. A part of IES’s mission is to widely disseminate IES-supported research. We encourage you to watch the recorded sessions and would be grateful if you shared them with your networks.

We want to thank the attendees who made this meeting so meaningful and engaging. This meeting would not have been a success without your contributions. We hope to see our grantees at the next PI Meeting, this time in-person!

If you have any comments, questions, or suggestions for how we can further advance the education sciences and improve education outcomes for all learners, please do not hesitate to contact NCER Commissioner Liz Albro (Elizabeth.Albro@ed.gov) or NCSER Acting Commissioner Jackie Buckley (Jacquelyn.Buckley@ed.gov). We look forward to hearing from you.

 

Recommendations for Using Social Media for Effective Dissemination of Education Research

When it comes to using research to inform practice, teachers tend to want succinct tips and strategies that can work in their own classrooms. Researchers can use social media channels to tailor messages from their research findings and disseminate them where teachers are already active. In this guest blog, Dr. Sam Van Horne, University of Delaware, describes the work that researchers conducted as part of the originally IES-funded Center for Research Use in Education. The goal of the center was to understand the gaps between researcher and practitioner perspectives on the usefulness of research in practice so that it could address how researchers communicate about their research, how practitioners can use research more effectively in their classrooms, and how to build stronger connections between the two communities.

Using a large cross-sectional survey of school-based practitioners, we found that practitioners report consuming research through multiple channels, and more than half reported using social media in the last year with the goal of improving their practice. Social media channels, therefore, provide education researchers with an opportunity to connect with practitioners, but where are researchers likely to find teachers on social media? And how can researchers distill their findings for sharing in mediums that are vastly different from traditional academic forms? Here are some recommendations based on our research.

  • Finding and Connecting with Educators on Social Media: One review of research about social media use among teachers found that Facebook and Twitter are some of the main sites that teachers use. But teachers also use Pinterest and Instagram as methods for learning from other teachers about teaching strategies. Posting in multiple channels may make it more likely that a message can reach educators. To find educators, researchers can search for public lists on education-focused topics or see who is using hashtags like #edtwitter, #TeachersofInstagram, or #EduTooters. By following lists, researchers can efficiently find educators to follow and tag (i.e., add the educator’s username to a social media message) with their messages about research-informed practice. This can aid with directly engaging practitioners and beginning conversations about applying research to practice.
  • Using Hashtags or Tagging Specific People on Social Media: Social media networks like Twitter can overwhelm users with the volume of content being shared, so it’s critical to use tools like hashtags to find a practice-focused community that may be interested in applying research findings. Users can search for content using hashtags that are discipline specific or likely to reach educators, such as #edutooters on Mastodon, #edutwitter on Twitter, or #teachersofinstagram or #teachersfollowteachers on Instagram. The key is identifying teachers or knowledge brokers (i.e., people or organizations who support practitioners in applying research evidence to teaching practice) who may be interested in the message and who may retweet or boost it to their own followers.
  • Tailoring Messages to Focus on What Practitioners Can Do: Once the audience is identified, researchers can ask themselves, “What do I want this group to consider doing based on these research findings?” Social media messages can then incorporate those ideas rather than just summarizing research findings. Messages describing how research can inform education practice should be economical and capture interest. A link to the original paper can be appended to a post for those who want to read more.
  • Linking to Publicly Available Versions of Research: When possible, include links to publications or resources in publicly available repositories rather than to versions in subscription-based journals. IES grantees can increase the visibility of their research by submitting their publications as grantee submissions in ERIC. This not only fulfills public-access requirements but also gives practitioners access to important information for improving teaching practice.
  • Incorporating Visual Elements to Attract Attention to Education Research Findings: Messages that incorporate visual elements or video are well suited for sharing on social media. The visual abstract is a succinct summary of research findings designed for social media, and researchers have found that visual abstracts are shared more often on social media platforms than plain text about research. You can find guidance on creating visual abstracts here, though the authors suggest collaborating with a designer. Visual abstracts are especially well suited for visual platforms like Pinterest or Instagram. Some journals make a regular practice of posting brief video messages from authors who explain their research study and the significance of the findings. Animations can also attract more attention to messages about research.

Disseminating education research on social media is not a “one-and-done” practice but should be part of a professional social media presence. Many guides exist for developing a professional social media presence, such as these for Twitter and LinkedIn. In addition to posting about research and its implications for practice, researchers can post about general issues in the field. This helps build a following that will be more likely to see posts about research. There are other benefits to disseminating research on social media channels, including providing researchers with metrics on how many times their messages are shared or retweeted (or boosted, on Mastodon), as well as enabling research on optimal ways to share findings with the broadest audience. In fact, Dr. Farley-Ripple, a leader of the Center for Research Use in Education, and colleagues have received funding from the National Science Foundation for a research study investigating the effectiveness of different dissemination strategies on social media, including the effectiveness of the translational visual abstract.

Connecting with educators on social media is a process. Researchers can begin by creating a presence on the social media networks where educators are found, then post often about education and use hashtags to make their messages visible to educators. Messages can be succinct posts that include recommendations or strategies that are feasible for educators to adopt, or multimedia messages like the translational visual abstract that attract attention in a medium suited to visuals. Over time, it’s possible to learn what works and what doesn’t and adapt strategies for reaching educators, while keeping in mind that the tools and networks available now will themselves continue to adapt and change.


Sam Van Horne is a Data Scientist at the Center for Research Use in Education and the Center for Research in Education and Social Policy at the University of Delaware.

This blog was produced by Corinne Alfeld (Corinne.Alfeld@ed.gov), program officer, NCER.