IES Blog

Institute of Education Sciences

On Being Brief: Skills and Supports for Translating Research to Practice via Brief Reports

Have you ever found yourself at a gathering fumbling to find the words to describe your academic work to family and friends? Do you find it difficult to communicate your scholarship to, and build partnerships with, non-researcher audiences? Are you an early career or seasoned researcher interested in disseminating research to practitioners, policymakers, or community members but struggling to find the best way to do so? Or are you a senior researcher mentoring a trainee through this process?

If your answer to any of these questions is “YES!”, then read on! Writing research briefs is an instrumental part of professional development but, for many researchers, not a formal aspect of training. Drawing on our experience writing research briefs, here are some tips for the challenging, but rewarding, process of translating your research into a brief.

 

Why Write a Brief?

Research briefs deliver the essence of research findings in a relatable manner to a non-researcher audience. Briefs can

  • Broaden your research’s impact by disseminating findings to non-researcher audiences, including communities historically marginalized in research
  • Strengthen university-community partnerships and relationships by transparently communicating with partners
  • Facilitate future partnerships and employment through increased visibility

 

What Exactly IS a Research Brief?

A research brief is a concise, non-technical summary of the key takeaways from a research study. Briefs communicate research insights to the public, thereby translating research and evidence-based practices into real-world settings.  

The focus of a brief varies depending on the intended audience. Provide explicit recommendations for practice if you want to reach a practitioner audience; explore policy and infrastructure needs when writing for a policymaker audience.

Plan to share briefs in diverse settings: with research partners (participating districts, schools, and teachers), professional networks (at conference presentations), and broader audiences (on personal websites).

 

Lead researchers on our research team are part of a statewide partnership to support the dissemination of the Positive Behavioral Interventions and Supports framework. This partnership involves researchers and representatives from the Maryland Department of Education, a large behavioral health organization, and all school districts within the state. Researchers regularly write and share briefs with the statewide group, taking into account evolving needs and interests. Check out some of the briefs here.

 

Briefs Should Be…

 

  • Brief. Condensing a full-length manuscript into a two-page document is challenging, but doing so helps distill the study’s real-world implications and identify steps for future work. Two pages is optimal because the brief can be printed double-sided and shared as a one-pager.

 

  • Accessible. Graduate-level coursework in statistics should not be required to understand a brief. The usual audience for briefs will not have the time or energy to absorb methodological details or nuanced theory. Write as if you were presenting to a family member or your favorite high school teacher.

 

  • Visually appealing. A visual representation of an idea will capture attention better than text and help with brevity. Your paper likely already has some type of visual (for example, a logic model) that you can tweak. If not, draw on the visual design skills you have already honed creating posters and conference presentations! This process may have you rethinking how you visually present your research, even in peer-reviewed publications.

 

  • A team effort. Individuals bring diverse skills and strengths to the research team. The study’s lead author may be able to articulate results, but a co-author may have the vision to creatively illustrate those findings in a figure. Make use of each member’s skills by making brief-writing an iterative team effort.

 

  • Tailored to your audience. If you are developing a brief for a specific audience, ensure that key takeaways and recommendations are relevant and actionable. In some cases, you may have a more technical audience to whom you may present the data more formally. In our own experience, district partners have sometimes asked for more numbers and statistics.

 

Building Expertise with Brief Writing

Training in doctoral programs, which often encourages lengthy, detail-oriented writing, runs counter to the skills needed for writing research briefs. While some programs offer training in writing for non-academic audiences, we advocate for a greater focus on this skill during graduate training. All of the postdoctoral authors of this blog got their first exposure to writing research briefs on this research team. Inspired by our own on-the-job training, we offer the following recommendations for mentors:

 

  • Frame writing the brief as an opportunity. Briefs may feel tangential to the graduate student research mission and challenging to existing skill sets. Frame the process instead as an opportunity to develop an integral set of professional skills; this framing helps with motivation as well as execution.

 

  • Provide a template for the brief that can be easily tweaked and tailored, so that graduate students have a model for the finished product and formatting issues are minimized. Microsoft Publisher and Word offer visually appealing flyer templates that are easy to populate, and organizations that publish briefs may provide their own templates and layouts.

 

  • Know your audience and their interest in the work. The audience should be well-defined (e.g., practitioners, policymakers, or other researchers) and their perspective and interests well-understood. Although knowledge of the audience could come from prior work experience, direct communication with the audience is desirable to gain a firm grasp of their lived experience. If direct interaction is not feasible, mentors should “think aloud” to mentees about which details, words, and images would be most effective and appealing for this audience.

 

  • Early scaffolding should be followed by continued support. After serving as a co-author on a brief, a graduate student can transition to writing their own. They may still need continued feedback from mentors and co-authors before completing the task autonomously.

 

  • Provide graduate students with targeted experiences and formal training opportunities to facilitate proficiency and efficacy in brief-writing. This might include:
    • University-based or paid workshops for students and early career faculty focused on writing for non-academic audiences
    • Opportunities to interface directly with practitioners

 

Concluding Thoughts

Writing research briefs is a key translational activity for educational researchers but, for many, it requires skills not cultivated in formal training. Our research team has been developing and sharing research briefs regularly over the past few years, and this remains an evolving and rewarding process for all of us. We hope this post has provided some helpful information as you continue your journey to be brief!

 


Summer S. Braun is a postdoctoral research associate at Youth-Nex at the University of Virginia School of Education and Human Development. She will be joining the Psychology Department at the University of Alabama as an Assistant Professor.

Daniel A. Camacho is a Licensed Clinical Psychologist and a postdoctoral research associate at the University of Virginia School of Education and Human Development.

Chelsea A.K. Duran is a postdoctoral research associate at the University of Virginia School of Education and Human Development in Youth-Nex: The UVA Center to Promote Effective Youth Development. She will be starting a position with the University of Minnesota in the summer of 2021.

Lora J. Henderson is a Licensed Clinical Psychologist and postdoctoral research associate at the University of Virginia who will soon be starting as an assistant professor in the Department of Graduate Psychology at James Madison University.

Elise T. Pas is an Associate Scientist (research faculty) at the Johns Hopkins Bloomberg School of Public Health.

*Note: Authors are listed alphabetically and contributed equally to the preparation of this post.

 

National Spending for Public Schools Increases for the Sixth Consecutive Year in School Year 2018–19

NCES just released a finance tables report, Revenues and Expenditures for Public Elementary and Secondary Education: FY19 (NCES 2021-302), which draws from data in the National Public Education Financial Survey (NPEFS). The results show that spending[1] on elementary and secondary education increased in school year 2018–19 (fiscal year [FY] 2019), after adjusting for inflation. This is the sixth consecutive year-over-year increase in education spending since 2012–13, following declines in each of the prior 4 years (2009–10 to 2012–13).

Current expenditures per pupil[2] for the day-to-day operation of public elementary and secondary schools rose to $13,187 in FY19, an increase of 2.1 percent from FY18, after adjusting for inflation (figure 1).[3] Current expenditures per pupil also increased over the previous year in FY18 (by 0.9 percent), in FY17 (by 1.7 percent), in FY16 (by 2.8 percent), in FY15 (by 2.7 percent), and in FY14 (by 1.2 percent). In FY19, education spending was 11.8 percent higher than at its lowest point following the Great Recession (FY13) and 6.1 percent higher than spending in FY10, prior to the decline.


Figure 1. National inflation-adjusted current expenditures per pupil for public elementary and secondary school districts: FY10 through FY19

NOTE: Spending is reported in constant FY19 dollars, based on the Consumer Price Index (CPI).
SOURCE: U.S. Department of Education, National Center for Education Statistics, Common Core of Data (CCD), "National Public Education Financial Survey," fiscal years 2010 through 2018 Final Version 2a; and fiscal year 2019, Provisional Version 1a; and Digest of Education Statistics 2019, retrieved January 8, 2021, from https://nces.ed.gov/programs/digest/d19/tables/dt19_106.70.asp.


Without adjusting for geographic cost differences, current expenditures per pupil ranged from $7,950 in Utah to $24,882 in New York (figure 2). In addition to New York, current expenditures per pupil were highest in the District of Columbia ($22,831), New Jersey ($21,331), Vermont ($21,217), and Connecticut ($21,140). In addition to Utah, current expenditures per pupil were lowest in Idaho ($8,043), Arizona ($8,773), Nevada ($9,126), and Oklahoma ($9,203).


Figure 2. Current expenditures per pupil for public elementary and secondary education, by state: FY19

NOTE: These data are not adjusted for geographic cost differences.
SOURCE: U.S. Department of Education, National Center for Education Statistics, Common Core of Data (CCD), “National Public Education Financial Survey (NPEFS),” FY19, Provisional Version 1a and “State Nonfiscal Survey of Public Elementary/Secondary Education,” school year 2018–19, Provisional Version 1a.


These new NPEFS data offer researchers extensive opportunities to investigate state and national patterns of revenues and expenditures. Explore the report and learn more.


[1] Spending refers to current expenditures. Current expenditures comprise expenditures for the day-to-day operation of schools and school districts for public elementary/secondary education, including expenditures for staff salaries and benefits, supplies, and purchased services. Current expenditures include instruction, instruction-related support services (e.g., social work, health, psychological services), and other elementary/secondary current expenditures but exclude expenditures on capital outlay, other programs, and interest on long-term debt.
[2] Per pupil expenditures are calculated using student membership derived from the State Nonfiscal Survey of Public Elementary/Secondary Education. In some states, adjustments are made to ensure consistency between membership and reported fiscal data. More information on these adjustments can be found in the data file documentation at https://nces.ed.gov/ccd/files.asp.
[3] In order to compare spending from one year to the next, expenditures are converted to constant dollars, which adjusts figures for inflation. Inflation adjustments utilize the Consumer Price Index (CPI) published by the U.S. Department of Labor, Bureau of Labor Statistics. For comparability to fiscal education data, NCES adjusts the CPI from a calendar year to a school fiscal year basis (July through June). See Digest of Education Statistics 2019, table 106.70, retrieved January 8, 2021, from https://nces.ed.gov/programs/digest/d19/tables/dt19_106.70.asp.
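Footnotes 2 and 3 describe two simple calculations: dividing current expenditures by student membership, and restating a nominal figure in constant dollars via a CPI ratio. The sketch below illustrates both; the district figures and CPI values are invented for illustration and are not actual NPEFS or BLS data.

    # Minimal sketch of the per-pupil and constant-dollar calculations
    # described in footnotes 2 and 3. All numbers are hypothetical.

    def per_pupil(current_expenditures: float, membership: int) -> float:
        """Current expenditures divided by student membership (footnote 2)."""
        return current_expenditures / membership

    def to_constant_dollars(nominal: float, cpi_year: float, cpi_base: float) -> float:
        """Restate a nominal amount in base-year dollars using a CPI ratio (footnote 3)."""
        return nominal * (cpi_base / cpi_year)

    # Hypothetical district: $660 million in current expenditures, 50,000 students.
    nominal_fy18 = per_pupil(660_000_000, 50_000)     # $13,200 per pupil, nominal
    cpi_fy18, cpi_fy19 = 250.0, 255.0                 # assumed fiscal-year CPI values
    real_fy18 = to_constant_dollars(nominal_fy18, cpi_fy18, cpi_fy19)
    print(f"FY18 per-pupil spending in constant FY19 dollars: ${real_fy18:,.2f}")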

 

By Stephen Q. Cornman, NCES; Lei Zhou, Activate Research; and Malia Howell, U.S. Census Bureau

Students’ Access to the Internet and Digital Devices at Home

This blog continues a robust discussion about National Center for Education Statistics (NCES) data collected in the recent past that can illuminate the issue of students’ access to the internet and digital devices at home. A few years ago—well before the coronavirus pandemic and stay-at-home orders shone a bright light on the inequities across the nation—NCES began dedicating resources to improve its data collection and policymaking around education technology and equity at the district, state, and national levels.

The 2019 National Assessment of Educational Progress (NAEP) reading questionnaire asked 4th- and 8th-grade students if they had internet access at home and if there was a computer or tablet at home that they could use (referred to in this blog as having “digital access”). These data provide a pre–coronavirus pandemic snapshot of students’ digital access. Across all public schools, 81 percent of 4th-grade students and 88 percent of 8th-grade students said that they had digital access (figures 1 and 2). Thus, 19 percent of 4th-grade students and 12 percent of 8th-grade students in public schools may have lacked either internet access or the devices required to carry out distance learning.


Figure 1. Percentage of 4th-grade public school students in the NAEP reading assessment that reported having internet access and a computer or tablet at home, by state: 2019

* Significantly different from the National Public estimate at the .05 level of statistical significance.
NOTE: Statistical comparison tests are based on unrounded numbers. Not all apparent differences between estimates are statistically significant.
SOURCE: U.S. Department of Education, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2019 Reading Assessment.


Figure 2. Percentage of 8th-grade public school students in the NAEP reading assessment that reported having internet access and a computer or tablet at home, by state: 2019

* Significantly different from the National Public estimate at the .05 level of statistical significance.
NOTE: Statistical comparison tests are based on unrounded numbers. Not all apparent differences between estimates are statistically significant.
SOURCE: U.S. Department of Education, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2019 Reading Assessment.


There were also differences across states in 2019. For 4th-grade students, the percentages who had digital access varied by state, ranging from 70 percent in New Mexico to 88 percent in New Jersey (table 1). Arizona, Arkansas, Idaho, Kansas, Mississippi, Missouri, New Mexico, Oklahoma, Oregon, Tennessee, Texas, and Wyoming had lower percentages of students who had digital access than the national average (figure 1 and table 1). For 8th-grade students, the percentages who had access ranged from 81 percent in Oklahoma to 93 percent in Connecticut (table 1). Alabama, Arizona, Arkansas, Hawaii, Kentucky, Louisiana, Mississippi, Nevada, Oklahoma, Tennessee, Texas, and West Virginia had lower percentages of students who had access than the national average (figure 2 and table 1).


Table 1. Percentage of public school students in the NAEP reading assessment that reported having internet access and a computer or tablet at home, by grade and state: 2019

 

State                     Grade 4             Grade 8
                          Percent    s.e.     Percent    s.e.
   National public        81         (0.2)    88         (0.2)
Alabama                   79         (1.2)    86         (0.8)
Alaska
Arizona                   78         (0.9)    84         (0.9)
Arkansas                  73         (0.9)    83         (1.1)
California                81         (0.9)    88         (0.9)
Colorado
Connecticut               85         (0.8)    93         (0.6)
Delaware                  81         (0.9)    90         (0.6)
District of Columbia      83         (0.8)    90         (0.6)
DoDEA                     88         (0.7)    96         (0.4)
Florida                   85         (0.7)    89         (0.7)
Georgia                   83         (0.9)    90         (0.7)
Hawaii                    79         (1.0)    86         (0.8)
Idaho                     77         (0.9)    88         (0.8)
Illinois                  83         (0.8)    90         (0.6)
Indiana                   80         (0.9)    90         (1.1)
Iowa                      81         (0.9)    90         (0.7)
Kansas                    78         (0.9)    88         (0.7)
Kentucky                  81         (0.8)    87         (0.7)
Louisiana                 79         (1.0)    85         (0.9)
Maine                     82         (0.9)    89         (0.7)
Maryland                  82         (0.8)    91         (0.6)
Massachusetts             87         (0.8)    93         (0.7)
Michigan                  80         (1.0)    90         (0.8)
Minnesota                 83         (1.0)    92         (0.7)
Mississippi               77         (1.2)    84         (0.7)
Missouri                  78         (0.8)    89         (0.8)
Montana
Nebraska                  81         (0.9)    90         (0.7)
Nevada                    79         (1.0)    85         (0.7)
New Hampshire
New Jersey                88         (0.8)    93         (0.6)
New Mexico                70         (1.2)    82         (0.8)
New York                  84         (0.7)    91         (0.7)
North Carolina            81         (0.8)    89         (0.8)
North Dakota              81         (1.0)    90         (0.7)
Ohio                      82         (0.9)    91         (0.7)
Oklahoma                  73         (1.1)    81         (0.9)
Oregon                    77         (1.0)    87         (0.8)
Pennsylvania              85         (0.8)    91         (0.7)
Rhode Island              84         (0.8)    90         (0.6)
South Carolina            81         (1.0)    90         (0.9)
South Dakota
Tennessee                 77         (0.9)    86         (0.9)
Texas                     75         (0.9)    82         (1.0)
Utah
Vermont                   81         (0.9)    91         (0.7)
Virginia                  82         (0.8)    91         (0.8)
Washington                80         (1.0)    89         (0.8)
West Virginia             81         (1.0)    86         (0.7)
Wisconsin                 83         (0.9)    91         (0.7)
Wyoming                   78         (0.9)    88         (0.7)

 

↑ Significantly higher than the estimate for National Public at the .05 level of statistical significance.
↓ Significantly lower than the estimate for National Public at the .05 level of statistical significance.
‡ Reporting standards not met. Sample size insufficient to permit a reliable estimate.
† Not applicable.
NOTE: Statistical comparison tests are based on unrounded numbers. Not all apparent differences between estimates are statistically significant. “National public” refers to the results for all students in public schools.
SOURCE: U.S. Department of Education, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2019 Reading Assessment.
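For readers curious how significance flags like those above are typically computed, the sketch below divides the difference between two estimates by their combined standard error and compares the result to the .05 critical value. This is a simplified, hypothetical illustration: NAEP’s actual tests use unrounded estimates and account for the overlap between state and national samples.

    import math

    def differs_at_05(est1: float, se1: float, est2: float, se2: float) -> bool:
        """Approximate two-sided test at the .05 level for two independent estimates."""
        z = (est1 - est2) / math.sqrt(se1**2 + se2**2)
        return abs(z) > 1.96

    # Example using rounded values from table 1: New Mexico's grade 4 estimate
    # (70, s.e. 1.2) versus the national public estimate (81, s.e. 0.2).
    print(differs_at_05(70, 1.2, 81, 0.2))  # True: flagged as significantly lower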


Looking at the results of NAEP’s 2019 Trial Urban Districts Assessment (TUDA), Miami-Dade, Florida, had the highest percentages of 4th- and 8th-grade students who had digital access (88 percent and 93 percent, respectively) (table 2). Fresno, California, had the lowest percentage of 4th-grade students (67 percent) who had access and Dallas, Texas, had the lowest percentage of 8th-grade students (73 percent) who had access.


Table 2. Percentage of public school students in the NAEP reading assessment that reported having internet access and a computer or tablet at home, by grade and Trial Urban District Assessments (TUDA): 2019

 

Large city                    Grade 4       Grade 8
                              Percentage    Percentage
   All large cities           78            85
Albuquerque                   75            85
Atlanta                       82            86
Austin                        78            83
Baltimore City                73            84
Boston                        81            89
Charlotte                     83            91
Chicago                       80            88
Clark County (NV)             78            84
Cleveland                     74            80
Dallas                        71            73
Denver
Detroit                       70            79
District of Columbia (DCPS)   83            90
Duval County (FL)             84            89
Fort Worth (TX)               72            88
Fresno                        67            77
Guilford County (NC)          78            85
Hillsborough County (FL)      81            87
Houston                       71            75
Jefferson County (KY)         82            88
Los Angeles                   76            85
Miami-Dade                    88            93
Milwaukee                     75            85
New York City                 81            89
Philadelphia                  78            86
San Diego                     81            90
Shelby County (TN)            78            86

 

↑ Significantly higher than the estimate for Large City at the .05 level of statistical significance.
↓ Significantly lower than the estimate for Large City at the .05 level of statistical significance.
‡ Reporting standards not met. Sample size insufficient to permit a reliable estimate.
NOTE: Statistical comparison tests are based on unrounded numbers. Not all apparent differences between estimates are statistically significant.
SOURCE: U.S. Department of Education, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2019 Reading Assessment.


In 2019, higher percentages of 8th-grade students than of 4th-grade students had digital access. This pattern was consistent across all states and TUDA jurisdictions. On average, in both 4th and 8th grades, higher percentages of students in suburban areas than of students in cities, towns, and rural areas had access (table 3).


Table 3. Percentage of public school students in the NAEP reading assessment that reported having internet access and a computer or tablet at home, by grade and locale: 2019

 

Locale               Grade 4                 Grade 8
                     Percentage    s.e.      Percentage    s.e.
   National public   81            (0.2)     88            (0.2)
City                 79            (0.4)     86            (0.4)
Suburban             84            (0.3)     92            (0.3)
Town                 77            (0.8)     86            (0.6)
Rural                78            (0.4)     87            (0.4)
↓ Significantly lower than the estimate for Suburban at the .05 level of statistical significance.
NOTE: Statistical comparison tests are based on unrounded numbers. Not all apparent differences between estimates are statistically significant.
SOURCE: U.S. Department of Education, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2019 Reading Assessment.


While the NAEP data reveal state-level patterns in students’ digital access before the pandemic, the Household Pulse Survey (HPS) provides insight into the digital access of students across the country during the pandemic. The HPS is conducted by the Census Bureau and seven other federal statistical agency partners, including NCES. Since April 23, 2020, the HPS has provided weekly or biweekly estimates of the availability of computers and internet access to children for educational purposes.

In April 2020, 88 percent of adults who had children under 18 in the home enrolled in school reported that computers were always or usually available for educational purposes. By the end of March 2021, that percentage increased to 94 percent (table 4).

A similar pattern emerged in the HPS data for internet access. In April 2020, 91 percent of adults who had children under 18 in the home enrolled in school reported that the internet was always or usually available for educational purposes. In March 2021, that percentage had increased to 94 percent (table 4).


Table 4. Percentage of adults who had children under 18 in the home enrolled in school who reported that computers and internet access were always or usually available for educational purposes: 2020–21, selected time periods

 

Time period                     Computers available     Access to internet
                                Percentage    s.e.      Percentage    s.e.
April 23 to May 5, 2020         88            (0.5)     91            (0.4)
March 17 to March 29, 2021      94            (0.4)     94            (0.4)

↑ Significantly higher than the estimate for April 23 to May 5, 2020, at the .05 level of statistical significance.
NOTE: Statistical comparison tests are based on unrounded numbers. Not all apparent differences between estimates are statistically significant.
SOURCE: U.S. Department of Commerce, Census Bureau, Household Pulse Survey, selected periods, April 2020 through March 2021.


While these data provide a recent look into the technology landscape for students both before and during the pandemic, there is still a need to collect more and better data to understand digital inequities. For example, future NCES surveys could ask schools, students, and teachers about their technology use and access at home, what resources for learning and instruction they have at home, and the environment in which many students and teachers now find themselves learning and teaching.

 


By Cadelle Hemphill, AIR; Yan Wang, AIR; Diana Forster, AIR; Chad Scott, AIR; and Grady Wilburn, NCES

The Growing Reading Gap: IES Event to Link Knowledge to Action Through Literacy Data

On June 8 and 9, the Institute of Education Sciences (IES) and the Council of the Great City Schools (CGCS) will host a Reading Summit to address one of the most important issues confronting American education today: the declining reading performance of America’s lowest-performing students and the growing gap between low- and high-performing students.

At this 2-day virtual event, participants will explore the results of the National Assessment of Educational Progress (NAEP), as well as other IES data, and learn strategies to help educators and low-performing readers make progress.

Learn more about the summit’s agenda and speakers—including IES Director Mark Schneider, NCES Commissioner James L. Woodworth, and NCES Associate Commissioner Peggy Carr—and register to participate (registration is free).

In the meantime, explore some of the data NCES collects on K–12 literacy and reading achievement, which show that the scores of students in the lowest-performing groups are decreasing over time.

  • The National Assessment of Educational Progress (NAEP) administers reading assessments to 4th-, 8th-, and 12th-grade students. The most recent results from 2019 show that average reading scores for students in the 10th percentile (i.e., the lowest-performing students) decreased between 2017 and 2019 at grade 4 (from 171 to 168) and grade 8 (from 219 to 213) and decreased between 2015 and 2019 at grade 12 (from 233 to 228).
  • The Progress in International Reading Literacy Study (PIRLS) is an international comparative assessment that measures 4th-grade students’ reading knowledge and skills. The most recent findings from 2016 show that the overall U.S. average score (549) was higher than the PIRLS scale centerpoint (500), but at the 25th percentile, U.S. 4th-graders scored lower in 2016 (501) than in 2011 (510).
  • The Program for International Student Assessment (PISA) is a study of 15-year-old students’ performance in several subjects, including reading literacy. The 2018 results show that, although the overall U.S. average reading score (505) was higher than the OECD average score (487), at the 10th percentile, the U.S. average score in 2018 (361) was not measurably different from the score in 2015 and was lower than the score in 2012 (378).

NCES also collects data on young children’s literacy knowledge and activities as well as the literacy competencies of adults. Here are a few data collections and tools for you to explore:

This year, the Condition of Education includes a newly updated indicator on literacy activities that parents reported doing with young children at home. Here are some key findings from this indicator, which features data from the 2019 National Household Education Surveys (NHES) Early Childhood Program Participation Survey:

In the week before the parents were surveyed,

  • 85 percent of 3- to 5-year-olds were read to by a family member three or more times.
  • 87 percent of 3- to 5-year-olds were told a story by a family member at least once.
  • 96 percent of 3- to 5-year-olds were taught letters, words, or numbers by a family member at least once.

In the month before the parents were surveyed,

  • 37 percent of 3- to 5-year-olds visited a library with a family member at least once.

Be sure to read the full indicator in the 2021 Condition of Education, which was released in May, for more data on young children’s literacy activities, including analyses by race/ethnicity, mother’s educational attainment, and family income.

Don’t forget to follow NCES on Twitter, Facebook, and LinkedIn to stay up to date on the latest findings and trends in literacy and reading data, and register for the IES Reading Summit to learn more about this topic from experts in the field.

 

By Megan Barnett, AIR

IES-Supported Intervention “INSIGHTS Into Children’s Temperament” Is Featured at the 2021 ED Games Expo

The ED Games Expo is an annual showcase of game-changing innovations in education technology developed through programs at ED and across the federal government. Since 2013, the Expo has been an in-person event at venues across Washington, D.C. Because of COVID-19, the 2021 Expo will be an entirely virtual experience from June 1 to 5.

This year, the Expo will showcase more than 160 learning games and technologies and feature 35 different virtual EdTech events of interest to a broad audience of viewers. See the agenda for the full ED Games Expo lineup.

 

ED Games Expo: Featuring INSIGHTS into Children’s Temperament

INSIGHTS into Children’s Temperament, an IES-supported intervention, is being featured at the Expo this year. INSIGHTS supports children’s social-emotional development and academic learning by helping teachers and parents see how differences in children’s behavior might reflect temperament and personality. Children work with the INSIGHTS puppets and learn that other children and adults react differently to the same situation because of their temperaments. IES has supported two randomized controlled trials (RCTs, the “gold standard” for claims of impact) of INSIGHTS: one in New York City and the other (ongoing) in rural Nebraska. Evidence from the NYC RCT and a longitudinal follow-up indicates that children who participate in the INSIGHTS program during early elementary school experience better academic and social behavioral outcomes immediately following participation, and these positive impacts persist into middle school.

 

During the 2020 ED Games Expo, Sandee McClowry and her team performed an INSIGHTS lesson at the Kennedy Center for hundreds of attendees, including children, students, and families. INSIGHTS will be featured at this year’s ED Games Expo in three ways.

  • Tuesday, June 1 at 8PM Eastern: An “ED Games Expo Kick Off Show” will be hosted by the puppets from the INSIGHTS intervention and the characters from the Between the Lions children’s television program. All of the characters will share information about the ED Games Expo while having a lot of fun and hijinks on a road trip to Washington, DC. The show will be introduced by Secretary of Education Miguel Cardona and will also feature cameo appearances by IES, ED, and government team members.
  • Wednesday, June 2 from 9PM to 9:45PM Eastern: Sandee McClowry will host a Master Class for Educators. The event will introduce all of the INSIGHTS friends, including Coretta the Cautious, Gregory the Grumpy, Fredrico the Friendly, and Hilary the Hard Worker. The video will provide practical guidance to educators on how to deliver the intervention in a classroom. The event will conclude with a rich and engaging discussion with expert practitioners about how INSIGHTS addresses the social and emotional learning of children, educators, and parents. Click here to access the YouTube broadcast of the Master Class and set a reminder to watch on June 2.
  • Materials from INSIGHTS, including puppets that can be printed out and professional development resources for educators, will be available to try out during the Expo and in the month of June.

 

For URL links to watch the ED Games Expo Kick Off Show and the Master Class for Educators, see the agenda. For more information on how to access the INSIGHTS intervention resources, see the website.


Written by Emily Doolittle (Emily.Doolittle@ed.gov), NCER Team Lead for Social Behavioral Research at IES