IES Blog

Institute of Education Sciences

IES-Supported Intervention “INSIGHTS Into Children’s Temperament” Is Featured at the 2021 ED Games Expo

The ED Games Expo is an annual showcase of game-changing innovations in education technology developed through programs at ED and across the federal government. Since 2013, the Expo has been an in-person event at venues across Washington, D.C. Because of COVID-19, the 2021 Expo will be an entirely virtual experience from June 1 to 5.

This year, the Expo will showcase more than 160 learning games and technologies and feature 35 different virtual EdTech events of interest to a broad audience. See the Agenda for the lineup of ED Games Expo events.

 

ED Games Expo: Featuring INSIGHTS into Children’s Temperament

INSIGHTS into Children’s Temperament, an IES-supported intervention, is being featured at the Expo this year. INSIGHTS supports children’s social-emotional development and academic learning by helping teachers and parents see how differences in children’s behavior might reflect temperament and personality. Children work with the INSIGHTS puppets and learn that other children and adults react differently to the same situation due to their temperaments. IES has supported two randomized controlled trials (RCTs, the “gold standard” for claims of impact) of INSIGHTS – one in New York City and the other (ongoing) in rural Nebraska. Evidence from the NYC RCT and a longitudinal follow-up indicates that children who participate in the INSIGHTS program during early elementary school experience better academic and social behavioral outcomes immediately following participation in the program, and these positive impacts persist into middle school.

 

During the 2020 ED Games Expo, Sandee McClowry and her team performed an INSIGHTS lesson at the Kennedy Center for hundreds of attendees, including children, students, and families. INSIGHTS will be featured at this year’s ED Games Expo in three ways.

  • Tuesday, June 1 at 8PM Eastern: There will be an “ED Games Expo Kick Off Show” hosted by the puppets from the INSIGHTS intervention and the characters from the Between the Lions children’s television program. All of the characters will share information about the ED Games Expo while having a lot of fun and hijinks on a road trip to Washington, DC.  The Show will be introduced by the Secretary of Education, Miguel Cardona, and will also feature cameo appearances by IES, ED, and government team members.
  • Wednesday, June 2 from 9PM to 9:45PM Eastern: Sandee McClowry will host a Master Class for Educators. The event will introduce all of the INSIGHTS friends, including Coretta the Cautious, Gregory the Grumpy, Fredrico the Friendly, and Hilary the Hard Worker. The video will provide practical guidance to educators on how to deliver the intervention in a classroom. The event will conclude with a rich and engaging discussion with expert practitioners about how INSIGHTS addresses the social and emotional learning of children, educators, and parents. Click here to access the YouTube broadcast of the Master Class and set a reminder to watch on June 2.
  • Materials from INSIGHTS, including puppets that can be printed out and professional development resources for educators, will be available to try out during the Expo and in the month of June.

 

For links to watch the ED Games Expo Kick Off Show and the Master Class for Educators, see the Agenda. For more information and to access the INSIGHTS intervention resources, see the website.


Written by Emily Doolittle (Emily.Doolittle@ed.gov), NCER Team Lead for Social Behavioral Research at IES

 

NASA to Kick Off Its Latest National Student Challenge at the 2021 ED Games Expo on June 1

The 8th Annual ED Games Expo will occur next week from June 1 to 5. The free event is all virtual, open to the public, and will showcase game-changing innovations in education technology developed through more than 40 programs at the Department of Education (ED) and across the federal government.

 

NASA National Student Challenge Event at the ED Games Expo

One of many noteworthy Expo events will occur on Tuesday, June 1, from 6 to 8 PM Eastern, when NASA’s Flight Opportunities program will introduce a new national student challenge. Educators can register here to attend this live event on June 1. The NASA TechRise Student Challenge will invite teams of sixth- to 12th-grade students to submit ideas for climate or remote-sensing experiments to fly on a high-altitude balloon and for space exploration experiments to fly aboard a suborbital rocket.

NASA developed the TechRise Student Challenge to give students a deeper understanding of Earth’s atmosphere, space exploration, coding, and electronics, as well as a broader understanding of the value of test data. The challenge will also provide students with opportunities to engage with NASA and technology communities and expose them to careers in science, technology, and space exploration.

The challenge will begin accepting applications in August from student teams affiliated with U.S. public, private, and charter schools, including those in U.S. territories. Each winning team will receive $1,500 to build its payload, as well as an assigned spot on a NASA-sponsored commercial suborbital flight. Balloon flights will offer more than four hours of flight time, while suborbital rockets will provide around three minutes of test time in microgravity conditions. The Flight Opportunities program, based at NASA’s Armstrong Flight Research Center and part of NASA’s Space Technology Mission Directorate, is leading the NASA TechRise Student Challenge. The challenge is administered by California-based Future Engineers, which developed its platform with 2016 and 2017 awards from the ED/IES SBIR program. Future Engineers’ platform has also been used to manage other educational challenges, including NASA’s Name the Mars Rover student challenge.

During the June 1 event, NASA experts will provide educators with information about the official competition. Teachers are also invited to join a NASA TechRise Educator Summer Workshop, which will dive into the basics of electronics, coding, and designing for flight. The first workshop will be held on Wednesday, July 28, 2021, and repeated on Wednesday, August 11, 2021. For challenge details and to pre-register for the competition, please visit the contest website.

 

More ED Games Expo Events to Engage Students in Hands-On Projects and Challenges

In addition to the NASA event, five more virtual events featuring government programs that engage students in project-based learning will occur on Tuesday, June 1, between 12:30 PM and 6 PM Eastern. Topics include students building and flying satellites; programs for museums, local communities, and military facilities to engage students in experiential and real-world learning; and a program to inspire students to be inventors and entrepreneurs. See the Expo Agenda here for the lineup of events and the ED Games Expo Playlists Page for video trailers by participating developers.

 

We look forward to "seeing you" at the virtual ED Games Expo starting on June 1!


Edward Metz (Edward.Metz@ed.gov) is the Program Manager for the Small Business Innovation Research program at the US Department of Education’s Institute of Education Sciences.

 

An open letter to Superintendents, as summer begins

If the blockbuster attendance at last month’s Summer Learning and Enrichment Collaborative convening is any sign, many of you are in the midst of planning your summer learning programs—or have already started to put them in place. As you take time to review resources from the Collaborative and see what’s been learned from the National Summer Learning Project, I’d like to add just one more consideration to your list: please use this summer as a chance to build evidence about “what works” to improve outcomes for your students. In a word: evaluate!

Given all the things that need to be put in place to even make summer learning happen, it’s fair to ask why evaluation merits even a passing thought.   

I’m urging you to consider building evidence about the outcomes of your program through evaluation because I can guarantee you that, in about a year, someone to whom you really want to give a complete answer will ask, “so, what did we accomplish last summer?” (Depending upon who they are and what they care about, that question can vary. Twists can include “what did students learn?” and business officers’ pragmatic “what bang did we get for that buck?”)

When that moment comes, I want you to be able to smile, take a deep breath, and rattle off the sort of polished elevator speech that good data, well-analyzed, can help you craft. The alternative—mild to moderate panic followed by an unsatisfying version of “well, you know, we had to implement quickly”—is avoidable. Here’s how.

  1. Get clear on outcomes. You probably have multiple goals for your summer learning programs, including those that are academic, social-emotional, and behavioral. Nonetheless, there’s probably a single word (or a short phrase) that completes the following sentence: “The thing we really want for our students this summer is …” It might be “to rebuild strong relationships between families and schools,” “to be physically and emotionally safe,” or “to get back on track in math.” Whatever it is, get clear on two things: (1) the primary outcome(s) of your program and (2) how you will measure that outcome once the summer comes to an end. Importantly, you should consider outcome measures that will be available for both program participants and non-participants so that you can tell the story about the “value add” of summer learning. (You can certainly also include measures relevant only to participants, especially ones that help you track whether you are implementing your program as designed.)
  2. Use a logic model. Logic models are the “storyboard” of your program, depicting exactly how its activities will come together to cause improvement in the student outcomes that matter most. Logic models force program designers to be explicit about each component of their program and its intended impact. Taking time to develop a logic model can expose potentially unreasonable assumptions and missing supports that, if added, would make it more likely that a program succeeds. If you don’t already have a favorite logic model tool, we have resources available for free!  
  3. Implement evidence-based practices aligned to program outcomes. A wise colleague (h/t Melissa Moritz) recently reminded me that a summer program is the “container” (for lack of a better word) in which other learning experiences and educationally purposeful content are packaged, and that there are evidence-based practices for the design and delivery of both. (Remember: “evidence-based practices” run the gamut from those that demonstrate a rationale to those supported by promising, moderate, or strong evidence.) As you are using the best available evidence to build a strong summer program, don’t forget to ensure you’re using evidence-based practices in service of the specific outcomes you want those programs to achieve. For example, if your primary goal for students is math catch-up, then the foundation of your summer program should be an evidence-based Tier I math curriculum. If it is truly important that students achieve the outcome you’ve set for them, then they’re deserving of evidence-based educational practices supported by an evidence-based program design!
  4. Monitor and support implementation. Once your summer program is up and running, it’s useful to understand just how well your plan—the logic model you developed earlier—is playing out in real life. If staff trainings were planned, did they occur and did everyone attend as scheduled? Are activities occurring as intended, with the level of quality that was hoped for? Are attendance and engagement high? Monitoring implementation alerts you to where things may be “off track,” flagging where more supports for your team might be helpful. And, importantly, it can provide useful context for the outcomes you observe at the end of the summer. If you don't already have an established protocol for using data as part of continuous improvement, free resources are available!
  5. Track student attendance. If you don’t know who—specifically—participated in summer learning activities, describing how well those activities “worked” can get tricky. Whether your program is full-day, half-day, in-person, hybrid, or something else, develop a system to track (1) who was present, (2) on what days, and (3) for how long. Then, store that information in your student information system (or another database) where it can be accessed later. 
  6. Analyze and report your data, with an explicit eye toward equity. Data and data analysis can help you tell the story of your summer programming. Given the disproportionate impact COVID has had on students that many education systems have underserved, placing equity at the center of your planned analyses is critical. For example:
    • Who did—and who did not—participate in summer programs? Data collected to monitor attendance should allow you to know who (specifically) participated in your summer programs. With that information, you can prepare simple tables that show the total number of participants and that total broken down by important student subgroups, such as gender, race/ethnicity, or socioeconomic status. Importantly, those data for your program should be compared with similar data for your school or district (as appropriate). Explore, for example, whether any populations are disproportionately underrepresented in your program and what that implies for the work both now and next summer.
    • How strong was attendance? Prior research suggests that students benefit the most from summer programs when they are “high attenders” (twenty or more days out of programs’ typical 25 to 30 total days). Using your daily, by-student attendance data, calculate attendance intensity for your program’s participants overall and by important student subgroups. For example, what percentage of students attended 0 to 24 percent, 25 to 49 percent, 50 to 74 percent, or 75 percent or more of program days? (See the short data sketch after this list for one way to compute these figures.)
    • How did important outcomes vary between program participants and non-participants? At the outset of the planning process, you identified one or more outcomes you hoped students would achieve by participating in your program and how you’d measure them. In the case of a “math catch-up” program, for example, you might be hoping that more summer learning participants score “on grade level” at the start of the school year than their non-participating peers, promising evidence that the program may have offered a benefit. Disaggregating these results by student subgroups when possible highlights whether the program might have been more effective for some students than others, providing insight into potential changes for next year’s work.
    • Remember that collecting and analyzing data is just a means to an end: learning to inform improvement. Consider how involving program designers and participants—including educators, parents, and students—in discussions about what was learned as a result of your analyses can be used to strengthen next year’s program.
  7. Ask for help. If you choose to take up the evaluation mantle to build evidence about your summer program, bravo! And know that you do not have to do it alone. First, think locally. Are you near a two-year or four-year college? Consider contacting their education faculty to see whether they’re up for a collaboration. Second, explore whether your state has a state-wide “research hub” for education issues (e.g., Delaware, Tennessee) that could point you in the direction of a state or local evaluation expert. Third, connect with your state’s Regional Educational Lab or Regional Comprehensive Center for guidance or a referral. Finally, consider joining the national conversation! If you would be interested in participating in an Evaluation Working Group, email my colleague Melissa Moritz at melissa.w.moritz@ed.gov.
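
As a concrete illustration of item 6, here is a minimal sketch in Python (using pandas) of one way to turn daily, by-student attendance records into the attendance-intensity and subgroup breakdowns described above. It is not a prescribed tool: the file name, the column names (student_id, date, subgroup), and the 25-day program length are hypothetical placeholders to adapt to your own student information system.

    import pandas as pd

    PROGRAM_DAYS = 25  # hypothetical: total number of days the program was offered

    # Hypothetical export from a student information system:
    # one row per student per day attended, with columns
    # student_id, date, and subgroup (e.g., a race/ethnicity category).
    attendance = pd.read_csv("summer_attendance.csv")

    # Count the distinct days each student attended.
    days_attended = (
        attendance.groupby(["student_id", "subgroup"], as_index=False)
        .agg(days=("date", "nunique"))
    )

    # Attendance intensity: share of offered program days attended.
    days_attended["pct_of_days"] = 100 * days_attended["days"] / PROGRAM_DAYS

    # Sort each participant into the intensity bands described above.
    days_attended["intensity"] = pd.cut(
        days_attended["pct_of_days"],
        bins=[0, 25, 50, 75, 101],
        labels=["0-24%", "25-49%", "50-74%", "75%+"],
        right=False,
    )

    # Share of participants in each band, overall and by subgroup.
    overall = days_attended["intensity"].value_counts(normalize=True).sort_index()
    by_subgroup = (
        days_attended.groupby("subgroup")["intensity"]
        .value_counts(normalize=True)
        .unstack(fill_value=0)
    )

    print(overall.round(2))
    print(by_subgroup.round(2))

Comparing the by_subgroup table with the equivalent shares for your school or district as a whole supports the participation questions in item 6, and merging days_attended with an outcome measure sets up the participant versus non-participant comparison described there.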

Summer 2021 is shaping up to be one for the record books. For many students, summer is a time for rest and relaxation. But this year, it will also be a time for reengaging students and families with their school communities and, we hope, a significant amount of learning. Spending time now thinking about measuring that reengagement and learning—even in simple ways—will pay dividends this summer and beyond.

My colleagues and I at the U.S. Department of Education are here to help, and we welcome your feedback. Please feel free to contact me directly at matthew.soldner@ed.gov.

Matthew Soldner
Commissioner, National Center for Education Evaluation and Regional Assistance

Highlighting the Science of Learning at the 2021 ED Games Expo

Research on how people learn is critical for informing the design of effective education technology products. To design products that improve student learning, we need to understand how students approach solving problems, the information they need to adopt optimal solution strategies, the skills that underlie success in particular academic domains, the best ways to arrange information on a screen to guide student attention to relevant information, and the best study strategies for optimizing learning and retention. Through its research grants programs, IES has invested in research projects to develop and test education technology products based in the science of learning.

 

The 2021 ED Games Expo, which takes place virtually from June 1-5, features a number of these products, and you can learn more about them through a mix of live and pre-recorded sessions and videos. Most are ready to demo now. Students and educators can send questions about the research, game-play, or ed tech experience directly to the participating researchers. Here are just a few of the many ways you can interact with IES-funded researchers at the Expo:

 

  1. The Virtual Learning Lab (VLL) will host a live, virtual session on Friday, June 4, from 4:00 to 5:30 PM Eastern Time to celebrate its five-year research collaboration exploring precision education in the context of algebra instruction. The VLL developed an AI-powered video recommendation system that personalizes math instruction within Math Nation. The researchers measured student ability and engagement, detected effects of virtual learning environment usage on achievement, and identified characteristics of effective online tutoring. The session will feature short talks and opportunities for Q&A:
  • A Video Recommendation System for Algebra (Walter Leite, University of Florida)
  • Reinforcement Learning for Enhancing Collaborative Problem Solving (Guojing Zhou, University of Colorado Boulder)
  • Natural Language Processing for VLE Research (Danielle McNamara, Arizona State University)
  • Identifying Pedagogical Conversational Patterns in Online Algebra Learning (Jinnie Shin, University of Florida)
  • Scaling Items and Persons for Obtaining Ability Estimates in VLEs (A. Corinne Huggins-Manley, University of Florida)
  • Measuring Student Ability from Single- and Multiple-Attempt Practice Assessments in VLEs (Ziying Li, University of Florida)
  • Detecting Careless Responding to Assessment Items in VLEs (Sanaz Nazari, University of Florida)
  • Personalization, Content Exposure, and Fairness in Assessment (Daniel Katz, University of California, Santa Barbara)
  • Fair AI in VLEs (Chenglu Li, University of Florida)

 

  2. The ED Games Expo YouTube Playlist, which will be posted on June 1 on the event page, features 26 products developed with IES grant funding. For ed tech products informed by research on how people learn, check out the products funded through the Cognition and Student Learning topic:
  • Graspable Math allows math teachers to assign interactive algebra tasks and turns equations into tangible objects that middle and high school students can manipulate to practice and explore. Teachers can follow student work live, step by step.
  • eBravo Boulder Reading Intervention is a self-paced personalized reading comprehension curriculum that teaches secondary students the problem-solving skills good readers use to learn from challenging texts, in this case in the science discipline of ecology. Reading strategies and exercises are guided by well-researched models of reading comprehension, helping students build deep, durable, and reusable knowledge from text. 
  • iSTART and Writing Pal are interventions designed to help middle school students, high school students, and young adults improve their reading and writing skills. Within these interventions, students play games to practice reading comprehension and writing strategies.
  • All You Can Eat, Gwakkamole, and CrushStations are part of a suite of executive function skill-building games designed to improve students’ shifting, inhibitory control, and working memory, respectively.

 

We hope you can join us for this exciting event in June to learn more about and try out all the research-based products ready to be used in our nation’s schools. For more information on the featured resources and online events, please see this blog.


Written by Erin Higgins (Erin.Higgins@ed.gov), Program Officer for the Cognition and Student Learning program, National Center for Education Research

Spotlight on School-based Mental Health

This May as we recognize National Mental Health Awareness Month, schools around the country are welcoming students and educators back for in-person instruction after more than a year of remote or hybrid teaching and learning. One issue schools must consider during this transition back is the increase in mental health concerns among adults, young adults, and adolescents this past year. Here at IES, we support research that explores, develops, and tests innovative, field-initiated approaches to support mental health in schools and classrooms. This new IES blog series will explore school-based mental health by looking at IES-funded research that helps answer the five Ws:

 

  • Why school-based mental health? The first blog in the series will consider the benefits of school-based mental health such as providing increased access to services, especially for children of color, and potentially counteracting the stigma some associate with mental health treatment.

 

  • What can schools do to support the mental health of their students and staff? The second blog in the series will highlight several projects that are developing innovative new ways to provide mental health services in school settings.

 

  • When during the school day can schools implement these mental health practices so that they do not compete with the academic/instructional goals of school? The third blog in the series will highlight a variety of projects that delve into the implementation challenges inherent to providing school-based mental health services and support.

 

  • Who in the school should implement these mental health practices? The fourth blog in the series will explore a critical scale-up challenge for schools: having staff with adequate time who can be appropriately trained to provide mental health supports to students.

 

  • Where can these mental health practices be implemented? The final blog in the series will investigate the implementation challenges of different education settings (PreK, elementary, middle, high school, postsecondary) for school-based mental health programs and practices.  

 

See these blogs for more information about some of the school-based mental health research supported through the two IES research centers, the National Center for Education Research (NCER) and the National Center for Special Education Research (NCSER).  


Written by Emily Doolittle (Emily.Doolittle@ed.gov), NCER Team Lead for Social Behavioral Research at IES