IES Blog

Institute of Education Sciences

ED/IES SBIR Awards: Funding the Next Generation of Education Technology

For more than a decade, the Department of Education’s Small Business Innovation Research program, operated out of the Institute of Education Sciences, has funded projects to develop education technology designed to support students, teachers, and administrators in general or special education. The program, known as ED/IES SBIR, also focuses on commercialization after development is complete, so that products can reach schools and be sustained over time. It’s research with a start-up mentality.

In recent years, millions of students in schools around the country have used technologies developed through ED/IES SBIR funding, such as products by Filament Games, Sokikom, Agile Mind, and Mindset Works, to name a few.

This week, IES announced 18 ED/IES SBIR program awards for 2017. Of these awards, 11 are Phase I awards to develop and test a prototype, and seven are Phase II awards to fully develop and evaluate an education technology product in classrooms and schools. (See a video playlist of Phase II projects below.)

The new awards cover topics across math, science, engineering, and reading; others support social and behavioral development; and several are building platforms to inform decision-making by teachers and administrators. Several projects pair software with hardware-based technologies, such as virtual reality, 3D printing, and wearables.

The new awards also continue to fund projects in two major categories – learning games and dashboards for teachers and administrators.

Learning Games

For the past seven years, about half of ED/IES SBIR awards have focused on the development and evaluation of learning games (click here for a playlist). Continuing that trend, more than half of the 2017 ED/IES SBIR awards are for game-based technologies. Examples include:

  • Phase II awardee Schell Games and Phase I awardee Electric Funstuff are building games for use with Virtual Reality headsets so that students can engage with academic content in immersive 360-degree environments;

  • Phase II awardee Parametric Studios is creating a simulated “makerspace” engineering environment paired with a 3D printer;

  • Phase II awardee Fablevision is developing a fractions game with an adaptive component that auto-adjusts in difficulty to meet the competency level of individual students;

  • Phase II awardee Spry Fox is building in-game supports and using rewards and competition to drive game play in teaching vocabulary to struggling middle school students and English Learners; and

  • Phase I awardees MidSchoolMath and Happy People Games are using story-based narratives to engage students and apply learning, while Fokus Labs and Safe Toddles are creating prototypes that pair wearable devices with a game component to improve performance.

Dashboards for Teachers and Administrators

Many of the newly funded projects are developing a dashboard component populated with data and information to generate reports that teachers and administrators can use to guide instruction and decision making. Examples include:

  • Phase II awardee Analytic Measures is developing an automated speech recognition technology to assess students’ oral fluency in real time, with a dashboard that provides reports to inform teacher instruction;

  • Phase II awardee Future Engineers is developing an open online platform that generates lists of engineering and maker-based projects for students in K-12 classrooms;

  • Phase I projects by Story World, Strange Loop Games, TutorGen, Simbulus, and Myriad Sensors are creating prototypes of dashboards that provide teachers with formative assessment results on student performance and reports to guide instruction; and

  • Two projects focus on platforms for school administrators: a Phase II project by EdSurge to inform the selection of technology for school improvement, and a Phase I project by LiveSchool to generate reports on student behavior across classes and schools.

Stay tuned for updates on Twitter and Facebook as IES continues to support innovative forms of technology.

Written by Edward Metz, program manager, ED/IES SBIR

 

IES at the AERA Annual Meeting

By Dana Tofig, Communications Director, IES

The American Educational Research Association (AERA) will hold its annual meeting April 27 through May 1 in San Antonio, Texas. This is one of the nation’s largest educational research conferences and the Institute of Education Sciences (IES) will be well represented.

More than 100 sessions at the AERA meeting will feature IES staff or work supported by IES. Below is a brief overview, including links to lists of sessions. You can also follow us on Twitter and Facebook to read our #AERA17 posts. 

IES Staff at AERA

IES staff will participate in 20 different presentations, symposia, roundtables, and professional development sessions during the conference, providing information and insight about the wide range of work that we do.

One highlight will be on Sunday, April 30 (10:35 a.m. CT), during a session entitled Research, Statistics, and Data: The Vital Role of the Institute of Education Sciences in Retrospect and Prospect. At the session, Thomas Brock, Commissioner of the National Center for Education Research (delegated the duties of IES director), and Peggy Carr, Acting Commissioner of the National Center for Education Statistics, will be part of a panel discussing the work IES has done over the past 15 years and what that work will look like going forward. They will be joined by other researchers and experts, including Northwestern University’s Larry Hedges, currently the Chair of the National Board for Education Sciences. This session can be viewed for free via livestream, but you must register in advance.

Other presentations led by or featuring IES staff include sessions about funding opportunities and how to write an application for an IES grant; accessing and using data from NCES and the National Assessment of Educational Progress (NAEP); understanding and using international education data, including assessment results; and conversations about different data collections, including race and gender diversity, school-level finance, socioeconomic status and more.

Click here to see a full list of IES staff presentations at AERA.

IES-funded Work at AERA

More than 80 sessions at AERA will feature research and programs that were supported by IES grants and other funding sources. These presentations will cover a wide range of topics, from early childhood education to K-12 to postsecondary opportunities and beyond.

Many of our grantees will present findings from IES-funded research, including the results of IES Research and Development Centers, such as the National Center for Research on Gifted Education, the Center on Scaling Up Effective Schools, the Center for the Study of Adult Literacy, and the Center on Standards, Alignment, Instruction, and Learning.

IES-funded grants will be featured at several other sessions, including eight presentations of findings from our Cognition and Student Learning grant program, which builds understanding of how the mind works to inform and improve education practice in reading, writing, mathematics, science, and study skills.

In addition, researchers from several of the Regional Educational Laboratories will present findings on a variety of topics, including early education quality, English learners, teacher evaluations and mobility, college readiness, virtual learning, and much more. 

Also, the National Center for Research in Policy and Practice, an IES-funded Knowledge Utilization Center, will hold several sessions about what they have learned about how educators and policy makers access and use evidence in their decision making.

Click here to see a list of presentations on IES-funded research and programs. 

 

Get to Know NCES in Just Five Minutes!

By Lauren Musu-Gillette

Have you ever read one of our reports and wondered where the data came from? Are you familiar with NAEP, but have never heard of IPEDS? Are you curious about the history of NCES? If so, our new video is perfect for you!

The full scope of NCES activities can be daunting for those not familiar with the Center. Our data collections include samples from early childhood to postsecondary education, and cover such diverse topics as math and reading achievement, the experiences of teachers and principals, and school crime. In addition, the Center has a rich history both within the Department of Education and as a federal statistical agency. To make our data, reports, and tools more accessible to the public, we’ve created a new video to help introduce you to who we are and what we do.

To learn more about the Center’s work, watch the video below and follow us on Twitter and Facebook.

Using the WWC as a Teaching Tool

EDITOR'S NOTE: The What Works Clearinghouse (WWC), a program of the Institute of Education Sciences, is a trusted source of scientific evidence on education programs, products, practices, and policies. The WWC also has many tools and resources for education researchers and students. In this guest blog post, Jessaca Spybrook (pictured below, right), Associate Professor of Evaluation, Measurement and Research at Western Michigan University, discusses how she uses WWC procedures and standards as a teaching tool.


By Jessaca Spybrook, Western Michigan University

Training the next generation of researchers so they are prepared to enter the world of education research is a critical part of my role as a faculty member in the Evaluation, Measurement, and Research program. I want to ensure that my students have strong technical skills in a host of subject areas including, but not limited to, research design, statistics, and measurement. At the same time, I want to be sure they know how to apply those skills to design and analyze real-world studies. I often struggle to find resources for my classes that help me meet both goals.

One resource that has emerged as an important tool in meeting both goals is the What Works Clearinghouse website. I frequently integrate materials from the WWC into the graduate research design and statistics courses I teach.

For example, in a recent class I taught, Design of Experiments and Quasi-Experiments, I used the WWC Procedures and Standards Handbook Version 3.0 throughout (an image from the publication is pictured below). The Handbook met four important criteria as I was selecting resources for my class:

  1. Inclusion of important technical detail on design and analysis;
  2. Up-to-date and current thinking and “best practice” in design and analysis;
  3. Clear writing that is accessible for graduate students; and
  4. It was free (always a bonus when searching for class materials).

Image from the What Works Clearinghouse Standards & Practices Guide 3.0

By no means did the Handbook replace classic and well-regarded textbooks in the class. Rather, it helped connect classic texts on design to both recent advances related to design, as well as real-life considerations and standards that designs are judged against.

At the end of my class, students may have been tired of hearing the question, “What is the highest potential rating for this study?” But I feel confident that using the WWC Handbook helped me not only prepare graduates with the technical know-how they need to design a rigorous experiment or quasi-experiment, but also raise awareness of current best practices and how to design a study that meets the important standards set for the field.

 

Building Evidence: What Comes After an Efficacy Study?

Over the years, the Institute of Education Sciences (IES) has funded over 300 studies across its research programs that evaluate the efficacy of specific programs, policies, or practices. This work has contributed significantly to our understanding of the interventions that improve outcomes for students under tightly controlled or ideal conditions. But is this information enough to inform policymakers’ and practitioners’ decisions about whether to adopt an intervention? If not, what should come after an efficacy study?

In October 2016, IES convened a group of experts for a Technical Working Group (TWG) meeting to discuss next steps in building the evidence base after an initial efficacy study, and the specific challenges that are associated with this work. TWGs are meant to encourage stakeholders to discuss the state of research on a topic and/or to identify gaps in research.  

Part of this discussion focused on replication studies and the critical role they play in the evidence-building process. Replication studies are essential for verifying the results of a previous efficacy study and for determining whether interventions are effective when certain aspects of the original study design are altered (for example, testing an intervention with a different population of students). IES has supported replication research since its inception, but there was general consensus that more replications are needed.

TWG participants discussed some of the barriers that may be discouraging researchers from doing this work. One major obstacle is the idea that replication research is somehow less valuable than novel research—a bias that could be limiting the number of replication studies that are funded and published. A related concern is that the field of education lacks a clear framework for conceptualizing and conducting replication studies in ways that advance evidence about beneficial programs, policies, and practices (see another recent IES blog post on the topic).

IES provides support for studies that examine the effectiveness of interventions that have prior evidence of efficacy and that are implemented as part of routine, everyday practice in schools without special support from researchers. However, IES has funded a relatively small number of these studies (14 across both Research Centers). TWG participants discussed possible reasons for this and pointed out several challenges related to replicating interventions under routine conditions in authentic education settings. For instance, certain school-level decisions can pose challenges for conducting high-quality effectiveness studies, such as restricting the length of time that interventions or professional development can be provided, or choosing to offer the intervention to students in the comparison condition. These challenges can result in findings that are influenced more by contextual factors than by the intervention itself. TWG participants also noted that there is not much demand for this level of evidence, as the distinction between evidence of effectiveness and evidence of efficacy may not be recognized as important by decision-makers in schools and districts.

In light of these challenges, TWG participants offered suggestions for what IES could do to further support the advancement of evidence beyond an efficacy study. Some of these recommendations were more technical and focused on changes or clarifications to IES requirements and guidance for specific types of research grants. Other suggestions included:

  • Prioritizing and increasing funding for replication research;
  • Making it clear which IES-funded evaluations are replication studies on the IES website;
  • Encouraging communication and partnerships between researchers and education leaders to increase the appreciation and demand for evidence of effectiveness for important programs, practices, and policies; and
  • Supporting researchers in conducting effectiveness studies to better understand what works for whom and under what conditions, by offering incentives to conduct this work and encouraging continuous improvement.

TWG participants also recommended ways IES could leverage its training programs to promote the knowledge, skills, and habits that researchers need to build an evidence base. For example, IES could emphasize the importance of training in designing and implementing studies to develop and test interventions; create opportunities for postdoctoral fellows and early career researchers to conduct replications; and develop consortiums of institutions to train doctoral students to conduct efficacy, replication, and effectiveness research in ways that will build the evidence base on education interventions that improve student outcomes.

To read a full summary of this TWG discussion, visit the Technical Working Group website or click here to go directly to the report (PDF).

Written by Katie Taylor, National Center for Special Education Research, and Emily Doolittle, National Center for Education Research