IES Blog

Institute of Education Sciences

Five ED/IES SBIR Companies Win National Industry Awards for Innovation

The U.S. Department of Education’s Small Business Innovation Research (SBIR) program at the Institute of Education Sciences (ED/IES SBIR) has served as a catalyst for the research and development of innovative technology that seeks to transform how and where students learn.

In 2016, the program continues to be recognized for spurring innovation, with five companies winning national awards and recognition for their ED/IES SBIR-developed technologies.

In June, Strange Loop Games’ Eco won the Climate Change Challenge at the Games for Change Festival in New York City. Eco is a multi-player environment where students collectively work to build a virtual ecosystem. The game provides students the opportunity to see how individual and collective decisions and actions affect their environment and climate.

In May, Mtelegence’s Readorium won the Best Reading/English/Language Arts Solution award through the Software & Information Industry Association’s (SIIA) CODiE program. Readorium is a web-based intervention that uses engaging content and games to improve middle school students’ reading comprehension of science content.

In May, Science4Us won the Best Science Instructional Solution through the SIIA CODiE program, and in April it won the Best Science Program award through the BESSIE Awards. Science4Us is a web-based game and simulation platform that provides foundational science learning opportunities for students in Kindergarten through Grade 2.

Also in May, Electric Funstuff’s Mission US won Gold in the website category of the Parents' Choice Awards. In 2016, Mission US was also a finalist for three other awards: Best Learning Game at Games for Change, Outstanding Interactive Series through the Daytime Emmy Awards, and Best Web Game through the Webby Awards. Mission US, which is partially funded by ED/IES SBIR, is a series of tablet-based interactive role-playing games that immerse 5th through 9th grade students in history.

In February, Querium was recognized as one of the 10 Most Innovative Education Technology Companies of 2016 by Fast Company magazine. Querium is developing the Stepwise Virtual Tutor, a mobile and desktop application that provides real-time assessment and support to middle and high school students in algebra.

For information on more ED/IES SBIR supported companies that have won awards and been recognized for innovation in technology, check out the program’s News Archive. Stay tuned for updates on ED/IES SBIR on Twitter and Facebook.

About ED/IES SBIR: The Small Business Innovation Research program at the Department of Education’s Institute of Education Sciences funds firms and partners to develop commercially viable technology products to improve student learning or teacher practice in regular and special education.  ED/IES SBIR emphasizes rigorous research to inform the development process and to evaluate whether products show promise for delivering on the intended outcomes.

Spend Five Minutes Getting to Know IES

By Dana Tofig, Communications Director, IES
 
At the Institute of Education Sciences, we sometimes describe ourselves as the country’s “engine” that powers high-quality education statistics, research, and evaluation, or as the “infrastructure” that supports a steady supply of scientific evidence in education.  
 
But many users of IES resources are familiar with just a small slice of what we sponsor to provide quality evidence in education and support for its use across the country. While they may have heard of the National Assessment of Educational Progress (NAEP), the What Works Clearinghouse, or the ERIC database of research, studies, and periodicals, they may not know that all those programs, and many more, are housed under one roof at IES. 
 
To help people better understand our work and see how it is connected, we have developed a new video that gives an overview of IES and the six broad types of work that we do.  The video runs just under five minutes, so it doesn’t touch on everything, but it does give a good introduction to IES and our work to connect research to practice. 
 

Please share the video with friends and colleagues who might be interested in the work of IES. In the coming months, we will release additional videos that delve further into each of our focus areas.
 
This video is part of our ongoing efforts to ramp up our communication and dissemination work, including the launch of a new, mobile-friendly website design and an IES Facebook page where you can get information about the latest reports, resources, and grant opportunities. In the fall, IES will also launch a new What Works Clearinghouse website, which will include an improved "Find What Works" tool. This will make it easier for educators to search for and compare the research about the effectiveness of interventions in education.
 
We are here to serve the public – and we always want to get better at what we do! If you have thoughts or ideas for how we can improve our communication and dissemination efforts, please send an email to dana.tofig@ed.gov.

Five Reasons to Visit the What Works Clearinghouse

By Diana McCallum, Education Research Analyst, What Works Clearinghouse

It’s been more than a decade since the first What Works Clearinghouse (WWC) reports were released, and we now have a wealth of information and resources that can help educators and leaders make evidence-based decisions about teaching and learning. Since 2005, the WWC has assessed more than 11,500 education studies using rigorous standards and has published hundreds of resources and guides across many content areas.

The WWC website has already received more than 1.7 million page views this year, but if you haven’t visited whatworks.ed.gov lately, here are five reasons you might want to click over:

1) We are always adding new and updated reviews. Multiple claims about programs that work can be overwhelming, and people often lack time to sift through piles of research. That’s where the WWC comes in. We provide an independent, objective assessment of education research. For example, we have intervention reports that summarize all of the existing research on a given program or practice, which educators can use to help inform their choices. In addition, when a new education study grabs headlines, the WWC develops a quick review that provides our take on the evidence presented to let you know whether the study is credible. In 2015, we added 43 publications to the WWC, and we’re adding more every month this year.

2) We’ve expanded our reach into the Postsecondary area. In late 2012, the WWC expanded its focus to include reviews of studies within the Postsecondary area, capturing emerging research on a range of topics, from the transition to college to postsecondary success. To date, the WWC has reviewed over 200 studies on postsecondary programs and interventions, and this area continues to grow rapidly. In fact, several Office of Postsecondary Education grant competitions award competitive preference points to applicants that submit studies meeting WWC standards. (Keep an eye out for a blog post on the postsecondary topic coming soon!)

3) You can find what works using our online tool. Wondering how to get started with so many resources at your fingertips? Find What Works lets you do a quick comparison of interventions for different subjects, grades, and student populations. Want to know more about a specific intervention? We’ve produced more than 400 intervention reports that lay out the evidence about a curriculum, program, software product, or other intervention before you choose it for your classroom. Recently, we added a feature that allows users to search for interventions that have worked for different populations of students and in different geographic locations. As we mentioned in a recent blog post, the Find What Works tool is undergoing an even bigger transformation this September, so keep visiting!

4) We identify evidence-based practices to use in the classroom. The WWC has produced 19 practice guides that feature practical recommendations and instructional tips to help educators address common challenges. Practice guides (now available for download as ebooks) provide quick, actionable guidance that is supported by evidence and expert knowledge in key areas. Some of our guides now feature accompanying videos and brief summaries that demonstrate recommended practices and explain the meaning behind the levels of evidence. Practice guide recommendations are also actively disseminated during Regional Educational Laboratory (REL) Bridge events. For instance, REL Southwest held a webinar on Teaching Math to Young Children, which was based on a WWC practice guide. For more information, read our previously published blog post on practice guides.

5) We compile information by topic. Our “Special Features” pages focus on common themes in education, such as tips for college readiness, information for heading back to school, and guidance for what works in early childhood education. These Special Features provide a starting point to access a variety of WWC resources related to a topic.

In the coming months, we’ll post more blogs exploring other parts of the WWC and telling you about ongoing improvements. So keep visiting the What Works website, or sign up to receive emails when we release new reports or resources. You can also follow us on Facebook and Twitter.

The What Works Clearinghouse is a part of the National Center for Education Evaluation and Regional Assistance in the Institute of Education Sciences (IES), the independent research, evaluation, and statistics arm of the U.S. Department of Education. You can learn more about IES’ other work on its website or follow IES on Twitter and Facebook.

 

Statistical Concepts in Brief: Embracing the Errors

By Lauren Musu-Gillette

EDITOR’S NOTE: This is part of a series of blog posts about statistical concepts that NCES uses as a part of its work.

Many of the important findings in NCES reports are based on data gathered from samples of the U.S. population. These sample surveys provide an estimate of what the data would look like if the full population had participated in the survey, at a great savings in both time and cost. However, because the entire population is not included, there is always some degree of uncertainty associated with an estimate from a sample survey. For those using the data, knowing the size of this uncertainty is important both for evaluating the reliability of an estimate and for statistical testing to determine whether two estimates are significantly different from one another.
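To see what this sampling uncertainty looks like in practice, here is a minimal simulation sketch (illustrative only, not NCES methodology). It draws repeated samples from a made-up population and shows that the spread of the sample estimates around the true value is exactly what a standard error measures:

```python
import random
import statistics

# Toy population: 1 = employed, 0 = not, with a true employment rate of 70%.
random.seed(42)
population = [1] * 70_000 + [0] * 30_000

# Draw many independent samples and compute the estimated rate from each one.
sample_size = 500
estimates = [statistics.mean(random.sample(population, sample_size))
             for _ in range(1_000)]

# The spread of these estimates is the typical sampling error of one survey.
print(f"mean of estimates: {statistics.mean(estimates):.3f} (true rate: 0.700)")
print(f"spread of estimates (approx. standard error): {statistics.stdev(estimates):.4f}")
```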

NCES reports standard errors for all data from sample surveys. In addition to providing these values to the public, NCES uses them for statistical testing. Within annual reports such as the Condition of Education, Indicators of School Crime and Safety, and Trends in High School Dropout and Completion Rates in the United States, NCES uses statistical testing to determine whether estimates for certain groups are statistically significantly different from one another. Specific language is tied to the results of these tests. For example, in comparing male and female employment rates, the Condition of Education states that the overall 2014 employment rate for young males ages 20 to 24 was higher than the rate for young females of the same ages (72 vs. 66 percent). Use of the term “higher” indicates that statistical testing was performed to compare these two groups and the result was statistically significant.
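Behind wording like “higher” is a comparison of the two estimates relative to their combined standard errors. The sketch below shows the basic form of such a test; the standard errors (1.1 and 1.3) are hypothetical values for illustration, and NCES’s actual procedures account for complex survey designs:

```python
import math

def significantly_different(est1, se1, est2, se2, critical=1.96):
    """Two-sided z-test at the .05 level for two independent estimates."""
    z = (est1 - est2) / math.sqrt(se1**2 + se2**2)
    return abs(z) > critical, z

# 72 vs. 66 percent, with hypothetical standard errors of 1.1 and 1.3.
different, z = significantly_different(72.0, 1.1, 66.0, 1.3)
print(f"z = {z:.2f}, statistically different: {different}")  # z ≈ 3.52
```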

If differences between groups are not statistically significant, NCES uses the phrases “no measurable differences” or “no statistically significant differences at the .05 level”. This is because we do not know for certain that differences do not exist at the population level, just that our statistical tests of the available data were unable to detect differences. This could be because there is in fact no difference, but it could also be due to other reasons, such as a small sample size or large standard errors for a particular group. Heterogeneity, or large amounts of variability, within a sample can also contribute to larger standard errors.

Some of the populations of interest to education stakeholders are quite small, for example, Pacific Islander or American Indian/Alaska Native students. As a consequence, these groups are typically represented by relatively small samples, and their estimates are often less precise than those of larger groups; that lower precision shows up as larger standard errors. For example, in NCES data on students who reported having been in 0 physical fights anywhere, the standard error for White students is 0.70, whereas the standard error is 4.95 for Pacific Islander students and 7.39 for American Indian/Alaska Native students. This means that the uncertainty around the estimates for Pacific Islander and American Indian/Alaska Native students is much larger than it is for White students. Because of these larger standard errors, differences between groups that may seem large may not be statistically significant. When this occurs, NCES analysts may state that large apparent differences are not statistically significant. NCES data users can use standard errors to help make valid comparisons using the data that we release to the public.

Another example of how standard errors affect whether sample differences are statistically significant comes from comparing NAEP score changes by state. Between 2013 and 2015, mathematics scores changed by 3 points for fourth-grade public school students in both Mississippi and Louisiana. However, the change was statistically significant only for Mississippi. This is because the standard error for the change in scale scores for Mississippi was 1.2, whereas the standard error for Louisiana was 1.6. The larger standard error, and therefore the larger degree of uncertainty around the estimate, factors into the statistical tests that determine whether a difference is statistically significant. This difference in standard errors could reflect the size of the samples in Mississippi and Louisiana, or other factors such as the degree to which the assessed students are representative of the populations of their respective states.
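Using the figures above, the arithmetic of a simple z-test (a simplified stand-in for NAEP’s actual significance procedures) shows why the same 3-point change clears the threshold in one state but not the other:

```python
# Same 3-point score change, different standard errors (figures from the text).
for state, change, se in [("Mississippi", 3.0, 1.2), ("Louisiana", 3.0, 1.6)]:
    z = change / se
    print(f"{state}: z = {z:.2f}, significant at .05: {abs(z) > 1.96}")
# Mississippi: z = 2.50 (significant); Louisiana: z = 1.88 (not significant)
```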

Researchers may also be interested in using standard errors to compute confidence intervals for an estimate. Stay tuned for a future blog where we’ll outline why researchers may want to do this and how it can be accomplished.
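As a quick preview, the basic arithmetic is straightforward; the sketch below applies the common 95 percent rule of thumb (estimate ± 1.96 standard errors) to the Mississippi score change discussed above:

```python
# 95% confidence interval: estimate ± 1.96 × standard error.
# Using Mississippi's 3-point score change and its standard error of 1.2:
estimate, se = 3.0, 1.2
low, high = estimate - 1.96 * se, estimate + 1.96 * se
print(f"95% CI: ({low:.2f}, {high:.2f})")
# (0.65, 5.35): excludes zero, consistent with a significant change
```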

RCT-YES: Supporting a Culture of Research Use in Education

By Ruth Curran Neild, Delegated Director, IES

The mission of the Institute of Education Sciences (IES), at its core, is to create a culture in which independent, rigorous research and statistics are used to improve education. But sometimes research is seen by practitioners and policymakers as something that is done for them or to them, but not by them. And that’s something we’re hoping to change.

IES is always looking for new ways to involve educators in producing and learning about high-quality, useful research. We believe that if state and school district staff see themselves as full participants in scientific investigation, they will be more likely to make research a part of their routine practice. Simply put, we want to make it easier for educators to learn what works in their context and to contribute to the general knowledge of effective practices in education.    

That’s why we’re so pleased to add the RCT-YES™ software to the IES-funded toolkit of free, user-friendly resources for conducting research. Peter Schochet of Mathematica Policy Research, Inc. led the development of the software as part of a contract with IES held by Decision Information Resources, Inc.

RCT-YES has a straightforward interface that allows the user to specify analyses for data from a randomized controlled trial (RCT) or a quasi-experiment. Definitions and tips in the software help guide the user, and the accompanying documentation includes a mini-course on RCTs. When the user enters information about the data set and study design, RCT-YES generates a program (in either R or Stata) that runs the specified analyses and produces a set of formatted tables.

The target users are those who have a basic knowledge of statistics and research design but do not have advanced training in conducting impact studies or analyzing their data. But we expect that even experienced researchers will like the simplicity and convenience of RCT-YES and benefit from some of its novel features, such as how it reports results.

When used properly, RCT-YES provides all of the statistics needed by the What Works Clearinghouse™ (WWC) to conduct a study review. This is an important feature because the WWC often needs to contact authors—even experienced ones—to obtain additional statistics to make a determination of study quality. RCT-YES could help advance the field by increasing the completeness of study reports.

Another unique feature of the software is that it defaults to practices recommended by IES’ National Center for Education Statistics for the protection of personally identifiable information. For example, the program suppresses reporting on small-size subgroups.
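As an illustration of the general idea (the threshold and data below are hypothetical, not RCT-YES’s actual rules, which follow NCES disclosure guidance):

```python
# Hypothetical sketch: suppress results for subgroups below a minimum size.
MIN_SUBGROUP_N = 30  # illustrative threshold, not RCT-YES's actual rule

subgroup_results = [
    {"subgroup": "All students", "n": 480, "impact": 0.12},
    {"subgroup": "English learners", "n": 17, "impact": 0.30},
]

for row in subgroup_results:
    if row["n"] < MIN_SUBGROUP_N:
        print(f"{row['subgroup']}: suppressed (n < {MIN_SUBGROUP_N})")
    else:
        print(f"{row['subgroup']}: estimated impact = {row['impact']:.2f} (n = {row['n']})")
```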

While the user sees only the simplicity of the interface, the underlying estimation methods and code required painstaking and sophisticated work.  RCT-YES relies on design-based estimation methods, and the development, articulation, peer review, and publication of this approach in the context of RCT-YES was the first careful step. Design-based methods make fewer assumptions about the statistical model than methods traditionally used in education (such as hierarchical linear modeling), making this approach especially appropriate for software designed with educators in mind.
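For intuition, here is a minimal sketch of a design-based impact estimate for a simple two-group RCT: a difference in means with a Neyman-style variance that does not assume a parametric outcome model. This illustrates the general approach, not RCT-YES’s actual code:

```python
import math
import statistics

def design_based_impact(y_treatment, y_control):
    """Difference in means with a Neyman-style (design-based) standard error."""
    n_t, n_c = len(y_treatment), len(y_control)
    impact = statistics.mean(y_treatment) - statistics.mean(y_control)
    # Variance of the impact estimate: sum of the groups' sampling variances.
    variance = (statistics.variance(y_treatment) / n_t
                + statistics.variance(y_control) / n_c)
    return impact, math.sqrt(variance)

# Hypothetical test scores from a small two-group trial.
treated = [78, 85, 80, 91, 74, 88, 83, 79]
control = [72, 81, 75, 79, 70, 77, 74, 76]
impact, se = design_based_impact(treated, control)
print(f"estimated impact = {impact:.2f} points, standard error = {se:.2f}")
```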

The software is available for download from the RCT-YES website, where you can also find support videos, documentation, a user guide, and links to other helpful resources. Videos hosted on the RCT-YES website give a quick overview of the software.

There are many other ways that IES fosters a culture of research use in education. For instance, our 10 Regional Educational Laboratories (RELs) have research alliances that work with states and districts to develop research agendas. The RELs also host events to share best practices for putting research into action, such as a year-long series of webinars and training sessions on building, implementing, and effectively using Early Warning Systems to reduce dropout rates.

IES also offers grants to states and districts to conduct quick evaluations of programs and policies that have been implemented in their schools. These low-cost, short-duration evaluations not only help the grantees discover what is working, but can also help others who might use the same program or implement a similar policy. (We’ll announce the first round of grant recipients in the coming weeks.)

Visit the IES website to learn more about our work. You can also stay on top of news and information from IES by following us on Facebook and Twitter.