IES Blog

Institute of Education Sciences

Gathering Input on Language and Communication Research and Development

Human interaction in society depends upon language and communication, and the Institute of Education Sciences (IES) is one of several federal agencies that support research and development (R&D) activities to further our knowledge in this area.

High school students sitting in a circle talking.

However, so far, there has been no systematic accounting or description of the range of language and communication R&D that the Federal Government supports. To address this gap, the White House Office of Science and Technology Policy’s National Science and Technology Council (NSTC) convened the Federal Government’s Interagency Working Group on Language and Communication. Led by co-chairs from the Department of Education and the Department of Defense, representatives from 13 federal agencies developed a report of current and recent federal investments in language and communication R&D activities.

This investment is discussed across four broad areas:

  • Knowledge and Processes Underlying Language and Communication;
  • Language and Communication Abilities and Skills;
  • Using Language and Communication; and
  • Language and Communication Technologies.

In addition, the report describes the types of current R&D activities in these areas and provides programmatic recommendations for key areas of investment and collaboration in language and communication research going forward.

On behalf of the working group, IES is gathering information from the wide community interested in language and communication R&D through a recently released request for information (RFI). The purpose of this RFI is to assist the working group in its efforts to further improve coordination and collaboration of R&D agendas related to language and communication across the Federal Government. If you are interested in submitting a response to the RFI, please do so by the deadline of December 30, 2016.

Written by Elizabeth Albro, Associate Commissioner of Teaching and Learning, National Center for Education Research

Trading the Number 2 Pencil for 2.0 Technology

Although traditional pencil and paper tests provide good information for many purposes, technology presents the opportunity to assess students on tasks that better elicit the real world skills called for by college and career standards. IES supports a number of researchers and developers who are using technology to develop better assessments through grants as well as the Small Business Innovation Research program. 

A screenshot of the GISA assessment intro page

One example of the power of technology to support innovative assessment is the Global, Integrated, Scenario-based Assessment (known as ‘GISA’), developed by John Sabatini and Tenaha O’Reilly at the Educational Testing Service as part of a grant supported by the Reading for Understanding Research Initiative.

Each GISA scenario is structured to resemble a timely, real-world situation. For example, one scenario begins by explaining that the class has been asked to create a website on green schools. The student is assigned the task of working with several students (represented by computer avatars) to create the website. In working through the scenario, the student engages in activities that are scaffolded to support students in summarizing information, completing a graphic organizer, and collaborating to evaluate whether statements are facts or opinions. The scenario provides a measure of each student’s ability to learn from text through assessing his or her knowledge of green schools before and after completing the scenario. This scenario is available on the ETS website along with more information about the principles on which GISA was built.

Karen Douglas, of the National Center for Education Research, recently spoke to Dr. Sabatini and Dr. O’Reilly on the role of technology in creating GISA, what users think of it, and their plans for continuing to develop technology-based assessments.

How did the use of technology contribute to the design of GISA?

Technological delivery creates many opportunities over more traditional paper and pencil test designs. On the efficiency side of the argument, items and tasks can be delivered over the internet in a standardized way and there are obvious advantages for automated scoring. However, the real advantage has to do with both the control over test environment and what can be assessed. We can more effectively simulate the digital environments that students use in school, leisure and, later, in the workforce. GISA uses scenario-based assessment to deliver items and tasks. During a scenario-based assessment students are given a plausible reason for reading a collection of thematically related materials. The purpose defines what is important to focus on as students work towards a larger goal. The materials are diverse and may reflect different perspectives and quality of information. 

Screenshot of a GISA forum on Green Schools

The student not only needs to understand these materials but also needs to evaluate and integrate them as they solve problems, make decisions, or apply what they learn to new situations. This design is not only more like the activities that occur in school, but also affords the opportunity for engaging students in deeper thinking. GISA also includes simulated students that may support or scaffold the test taker's understanding with good habits of mind such as the use of reading strategies. Items are sequenced to build up test takers’ understanding and to examine what parts of a more complex task students can or cannot do. In this way, the assessment serves as a model for learning while simultaneously assessing reading. Traditionally, the areas of instruction and assessment have not been integrated in a seamless manner.

What evidence do you have that GISA provides useful information about reading skills?

We have a lot more research to conduct, but thus far we have been able to create a new technology-delivered assessment that updates the aspects of reading that are measured and introduces a variety of new features.

Despite the novel interface, items, tasks, and format, students are able to understand what is expected of them. Our analyses indicate the test properties are good and that students can do a range of tasks that were previously untested in traditional assessments. While students may still be developing their skills on more complex tasks, there is evidence they can do many of the components that feed into them. In this way the assessment may be more instructionally relevant.

Informally, we have received positive feedback on GISA from both teachers and students. Teachers view the assessment as better matching the types of activities they teach in the classroom, while students seem to enjoy the more realistic purpose for reading, the more relevant materials, and the use of simulated peers. 

What role do you think technology will play in future efforts to create better assessments?

We believe technology will play a greater role in how assessments are designed and delivered. Being able to provide feedback to students and better match the test to student needs are some areas where future assessments will drive innovation. More interactive formats, such as intelligent tutoring and gaming, will also grow over time. With new forms of technology available, the possibilities for meeting students’ educational needs increase dramatically.

What’s next for GISA?

We are using GISA in two additional grants. In one grant, we leverage the GISA designs for use with adults, a group for which there are few viable assessments. In the other grant we are using GISA to get a better understanding of how background knowledge affects reading comprehension.

For more information about the Reading for Understanding Research Initiative, read this post on the IES blog.

By Karen Douglas, Education Research Analyst, NCER, who oversees the Reading for Understanding Research Initiative

Sharing the Power of Intensive Interventions for Students with Learning Disabilities

In 2013, the National Center for Special Education Research (NCSER) launched the Accelerating the Academic Achievement of Students with Learning Disabilities Research Initiative (A3). The goal was to develop and evaluate intensive interventions—such as curricula, instructional approaches, and technology—that could improve the academic achievement of students with, or at risk of, a disability.

A five-year grant in this initiative went to Dr. Douglas Fuchs and Dr. Lynn Fuchs (pictured), of Vanderbilt University’s Peabody College, who for the past three years have been developing and piloting intensive interventions focused on improving students’ reading comprehension of informational texts and fraction and pre-algebra performance.

Earlier this month, the Fuchses joined Dr. Lou Danielson and Dr. Rebecca Zumeta Edmonds from the National Center on Intensive Interventions (NCII) for a webinar: “Intensive Intervention: What is it, Who it’s For, and Why it’s Important?” (NCII is a research initiative funded by the U.S. Department of Education’s Office of Special Education Programs.)

The NCII/A3 webinar was purposely held in October—which is Learning Disabilities Awareness Month—to raise awareness of research and resources to support students with learning disabilities. The session was recorded and is available through the NCII website or you can watch it below.

The panelists discussed the intensive intervention process, methods of identifying students not making adequate academic progress, and recent related research. Specifically, the Fuchses shared their research designing and piloting two innovative components that seek to expand responsiveness to intervention:

  • Capitalizing on the power of prior knowledge to build informational text comprehension; and
  • Capitalizing on the power of executive function to build fractions knowledge.

As part of this NCSER A3 Initiative, these and other intervention components are being developed, integrated into comprehensive intervention programs, and rigorously tested. Please visit the project website to learn more and keep up to date with the latest findings from this research. Viewers of the recorded webinar can also learn more about implementation support resources available through NCII.

In the final years of their five-year NCSER grant, Doug and Lynn Fuchs will work to understand the efficacy of intensive interventions for improving outcomes for students with learning disabilities.  

Written by Sarah Brasiel, Program Officer, NCSER

Photo by Wolf Hoffmann, courtesy of Vanderbilt University


An IES-funded “Must Read” on Writing and Reading Disabilities

A paper based on an IES-funded grant has been recognized as a “must read” by the Council for Learning Disabilities.

IES-funded researcher Stephen Hooper and his colleagues were recently recognized by the Council for their paper: Writing disabilities and reading disabilities in elementary school students: Rates of co-occurrence and cognitive burden (PDF). The paper was written by Lara-Jeane Costa, Crystal Edwards, and Dr. Hooper and published in Learning Disability Quarterly. Every year, the Council for Learning Disabilities acknowledges outstanding work published in its journals, and it selected this paper as one of two Must Read pieces for 2016. The authors will present the paper at the Council's annual conference in San Antonio this week (October 13-14, 2016).

This paper was funded through a grant from the National Center for Education Research (NCER) to examine written language development and writing problems, and the efficacy of an intervention aimed at improving early writing skills. The paper found that the rate of students with both writing and reading disabilities increased from first to fourth grade, and that these students showed lower ability in language, fine motor skills, and memory compared with students with neither disability or with only a writing disability.

The team continues its IES-funded work by examining the efficacy of the Self-Regulated Strategy Development intervention on struggling middle school writers’ academic outcomes.

Written by Becky McGill-Wilkinson, Education Research Analyst, NCER

Pinning Down the Use of Research in Education

There are plenty of great ideas to be found on Pinterest: recipes for no-bake, allergen-friendly cookies; tips for taking better photos; and suggestions for great vacation spots in Greece. Lots of teachers use Pinterest as a way to share classroom ideas and engaging lessons. But where do teachers, education leaders and decision makers turn when they need evidence-based instructional practices that may work to help struggling readers, or want to use research to address other educational challenges?

Since 2014, the National Center for Education Research (NCER) has funded two National Research and Development Centers on Knowledge Utilization in an effort to find out. They are well on their way to answering questions about how and why teachers and schools use—or do not use—research in their decision making. They are also exploring ways the research community can increase interest in and actual use of research-based practices.

We are beginning to see the first results of their efforts to answer two key questions. First, are educators and schools using education research in their decision making, and if they aren’t, why not? The second question is: If educators are not using evidence as a part of their work, what can the research community do to make it more likely they will?

The National Center for Research in Policy and Practice (NCRPP) was awarded to the University of Colorado Boulder and is led by Principal Investigator Bill Penuel (University of Colorado Boulder), and Co-Principal Investigators Derek Briggs (University of Colorado Boulder), Jon Fullerton (Harvard University), Heather Hill (Harvard University), Cynthia Coburn (Northwestern University), and Jim Spillane (Northwestern University).

NCRPP has recently released its first technical report, which covers the descriptive results from its nationally representative survey of school and district leaders. Results from the report show that school and district leaders do use research evidence for activities such as designing professional development, expanding their understanding of specific issues, or convincing others to agree with a particular point of view on an education issue. Instrumental uses of research, in which district leaders apply research to guide or inform a specific decision, were most commonly reported. Overall, school and district leaders were positive about the relevance and value of research for practice. When asked to report what specific piece of research was most useful, school and district leaders named books, policy reports, and peer-reviewed journal articles. You can get more information on the center's website, http://ncrpp.org. They are also very active on Twitter.

The Center for Research Use in Education (CRUE) was awarded to the University of Delaware and is led by Principal Investigator Henry May (University of Delaware) and Co-Principal Investigator Liz Farley-Ripple (University of Delaware). This team is currently drafting its measures of research use, which will include one set of surveys for researchers and another set for practitioners. The team is especially interested in understanding which factors contribute to deep engagement with research evidence, and how gaps in perceptions and values between researchers and practitioners may be associated with the frequency of deep research use. You can learn more about the work of CRUE on their website, http://www.research4schools.org/, and follow them on Twitter.

While the Centers were tasked with tapping into use of research evidence specifically, both are interested in understanding all sources of evidence that practitioners use, whether it’s from peer-reviewed research articles, the What Works Clearinghouse, a friend at another school, or even Pinterest. There is certainly a wealth of research evidence to support specific instructional practices and programs, and these two Centers will begin to provide answers to questions about how teachers and leaders are using this research.

So, it’s possible that, down the road, Pinterest will become a great place for both homemade, non-toxic finger paint and evidence-based practices for improving education.

Written by Becky McGill-Wilkinson, NCER Program Officer for the Knowledge Utilization Research and Development Centers