IES Blog

Institute of Education Sciences

Improving Research on the Forgotten ‘R’

Writing is often labeled the “forgotten ‘R’” because the other R’s—reading and ‘rithmetic—garner so much attention from educators, policymakers, and researchers. Yet we know writing is a critical skill for communication and for success in school and in careers. Writing in middle and high school can be especially important because, by the secondary grades, students are expected to have mastered foundational skills like handwriting and to be applying those skills to more complex compositions.

IES has been funding research on writing since its inception in 2002, but compared to research on reading, not much work has been done in this critical area, especially on writing in middle and high schools. In an effort to learn more about the state of the field of writing in secondary schools and the areas of needed research, IES brought together 13 experts on secondary writing for a Technical Working Group (TWG) meeting in September. During the full-day meeting, TWG participants shared their thoughts and expertise on a variety of topics, including argumentative writing, methods of engaging adolescents in writing, how best to help struggling writers (including English learners and students with or at risk for disabilities), and assessment of and feedback on writing.

Argumentative writing requires students to explore a topic, collect and evaluate evidence, establish a position on a topic, and consider alternative positions. In middle and high schools, argumentative writing often occurs in content area classrooms like science and history. TWG participants discussed the importance of research to understand how argumentative writing develops over time and how teachers contribute to this development.

Teaching writing to students with or at risk for disabilities and to English learners can be challenging because secondary schools often focus on content acquisition rather than on improving writing skills. English learners are typically grouped together and receive the same instruction, but little is known about how writing instruction may need to be differentiated for students from different language backgrounds. Additionally, the TWG participants discussed the need to investigate the potential for technology to help with instruction of students who struggle with writing, and the importance of addressing the negative experiences these students have with writing that may discourage them from writing in the future.

It is also important to make sure all students are engaged and motivated to write. Some middle and high school students may not want to participate in writing or may have internalized beliefs that they are not good at it. TWG participants discussed the need to teach students that writing abilities can be changed, and noted that introducing new audiences or purposes for writing may motivate students to write. Finally, the group talked about the importance of allowing middle and high school students to write about topics of their own choosing.

Assessing the writing quality of middle and high school students is difficult because what counts as good writing is often subjective. Technology may offer some solutions, but TWG participants emphasized that it is unlikely that computers will be able to do this task well entirely on their own. Regardless, the TWG participants agreed that there is a need to develop quality writing measures for use both by teachers and by researchers. Teachers may feel pressure to provide detailed feedback on students’ writing, which can be time-consuming. TWG participants argued that self-assessment and peer feedback could relieve some of the pressure on teachers, but research is needed to understand what kind of feedback is best for improving writing and how to teach students to provide useful feedback.

A full summary of the TWG can be found on the IES website. It’s our hope this conversation provides a strong framework for more research on ‘the forgotten R.’

POSTSCRIPT: Our colleagues at the What Works Clearinghouse recently published an Educator’s Practice Guide, “Teaching Secondary Students to Write Effectively.” It includes three research-based recommendations for improving writing for middle and high school students.

Written by Becky McGill-Wilkinson, National Center for Education Research, and Sarah Brasiel, National Center for Special Education Research

The 2016 PI Meeting: Making it Matter

Hundreds of researchers, practitioners, and education scientists gathered in Washington, D.C., for the 2016 IES Principal Investigators (PI) Meeting on December 15 and 16.

The annual meeting provided an opportunity for attendees to share the latest findings from their IES-funded work, learn from one another, and discuss IES and U.S. Department of Education priorities and programs.

The theme of this year’s annual meeting was Making it Matter: Rigorous Research from Design to Dissemination, and the agenda included scores of sessions that highlighted findings, products, methodological approaches, new projects, and dissemination and communication strategies. The meeting was organized by the two IES research centers—the National Center for Education Research and the National Center for Special Education Research—in collaboration with the three meeting co-chairs: Roberta Michnick Golinkoff, of the University of Delaware; Kathleen Lynne Lane, of the University of Kansas; and Grace Wardhana, CEO of Kiko Labs.

Attendees were active on Twitter, using the hashtag #IESPImtg. Several attendees took the opportunity to highlight why their research matters using a sign and a selfie stick.

Gathering Input on Language and Communication Research and Development

Human interaction in society depends upon language and communication, and the Institute of Education Sciences is one of several federal agencies that support research and development (R&D) activities to further our knowledge in this area.


However, so far, there has been no systematic accounting or description of the range of language and communication R&D that the Federal Government supports. To address this gap, the White House Office of Science and Technology Policy’s National Science and Technology Council (NSTC) convened the Federal Government’s Interagency Working Group on Language and Communication. Led by co-chairs from the Department of Education and the Department of Defense, representatives from 13 different federal agencies developed a report of current and recent federal investments in language and communication R&D activities.

This investment is discussed across four broad areas:

  • Knowledge and Processes Underlying Language and Communication;
  • Language and Communication Abilities and Skills;
  • Using Language and Communication; and
  • Language and Communication Technologies.

In addition, the report describes the types of current R&D activities in these areas, and provides programmatic recommendations for key areas of investment and collaboration in language and communication research going forward.

On behalf of the working group, IES is gathering information from the wide community interested in language and communication R&D through a recently released request for information (RFI). The purpose of this RFI is to assist the working group in its efforts to further improve coordination and collaboration of R&D agendas related to language and communication across the Federal Government. If you are interested in submitting a response to the RFI, please do so by the deadline of December 30, 2016.

Written by Elizabeth Albro, Associate Commissioner of Teaching and Learning, National Center for Education Research

Trading the Number 2 Pencil for 2.0 Technology

Although traditional pencil and paper tests provide good information for many purposes, technology presents the opportunity to assess students on tasks that better elicit the real world skills called for by college and career standards. IES supports a number of researchers and developers who are using technology to develop better assessments through grants as well as the Small Business Innovation Research program. 

One example of the power of technology to support innovative assessment is the Global, Integrated, Scenario-based Assessment (known as ‘GISA’), developed by John Sabatini and Tenaha O’Reilly at the Educational Testing Service as part of a grant supported by the Reading for Understanding Research Initiative.

Each GISA scenario is structured to resemble a timely, real-world situation. For example, one scenario begins by explaining that the class has been asked to create a website on green schools. The student is assigned the task of working with several other students (represented by computer avatars) to create the website. In working through the scenario, the student engages in activities that are scaffolded to support summarizing information, completing a graphic organizer, and collaborating to evaluate whether statements are facts or opinions. The scenario provides a measure of each student’s ability to learn from text by assessing his or her knowledge of green schools before and after completing the scenario. This scenario is available on the ETS website along with more information about the principles on which GISA was built.

Karen Douglas, of the National Center for Education Research, recently spoke to Dr. Sabatini and Dr. O’Reilly on the role of technology in creating GISA, what users think of it, and their plans for continuing to develop technology-based assessments.

How did the use of technology contribute to the design of GISA?

Technological delivery offers many advantages over more traditional paper-and-pencil test designs. On the efficiency side of the argument, items and tasks can be delivered over the internet in a standardized way, and there are obvious advantages for automated scoring. However, the real advantage has to do with both the control over the test environment and what can be assessed. We can more effectively simulate the digital environments that students use in school, in leisure, and, later, in the workforce. GISA uses scenario-based assessment to deliver items and tasks. During a scenario-based assessment, students are given a plausible reason for reading a collection of thematically related materials. The purpose defines what is important to focus on as students work toward a larger goal. The materials are diverse and may reflect different perspectives and quality of information.


Students not only need to understand these materials but also need to evaluate and integrate them as they solve problems, make decisions, or apply what they learn to new situations. This design is not only more like the activities that occur in school, but also affords the opportunity for engaging students in deeper thinking. GISA also includes simulated students that may support or scaffold the test taker's understanding with good habits of mind, such as the use of reading strategies. Items are sequenced to build up test takers’ understanding and to examine what parts of a more complex task students can or cannot do. In this way, the assessment serves as a model for learning while simultaneously assessing reading. Traditionally, the areas of instruction and assessment have not been integrated in a seamless manner.

What evidence do you have that GISA provides useful information about reading skills?

We have a lot more research to conduct, but thus far we have been able to create a new technology-delivered assessment that updates the aspects of reading that are measured and introduces a variety of new features.

Despite the novel interface, items, tasks, and format, students are able to understand what is expected of them. Our analyses indicate the test properties are good and that students can do a range of tasks that were previously untested in traditional assessments. While students may still be developing their skills on more complex tasks, there is evidence that they can do many of the component tasks that feed into them. In this way, the assessment may be more instructionally relevant.

Informally, we have received positive feedback on GISA from both teachers and students. Teachers view the assessment as better matching the types of activities they teach in the classroom, while students seem to enjoy the more realistic purpose for reading, the more relevant materials, and the use of simulated peers. 

What role do you think technology will play in future efforts to create better assessments?

We believe technology will play a greater role in how assessments are designed and delivered. Being able to provide feedback to students and better match the test to student needs are some areas where future assessments will drive innovation. More interactive formats, such as intelligent tutoring and gaming, will also grow over time. With new forms of technology available, the possibilities for meeting students’ educational needs increase dramatically.

What’s next for GISA?

We are using GISA in two additional grants. In one grant, we leverage the GISA designs for use with adults, a group for which there are few viable assessments. In the other grant we are using GISA to get a better understanding of how background knowledge affects reading comprehension.

For more information about the Reading for Understanding Research Initiative, read this post on the IES blog.

By Karen Douglas, Education Research Analyst, NCER, who oversees the Reading for Understanding Research Initiative

An IES-funded “Must Read” on Writing and Reading Disabilities

A paper based on an IES-funded grant has been recognized as a “must read” by the Council for Learning Disabilities.

IES-funded researcher Stephen Hooper and his colleagues were recently recognized by the Council for their paper: Writing disabilities and reading disabilities in elementary school students: Rates of co-occurrence and cognitive burden (PDF). The paper was written by Lara-Jeane Costa, Crystal Edwards, and Dr. Hooper and published in Learning Disability Quarterly. Every year, the Council for Learning Disabilities acknowledges outstanding work published in its journals, and it selected this paper as one of two Must Read pieces for 2016. The authors will present the paper at the Council's annual conference in San Antonio this week (October 13-14, 2016).

This paper was funded through a grant from the National Center for Education Research (NCER) to examine written language development and writing problems, and the efficacy of an intervention aimed at improving early writing skills. The paper found that the rate of students with both writing and reading disabilities increased from first to fourth grade, and that these students showed lower ability in language, fine motor skills, and memory compared with students with neither disability and with students with only a writing disability.

The team is continuing its IES-funded work by examining the efficacy of the Self-Regulated Strategy Development intervention on struggling middle school writers’ academic outcomes.

Written by Becky McGill-Wilkinson, Education Research Analyst, NCER