Inside IES Research

Notes from NCER & NCSER

Inequity Persists in Gifted Programs

The National Center for Research on Gifted Education (NCRGE) at the University of Connecticut, in Phase I of a rigorous research agenda, examined how academically gifted students are identified and served in three states in order to provide systematic information for the field. The research team focused especially on the representation of historically underserved groups in gifted education.

NCER recently spoke with the Center’s Principal Investigator, Del Siegle, a nationally recognized expert on gifted education.

What is the biggest challenge facing gifted educators today?

Unfortunately, many of our nation’s brightest students from underserved populations (e.g., Black, Hispanic, English Learner, and/or free and reduced-price lunch eligible) are not being identified as gifted and do not receive gifted education services. About 80% of states that completed the most recent National Association for Gifted Children’s State of the States survey indicated that underrepresentation of students from underserved populations was an important or very important issue in their state.

What did you find in your study of identification of underserved students for gifted programs?

During Phase I of our work, we analyzed standardized student achievement test data from three states that mandate gifted identification and programming. We found that schools were less likely to identify students from underserved groups as gifted, even when those students had similar achievement test scores. For example, among students with similar test scores, those who received free and reduced-price lunch were less than half as likely to be identified as gifted as students who did not receive free or reduced-price lunch.

What identification practices are schools using?

Cognitive tests and teacher nominations were the most common identification tools across the three states we studied. The majority (90% to 96%) of districts in all three states used these practices to select students. Identification for gifted services occurs most often in third grade. Districts seldom reassess students once they have been identified, and only about half reassess non-identified elementary school students at regular intervals. Screening all children and using a variety of identification criteria showed promise for reducing under-identification in one of the states we studied.

How are students being served in gifted programs?

In the three states we studied, schools primarily focused on critical thinking and creativity, followed by communication skills, research skills, and self-directed projects. Acceleration in mathematics and reading/language arts received much less emphasis, ranking in the bottom third of focus areas. Gifted students seldom receive gifted programming in core academic areas. Only 29% of schools provided a separate gifted curriculum in reading/language arts, and only 24% had a separate gifted curriculum in mathematics. Gifted students spent 5 hours or more each week in regular education mathematics and reading/language arts classrooms. Of the 74% of schools that reported using pull-out services, only 32% offered a separate gifted curriculum in reading/language arts and 28% offered one in mathematics.

What about gifted student growth in mathematics and reading?

In 3rd grade, students identified as gifted are approximately two grade levels ahead of students not identified as gifted, but gifted students grow more slowly than non-identified students between 3rd and 5th grade. Most grouping arrangements for gifted students had no impact on academic achievement growth. We believe much of this has to do with the limited advanced mathematics and reading instruction gifted students receive in their classrooms and gifted programs.

What is the next step in your research?

We are examining the effect of attending dedicated gifted classes in core content areas on academic achievement in reading/language arts and mathematics in a large, ethnically, economically, and linguistically diverse urban school district. Our research will compare the reading/language arts and mathematics achievement of gifted students in three different settings: schools offering a full-time gifted-only program with gifted classes in all subject areas, schools offering a part-time gifted-only program with gifted classes in mathematics, and schools offering a part-time gifted-only program with gifted classes in reading/language arts.

Read Across America with IES

Happy Read Across America Day! This year is the 10-year anniversary of this national pep rally for reading, and IES has supported the development of a number of tools to promote reading and literacy.

Did you know that many of the curricula and materials developed by IES researchers are available for free? These materials include readings on topics that interest students, as well as guidance for teachers on how to engage and motivate students in discussions about what they read. For example, as part of the Reading for Understanding Initiative, IES invested in multiple curricula that are designed to help improve students’ reading comprehension and are available at no charge.

For students in preschool through grade 3, the Let’s Know! curriculum supplement uses easily accessible books to help teach children about vocabulary, making inferences, and text structures like cause and effect. There’s also a Spanish version of this curriculum (¡Vamos Aprender!). You can gain access to the curriculum through the Language and Reading Research Consortium webpage.

Word Generation is a group of curricula developed for students in grades four through eight. It focuses on teaching students to understand multiple perspectives, reason, and learn academic vocabulary, all through high-interest topics in science and social studies.

Example topic questions from units include:

  • When is a crime not a crime?

  • The Legacy of Alexander the Great: Great Leader or Power-Hungry Tyrant?

  • Thinking About Natural Selection

You can find more information about WordGen and download materials on its website.

Finally, for high school students, Promoting Adolescents’ Comprehension of Text (PACT) is an intervention aimed at motivating and engaging students to read and understand informational texts in social studies. Students learn vocabulary words and make connections between social studies topics and their own lives. For example, in a unit about the 1920s, students learn about the economy and prosperity and complete activities such as listing three items they have purchased, determining whether they are “needs” or “wants,” and considering how this relates to a consumer economy. Sample materials are available for download on the PACT website.

Have fun celebrating Read Across America Day, and enjoy a book with the students in your lives!

By Becky McGill-Wilkinson, NCER Program Officer

 

Trading the Number 2 Pencil for 2.0 Technology

Although traditional pencil-and-paper tests provide good information for many purposes, technology presents the opportunity to assess students on tasks that better elicit the real-world skills called for by college and career standards. Through research grants and the Small Business Innovation Research program, IES supports a number of researchers and developers who are using technology to develop better assessments.

One example of the power of technology to support innovative assessment is the Global, Integrated, Scenario-based Assessment (known as ‘GISA’), developed by John Sabatini and Tenaha O’Reilly at the Educational Testing Service as part of a grant supported by the Reading for Understanding Research Initiative.

Each GISA scenario is structured to resemble a timely, real-world situation. For example, one scenario begins by explaining that the class has been asked to create a website on green schools. The student is assigned the task of working with several students (represented by computer avatars) to create the website. In working through the scenario, the student engages in activities that are scaffolded to support summarizing information, completing a graphic organizer, and collaborating to evaluate whether statements are facts or opinions. The scenario provides a measure of each student’s ability to learn from text by assessing his or her knowledge of green schools before and after completing the scenario. This scenario is available on the ETS website along with more information about the principles on which GISA was built.

Karen Douglas, of the National Center for Education Research, recently spoke to Dr. Sabatini and Dr. O’Reilly on the role of technology in creating GISA, what users think of it, and their plans for continuing to develop technology-based assessments.

How did the use of technology contribute to the design of GISA?

Technological delivery offers many advantages over traditional paper-and-pencil test designs. On the efficiency side, items and tasks can be delivered over the internet in a standardized way, and there are obvious advantages for automated scoring. However, the real advantage has to do with both control over the test environment and what can be assessed. We can more effectively simulate the digital environments that students use in school, at leisure, and, later, in the workforce. GISA uses scenario-based assessment to deliver items and tasks. During a scenario-based assessment, students are given a plausible reason for reading a collection of thematically related materials. The purpose defines what is important to focus on as students work toward a larger goal. The materials are diverse and may reflect different perspectives and quality of information.

Screenshot of a GISA forum on Green Schools

Students not only need to understand these materials but also need to evaluate and integrate them as they solve problems, make decisions, or apply what they learn to new situations. This design is not only more like the activities that occur in school but also affords the opportunity to engage students in deeper thinking. GISA also includes simulated students that may support or scaffold the test taker's understanding with good habits of mind, such as the use of reading strategies. Items are sequenced to build up test takers’ understanding and to examine which parts of a more complex task students can or cannot do. In this way, the assessment serves as a model for learning while simultaneously assessing reading. Traditionally, instruction and assessment have not been integrated in a seamless manner.

What evidence do you have that GISA provides useful information about reading skills?

We have a lot more research to conduct, but thus far we have been able to create a new technology-delivered assessment that updates the aspects of reading that are measured and introduces a variety of new features.

Despite the novel interface, items, tasks, and format, students are able to understand what is expected of them. Our analyses indicate that the test properties are good and that students can do a range of tasks that were previously untested in traditional assessments. While students may be developing their skills on more complex tasks, there is evidence that they can do many of the components that feed into them. In this way, the assessment may be more instructionally relevant.

Informally, we have received positive feedback on GISA from both teachers and students. Teachers view the assessment as better matching the types of activities they teach in the classroom, while students seem to enjoy the more realistic purpose for reading, the more relevant materials, and the use of simulated peers. 

What role do you think technology will play in future efforts to create better assessments?

We believe technology will play a greater role in how assessments are designed and delivered. Providing feedback to students and better matching tests to student needs are two areas where future assessments will drive innovation. More interactive formats, such as intelligent tutoring and gaming, will also grow over time. With new forms of technology available, the possibilities for meeting students’ educational needs increase dramatically.

What’s next for GISA?

We are using GISA in two additional grants. In one grant, we leverage the GISA designs for use with adults, a group for which there are few viable assessments. In the other grant, we are using GISA to gain a better understanding of how background knowledge affects reading comprehension.

For more information about the Reading for Understanding Research Initiative, read this post on the IES blog.

By Karen Douglas, Education Research Analyst, NCER, who oversees the Reading for Understanding Research Initiative

An IES-funded “Must Read” on Writing and Reading Disabilities

A paper based on an IES-funded grant has been recognized as a “must read” by the Council for Learning Disabilities.

IES-funded researcher Stephen Hooper and his colleagues were recently recognized by the Council for their paper, Writing disabilities and reading disabilities in elementary school students: Rates of co-occurrence and cognitive burden (PDF). The paper was written by Lara-Jeane Costa, Crystal Edwards, and Dr. Hooper and published in Learning Disability Quarterly. Each year, the Council for Learning Disabilities recognizes outstanding work published in its journals, and it selected this paper as one of two Must Read pieces for 2016. The authors will present the paper at the Council's annual conference in San Antonio this week (October 13-14, 2016).

This paper was funded through a grant from the National Center for Education Research (NCER) to examine written language development and writing problems, as well as the efficacy of an intervention aimed at improving early writing skills. The paper found that the rate of students with both writing and reading disabilities increased from first to fourth grade, and that these students showed lower ability in language, fine motor skills, and memory compared with students who had neither disability and with students who had only a writing disability.

The team is continuing its IES-funded work by examining the efficacy of the Self-Regulated Strategy Development intervention on struggling middle school writers’ academic outcomes.

Written by Becky McGill-Wilkinson, Education Research Analyst, NCER

A Conversation about Reading for Understanding

There is a lot of discussion these days about the importance of bringing many voices to the table when designing, implementing, and interpreting research studies. The usefulness of educational research is only enhanced when teachers, policymakers, and researchers work together to design studies and understand the findings.

The Educational Testing Service and the Council of Chief State School Officers (CCSSO) sponsored such an opportunity on May 18 and 19 by bringing together over 150 researchers, practitioners, instructional specialists, federal staff, and policymakers to reflect on the efforts of the Reading for Understanding Research Initiative (RfU).

Researchers from the six RfU teams shared what they learned about how to work together to design and implement new approaches and improve reading for understanding from PreK through high school. They also discussed what needs to be done to build on this effort.  (Check out this recent blog post to learn how some researchers are building on the work of the RfU Initiative.)

Sessions at the meeting were designed to promote discussion among attendees in order to take advantage of different viewpoints and better understand the implications of the RfU findings and their relevance for practice and policy. The meeting culminated with two panels tasked with summarizing key points from these discussions. One panel focused on implications for research, and the other on policy implications. The presence of practitioners from the field, researchers, and policymakers led to a well-rounded conversation about how we can build upon the RfU research and put it into action in the classroom.

The agenda, presentation slides, and webcasts of the closing panels are now available for viewing on the meeting website.  

Written by Karen Douglas, Education Research Analyst, NCER