IES Blog

Institute of Education Sciences

Trading the Number 2 Pencil for 2.0 Technology

Although traditional pencil-and-paper tests provide good information for many purposes, technology presents the opportunity to assess students on tasks that better elicit the real-world skills called for by college and career standards. IES supports a number of researchers and developers who are using technology to develop better assessments, both through grants and through the Small Business Innovation Research program.

A screenshot of the GISA assessment intro page

One example of the power of technology to support innovative assessment is the Global, Integrated, Scenario-based Assessment (known as ‘GISA’), developed by John Sabatini and Tenaha O’Reilly at the Educational Testing Service as part of a grant supported by the Reading for Understanding Research Initiative.

Each GISA scenario is structured to resemble a timely, real-world situation. For example, one scenario begins by explaining that the class has been asked to create a website on green schools. The student is assigned the task of working with several students (represented by computer avatars) to create the website. In working through the scenario, the student engages in activities that are scaffolded to support students in summarizing information, completing a graphic organizer, and collaborating to evaluate whether statements are facts or opinions. The scenario provides a measure of each student’s ability to learn from text through assessing his or her knowledge of green schools before and after completing the scenario. This scenario is available on the ETS website along with more information about the principles on which GISA was built.

Karen Douglas, of the National Center for Education Research, recently spoke to Dr. Sabatini and Dr. O’Reilly on the role of technology in creating GISA, what users think of it, and their plans for continuing to develop technology-based assessments.

How did the use of technology contribute to the design of GISA?

Technological delivery offers many advantages over traditional paper-and-pencil test designs. On the efficiency side, items and tasks can be delivered over the internet in a standardized way, and there are obvious advantages for automated scoring. The real advantage, however, lies in control over the test environment and in what can be assessed. We can more effectively simulate the digital environments that students use in school, in their leisure time, and, later, in the workforce. GISA uses scenario-based assessment to deliver items and tasks. During a scenario-based assessment, students are given a plausible reason for reading a collection of thematically related materials. The purpose defines what is important to focus on as students work toward a larger goal. The materials are diverse and may vary in perspective and in the quality of their information.

Screenshot of a GISA forum on Green Schools

Students not only need to understand these materials but also need to evaluate and integrate them as they solve problems, make decisions, or apply what they learn to new situations. This design is not only more like the activities that occur in school, but also affords the opportunity to engage students in deeper thinking. GISA also includes simulated students that may support or scaffold the test taker's understanding with good habits of mind, such as the use of reading strategies. Items are sequenced to build up test takers’ understanding and to examine which parts of a more complex task students can or cannot do. In this way, the assessment serves as a model for learning while simultaneously assessing reading. Traditionally, instruction and assessment have not been integrated in such a seamless manner.

What evidence do you have that GISA provides useful information about reading skills?

We have a lot more research to conduct, but thus far we have been able to create a new technology-delivered assessment that updates the aspects of reading that are measured and introduces a variety of new features.

Despite the novel interface, items, tasks, and format, students are able to understand what is expected of them. Our analyses indicate that the test properties are good and that students can do a range of tasks that were previously untested in traditional assessments. While students may still be developing their skills on more complex tasks, there is evidence that they can do many of the components that feed into them. In this way, the assessment may be more instructionally relevant.

Informally, we have received positive feedback on GISA from both teachers and students. Teachers view the assessment as better matching the types of activities they teach in the classroom, while students seem to enjoy the more realistic purpose for reading, the more relevant materials, and the use of simulated peers. 

What role do you think technology will play in future efforts to create better assessments?

We believe technology will play a greater role in how assessments are designed and delivered. Providing feedback to students and better matching the test to student needs are two areas where future assessments will drive innovation. More interactive formats, such as intelligent tutoring and gaming, will also grow over time. With new forms of technology available, the possibilities for meeting students’ educational needs increase dramatically.

What’s next for GISA?

We are using GISA in two additional grants. In one grant, we are leveraging the GISA design for use with adults, a group for whom there are few viable assessments. In the other, we are using GISA to better understand how background knowledge affects reading comprehension.

For more information about the Reading for Understanding Research Initiative, read this post on the IES blog.

By Karen Douglas, Education Research Analyst, NCER, who oversees the Reading for Understanding Research Initiative

Why We Can Still Learn from Evaluations That Find No Effects

By Thomas Wei, Evaluation Team Leader

As researchers, we take little pleasure when our studies find that programs and policies have no positive effects on student outcomes. But as J.K. Rowling, author of the Harry Potter series, put it: there are “fringe benefits of failure.” In education research, studies that find no effects can still reveal important lessons and inspire new ideas that drive scientific progress.

On November 2, the Institute of Education Sciences (IES) released a new brief synthesizing three recent large-scale random assignment studies of teacher professional development (PD). As a nation, we invest billions of dollars in PD every year, so it is important to assess the impact of those dollars on teaching and learning. These studies are part of an evaluation agenda that IES has developed to advance understanding of how to help teachers improve.

One of the studies focused on second-grade reading teachers, one on fourth-grade math teachers, and one on seventh-grade math teachers. The PD programs in each study emphasized building teachers’ knowledge of content or content-specific pedagogy. The programs combined summer institutes with teacher meetings and coaching during the school year. These programs were compared to the substantially less intensive PD that teachers typically received in study districts.

All three studies found that the PD did not have positive impacts on student achievement. Disappointing? Certainly. But have we at least learned something useful? Absolutely.

For example, the studies found that the PD did have positive impacts on teachers’ knowledge and some instructional practices. This tells us that intensive summer institutes with periodic meetings and coaching during the school year may be a promising format for this kind of professional development.

But why didn’t the improved knowledge and practice translate to improved student achievement? Educators and researchers have long argued that effective teachers need to have strong knowledge of the content they teach and know how best to convey the content to their students. The basic logic behind the content-focused PD we studied is to boost both of these skills, which were expected to translate to better student outcomes. But the findings suggest that this translation is actually very complex. For example, at what level do teachers need to know their subjects? Does a third-grade math teacher need to be a mathematician, just really good at third-grade math, or somewhere in between? What knowledge and practices are most important for conveying the content to students? How do we structure PD to ensure that teachers can master the knowledge and practice they need?

When we looked at correlational data from these three studies, we consistently found that most of the measured aspects of teachers’ knowledge and practice were not strongly related to student achievement. This reinforces the idea that we may need to find formats for PD that can boost knowledge and practice to an even larger degree, or we may need to find other aspects of knowledge and practice for PD to focus on that are more strongly related to student achievement. This is a critical lesson that we hope will inspire researchers, developers, and providers to think more carefully about the logic and design of content-focused PD.
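For readers who want a concrete picture of this kind of correlational check, here is a minimal sketch in Python. The numbers are entirely made up for illustration and are not drawn from the three studies; the point is only to show what relating a teacher knowledge measure to mean student achievement looks like.

    import numpy as np

    # Hypothetical data (fabricated for illustration): one content-knowledge
    # score per teacher, paired with the mean achievement score of that
    # teacher's students.
    teacher_knowledge = np.array([52.0, 61.0, 47.0, 70.0, 58.0, 66.0, 55.0, 63.0])
    student_achievement = np.array([48.0, 50.0, 49.5, 51.0, 47.0, 50.5, 52.0, 49.0])

    # Pearson correlation between the two measures. A coefficient near zero,
    # as the brief reports for most measured aspects of knowledge and
    # practice, signals a weak linear relationship with achievement.
    r = np.corrcoef(teacher_knowledge, student_achievement)[0, 1]
    print(f"Pearson r = {r:.2f}")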

Scientific progress is often incremental and painstaking. It requires continual testing and re-testing of interventions, which sometimes will not have impacts on the outcomes we care most about. But if we are willing to step back and try to connect the dots from various studies, we can still learn a great deal that will help drive progress forward.


Bringing Evidence-based Practices to the Field

By Dr. Barbara Foorman, Director Emeritus, Florida Center for Reading Research, Florida State University

The Institute of Education Sciences recently released a What Works Clearinghouse (WWC) educator’s practice guide with four recommendations to support the development of foundational reading skills that are critically important to every student’s success. The recommendations in Foundational Skills to Support Reading for Understanding in Kindergarten Through 3rd Grade are based on a comprehensive review of 15 years of research on reading and on guidance from a national panel of reading experts, which I chaired.

Recently, the Regional Educational Laboratory (REL) Southeast at Florida State University has developed a set of professional learning community (PLC) materials and videos to help teachers and other practitioners implement the guide’s recommendations in classrooms.

Over the past few months, REL Southeast has shared the practice guide and PLC materials with practitioners and policymakers in two states, North Carolina and Mississippi, both of which have K-3 reading initiatives and reading coaches who assist with implementation. I’m excited by the feedback we are getting.

During these presentations, we shared the format of the ten 75-minute PLC sessions and the accompanying videos that demonstrate the recommendations and action steps in actual classrooms. We filmed the videos in partnership with Dr. Lynda Hayes, Director of the P.K. Yonge Developmental Research School at the University of Florida, and her primary grade teachers.

In North Carolina, we trained K–3 regional literacy consultants, elementary teachers and reading coaches, and higher education faculty on the PLC Facilitator’s Guide in Charlotte and Raleigh. The K-3 regional literacy consultants are organized by the North Carolina Department of Public Instruction.

In Mississippi, we trained the 90 Mississippi Department of Education reading coaches and district-supported special education specialists in Jackson. In turn, the state coaches will train the K–3 teachers who are part of the reading initiative in the practice guide recommendations and action steps. Additionally, the coaches will work with the primary grade teachers in each of their assigned schools to implement the PLC. Having the state coaches oversee implementation will help ensure commitment and bring depth to the PLC sessions.

Also present at the training in Mississippi were faculty members from the University of Mississippi and Belhaven University. I accepted an invitation from the Mississippi Institutions of Higher Learning Literacy Council to speak to higher education faculty about the guide and PLC materials. The invitation is timely because Mississippi recently completed a study of teacher preparation for early literacy instruction.

I hope you will download the practice guide and PLC materials. If you have any thoughts, comments, or questions, please email Contact.IES@ed.gov. You can learn more about the work of the Regional Educational Laboratories program and REL Southeast on the IES website.  

Dr. Foorman is the Director of REL Southeast, located at Florida State University

Sharing the Power of Intensive Interventions for Students with Learning Disabilities

In 2013, the National Center for Special Education Research (NCSER) launched the Accelerating the Academic Achievement of Students with Learning Disabilities Research Initiative (A3). The goal was to develop and evaluate intensive interventions—such as curricula, instructional approaches, and technology—that could improve the academic achievement of students with, or at risk of, a disability.

A five-year grant in this initiative went to Dr. Douglas Fuchs and Dr. Lynn Fuchs (pictured), of Vanderbilt University’s Peabody College, who for the past three years have been developing and piloting intensive interventions focused on improving students’ reading comprehension of informational texts and their performance with fractions and pre-algebra.

Earlier this month, the Fuchses joined Dr. Lou Danielson and Dr. Rebecca Zumeta Edmonds from the National Center on Intensive Intervention (NCII) for a webinar: “Intensive Intervention: What Is It, Who It’s For, and Why It’s Important?” (NCII is a technical assistance center funded by the U.S. Department of Education’s Office of Special Education Programs.)

The NCII/A3 webinar was purposely held in October—which is Learning Disabilities Awareness Month—to raise awareness of research and resources to support students with learning disabilities. The session was recorded and is available through the NCII website.

The panelists discussed the intensive intervention process, methods of identifying students who are not making adequate academic progress, and recent related research. Specifically, the Fuchses shared their research on designing and piloting two innovative components that seek to expand responsiveness to intervention:

  • Capitalizing on the power of prior knowledge to build informational text comprehension; and
  • Capitalizing on the power of executive function to build fractions knowledge.

As part of this NCSER A3 Initiative, these and other intervention components are being developed, integrated into comprehensive intervention programs, and rigorously tested. Please visit the project website to learn more and keep up to date with the latest findings from this research. Viewers of the recorded webinar can also learn more about implementation support resources available through NCII.

In the final years of their five-year NCSER grant, Doug and Lynn Fuchs will evaluate the efficacy of these intensive interventions for improving outcomes for students with learning disabilities.

Written by Sarah Brasiel, Program Officer, NCSER

Photo by Wolf Hoffmann, courtesy of Vanderbilt University


New Release: Forum Guide to Collecting and Using Disaggregated Data on Racial/Ethnic Subgroups

By the National Forum on Education Statistics’ Disaggregation of Racial/Ethnic Subgroups Working Group

Across the nation, our schools serve a diverse student population reflecting a wide range of backgrounds, experiences, interests, identities, and cultures. The more accurately education data reflect the diversity of the student population, the better prepared education practitioners will be to customize instructional and support services to meet those students’ needs.

Local and state members of the National Forum on Education Statistics (the Forum) convened a Data Disaggregation of Racial/Ethnic Subgroups Working Group to identify best practices for disaggregating data on racial/ethnic subgroups. The Forum Guide to Collecting and Using Disaggregated Data on Racial/Ethnic Subgroups is intended to identify some of the overarching benefits and challenges involved in data disaggregation; recommend appropriate practices for disaggregating racial/ethnic data in districts and states; and describe real-world examples of large and small education agencies disaggregating racial/ethnic data successfully. This resource will help state and district staff better understand the process of disaggregating data in the field of education. It can also help agency staff determine whether data disaggregation might be an appropriate analytical tool in their communities, and, if so, how they can successfully institute or advance a data disaggregation project in their agencies.
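To make the guide’s core idea concrete, here is a minimal sketch in Python using entirely hypothetical enrollment records and invented subgroup labels (the guide itself does not prescribe any particular tool, file format, or coding scheme). It shows how a single aggregate race/ethnicity category can mask differences that a disaggregated view reveals.

    import pandas as pd

    # Hypothetical student-level records. In the aggregate, every student
    # below falls under the single federal category "Asian"; the detailed
    # subgroup field is the kind of additional data element the guide discusses.
    students = pd.DataFrame({
        "student_id": [101, 102, 103, 104, 105, 106],
        "federal_category": ["Asian"] * 6,
        "detailed_subgroup": ["Hmong", "Chinese", "Hmong",
                              "Asian Indian", "Filipino", "Hmong"],
        "met_reading_benchmark": [False, True, False, True, True, False],
    })

    # Aggregated view: a single rate that can mask subgroup differences.
    aggregated = students.groupby("federal_category")["met_reading_benchmark"].mean()
    print(aggregated)

    # Disaggregated view: counts and benchmark rates by detailed subgroup.
    disaggregated = (
        students.groupby("detailed_subgroup")["met_reading_benchmark"]
        .agg(n_students="count", pct_met="mean")
    )
    print(disaggregated)

In practice, an agency would pair a view like this with minimum cell-size suppression rules before reporting, so that small subgroup counts do not identify individual students.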


The guide is organized into the following chapters:

  • Chapter 1: Introduction to Data Disaggregation in Education Agencies explains the purpose of the document; describes the concept of data disaggregation for racial/ethnic subgroups; discusses why the issue is becoming increasingly important in many communities; refers to current U.S. population data; and provides a case study of why this type of data collection can be important and advantageous in a school district.
  • Chapter 2: Strategies for Disaggregating Racial/Ethnic Data Subgroups recommends specific strategies for disaggregating data, including tasks undertaken during the two major phases of the effort: (1) needs assessment and (2) project implementation.
  • Chapter 3: Case Studies offers an in-depth look at how the disaggregation of racial/ethnic subgroup data is already being implemented through a wide range of state and district case studies.

Examples from the case studies and other education agencies are used throughout the document to highlight real-world situations. For instance, readers will learn how Highline (Wash.) Public School District changed the information it gathered on students to support its community’s commitment to equity and how the Springdale (Ark.) School District is using data to better serve its growing population of students from the Marshall Islands.

The recommendations in the resource are not mandates. Districts and states are encouraged to adapt or adopt any recommendations they determine to be useful for their purposes.


About the National Forum on Education Statistics

The work of the National Forum on Education Statistics is a key aspect of the National Cooperative Education Statistics System. The Cooperative System was established to produce and maintain, with the cooperation of the states, comparable and uniform education information and data that are useful for policymaking at the federal, state, and local levels. To assist in meeting this goal, the National Center for Education Statistics (NCES), within the Institute of Education Sciences (IES) of the U.S. Department of Education, established the Forum to improve the collection, reporting, and use of elementary and secondary education statistics. The Forum addresses issues in education data policy, sponsors innovations in data collection and reporting, and provides technical assistance to improve state and local data systems.

Members of the Forum establish working groups to develop best practice guides in data-related areas of interest to federal, state, and local education agencies. They are assisted in this work by NCES, but the content comes from the collective experience of working group members who review all products iteratively throughout the development process. After the working group completes the content and reviews a document a final time, publications are subject to examination by members of the Forum standing committee that sponsors the project. Finally, Forum members (approximately 120 people) review and formally vote to approve all documents prior to publication. NCES provides final review and approval prior to online publication.

The information and opinions published in Forum products do not necessarily represent the policies or views of the U.S. Department of Education, IES, or NCES. For more information about the Forum, please visit nces.ed.gov/forum or contact Ghedam Bairu at Ghedam.bairu@ed.gov.