IES Blog

Institute of Education Sciences

NCES Releases First-Ever Response Process Dataset—A Rich New Resource for Researchers

The NCES data file National Assessment of Educational Progress (NAEP) Response Process Data From the 2017 Grade 8 Mathematics Assessment (NCES 2020-102; documentation NCES 2020-134) introduces a new type of data—response process data—which was made possible by NAEP’s transition from paper to digitally based assessments in mathematics and reading in 2017. These new datasets allow researchers to go beyond analyzing students’ answers to questions as simply right or wrong; instead, researchers can examine the amount of time students spend on questions, the pathways they take through the assessment sections, and the tools they use while solving problems. 

NAEP reporting has previously hinted at the promise of response process data. With the release of the 2017 mathematics assessment results, NCES included a feature on The Nation’s Report Card website to show the different steps students took while responding to a question that assessed their multiplication skills. The short video below shows that students used a total of 397 different sequences to group four digits into two factors that yield a given product. The most popular correct and incorrect answer paths are shown in the video. Response process data, such as those summarized for this example item, can open new avenues for understanding how students work through math problems and for identifying the specific elements of response processes that could lead to common math errors.
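To make the mechanics of an item like this concrete, the brute-force enumeration below counts the ordered ways a set of digit tiles can be grouped into two factors with a given product. This is a sketch only: the tiles and target product are hypothetical stand-ins, not the actual NAEP item.

```python
from itertools import permutations

def factor_arrangements(digits, target):
    """Enumerate ordered ways to arrange four digit tiles into two
    factors whose product equals `target`."""
    arrangements = set()
    for perm in permutations(digits):
        for split in (1, 2, 3):  # factor lengths: 1+3, 2+2, or 3+1 digits
            left = int("".join(map(str, perm[:split])))
            right = int("".join(map(str, perm[split:])))
            if left * right == target:
                arrangements.add((left, right))
    return sorted(arrangements)

# Hypothetical tiles and product, not the actual assessment item:
print(factor_arrangements([1, 2, 3, 6], 756))
# [(12, 63), (21, 36), (36, 21), (63, 12)]
```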



In the newly released data, researchers can access student response process data from two 30-minute blocks of grade 8 mathematics assessment questions (29 test items in total) and a 15-minute survey questionnaire in which students answered questions about their demographic characteristics, opportunities to learn in and outside of school, and educational experiences. Researchers can explore logs of the response process data collected from each student, along with a file containing students’ raw and scored responses, time stamps, and demographics. In addition, researchers can explore a file that, for every student, summarizes defined features of their interactions with the assessment, such as the number of seconds spent on specific questions or the number of times the calculator was opened.
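As an illustration of what researchers can compute from such logs, the sketch below derives time-on-item and calculator-open counts from a toy event log using pandas. The column names and event codes here are hypothetical conveniences; the actual file layout is described in the accompanying documentation (NCES 2020-134).

```python
import pandas as pd

# Toy process-data log: one row per event, with hypothetical column names
# and event codes (the real NAEP files are organized differently).
logs = pd.DataFrame({
    "student_id": [101, 101, 101, 101, 102, 102],
    "item_id":    ["M1", "M1", "M1", "M1", "M1", "M1"],
    "event":      ["enter", "open_calculator", "open_calculator", "exit",
                   "enter", "exit"],
    "time":       [0.0, 12.5, 30.0, 45.0, 0.0, 20.0],  # seconds into block
})

# Seconds on item: exit time minus enter time for each student-item visit
# (simplified here to one enter/exit pair per student and item).
visits = logs.pivot_table(index=["student_id", "item_id"],
                          columns="event", values="time", aggfunc="first")
seconds_on_item = visits["exit"] - visits["enter"]

# Tool use: number of calculator opens per student and item.
calculator_opens = (logs[logs["event"] == "open_calculator"]
                    .groupby(["student_id", "item_id"]).size())

print(seconds_on_item)    # 101/M1: 45.0 seconds; 102/M1: 20.0 seconds
print(calculator_opens)   # 101/M1: 2 opens
```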

To explore this response process dataset, interested researchers should apply for a restricted-use license and request access to the files through the NCES website. By providing this dataset to a wide variety of researchers, NCES hopes to encourage and enable a new domain of research on developing best practices for the use and interpretation of student response process data.

 

By Jan Marie Alegre and Robert Finnegan, Educational Testing Service

Small Changes to Textbook Design Can Make a Big Difference for Student Learning

During spring 2020, the COVID-19 pandemic forced school closures that affected millions of U.S. students. As schools have reopened this fall, conversations have revolved around using this unique situation as a chance to rethink education and how students learn. When we think about innovative ways to improve education, ideas tend to gravitate toward radical changes to the classroom experience, expensive interventions, and costly professional development. Everyone is looking for the next “big” idea, but part of the solution may lie in a more subtle, inexpensive, and less disruptive change that can be as impactful as a completely new education approach: strategic revisions to the materials teachers and students already use in their classrooms (whether in person or virtual).

Textbooks (or ebooks) and supplemental education materials are central to providing students with the content knowledge and practice experiences to support mastery of academic skills. Textbook developers spend significant time and effort to ensure that the content in those textbooks aligns to standards and provides students with the information and examples needed to understand key concepts. However, even with age-appropriate content and high-quality practice exercises, textbooks may not be effective as learning tools if they present and sequence information in a way that is not aligned to what we know about how people learn.

You may be wondering how much room there is for improvement—textbooks seem pretty good at delivering content as is, right? Actually, findings from three IES-funded projects demonstrate that there are multiple ways to improve texts and student understanding of key concepts. Here are a few of those ways:

 

Present a wide range of fraction practice problems. Textbooks focused on fractions learning tend to present more problems with equal denominators for addition and subtraction than for multiplication. Why does this matter? In IES-funded research, David Braithwaite and Bob Siegler showed that students pick up on this bias. Because students are so used to seeing equal denominators when practicing fraction addition and subtraction, they are more likely to make errors on equal-denominator fraction multiplication problems. The recommended minor change is to include a wider range of fraction practice problems, including equal-denominator multiplication problems, to ensure that students do not form irrelevant associations between superficial features of a practice problem and the solution strategies they are practicing.
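One way to act on this recommendation is to sample the operation and the denominator type independently when building a problem set, so that equal denominators are no more common for one operation than another. The generator below is a minimal sketch under that assumption (mine, not the researchers’ specification):

```python
import operator
import random
from fractions import Fraction

OPS = {"+": operator.add, "-": operator.sub, "x": operator.mul}

def make_problem(rng):
    """Generate one fraction problem; the operation and the
    equal/unequal-denominator choice are sampled independently."""
    op = rng.choice(list(OPS))
    d1 = rng.randint(2, 12)
    # Equal vs. unequal denominators: 50/50 regardless of operation.
    d2 = d1 if rng.random() < 0.5 else rng.choice(
        [d for d in range(2, 13) if d != d1])
    n1, n2 = rng.randint(1, d1 - 1), rng.randint(1, d2 - 1)
    if op == "-" and Fraction(n2, d2) > Fraction(n1, d1):
        (n1, d1), (n2, d2) = (n2, d2), (n1, d1)  # keep answers non-negative
    answer = OPS[op](Fraction(n1, d1), Fraction(n2, d2))
    return f"{n1}/{d1} {op} {n2}/{d2} = {answer}"

rng = random.Random(7)
for _ in range(4):
    print(make_problem(rng))
```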

 

Provide students with a mix of practice problems that require different strategies rather than practice problems of the same type. Typical math practice involves solving the same type of problem repeatedly to practice the specific solution strategy a student just learned. However, across numerous IES-funded studies, Douglas Rohrer and his research team have shown that students benefit substantially more from math practice that involves a mix of problems that require different strategies (those learned in previous lessons mixed with those just learned). One of the major benefits of this approach is that students get practice choosing which strategy to use for a particular problem. Rohrer and his team found that across 13,505 practice problems from six popular math textbooks, only 9.7% of those problems were mixed up in this way. The recommended minor change is to simply mix up the problem sets so that students have more experiences encountering different types of problems in a single sitting.
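A sketch of what this looks like in practice: instead of assigning a blocked set drawn only from the lesson just taught, an assignment samples from all lessons covered so far, so students must first identify which strategy each problem calls for. The lesson names and problems below are illustrative, not drawn from the textbooks in the study.

```python
import random

def interleaved_assignment(problem_bank, lessons_covered, n, seed=0):
    """Mix problems from the current lesson with problems from earlier
    lessons, so each item requires choosing a strategy."""
    rng = random.Random(seed)
    pool = [p for lesson in lessons_covered for p in problem_bank[lesson]]
    rng.shuffle(pool)
    return pool[:n]

problem_bank = {
    "proportions":      ["3/x = 9/12", "x/5 = 6/15"],
    "percents":         ["What is 30% of 50?", "12 is what percent of 48?"],
    "linear equations": ["2x + 3 = 11", "5x - 4 = 21"],
}

# After the linear-equations lesson, practice draws on all three strategies:
print(interleaved_assignment(problem_bank, list(problem_bank), n=4))
```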

 

Where and how you place visuals on textbook pages matters, especially when you want students to compare them. Textbooks typically use visuals such as diagrams and photos to help reinforce key concepts. In an IES-funded study, Bryan Matlen and colleagues examined anatomy and evolution chapters within three popular middle school science textbooks and found an average of 1.8 visuals per page. Students were expected to make comparisons using about a third of those visuals. Of those they had to compare, about half were positioned in suboptimal ways—that is, the images were not presented in a way that made it easy to identify how the elements of one image compare to the elements of the other. For example, imagine a student is asked to compare two x-ray images of hands to identify a bone that is missing from one of them. This task is much harder if one hand is shown upside down and the other is right-side up or perpendicular to the first image. Consistent with this example, Matlen and colleagues have conducted studies showing that visual comparisons are more effective when the features of the visuals that need to be compared are spatially aligned. The recommended minor change is to be intentional about the placement of visuals that students are supposed to be comparing; make sure they are placed in optimal alignment to each other so that it is easier for students to see how the features of one correspond to those of the other.

 

In sum, transformative, radical ideas about how to improve education are interesting to brainstorm about, but sometimes the key to improvement is identifying small changes that can deliver big results.


Written by Erin Higgins (Erin.Higgins@ed.gov), Program Officer for the Cognition and Student Learning Program, National Center for Education Research.

 

Exploring How COVID-19 Affects Learning and Critical Thinking

Our nation continues to navigate a unique and challenging year due to the COVID-19 pandemic. In the first blog post in this series, we highlighted how educators, students, families, and researchers are adapting while trying to engage in opportunities to support learning. COVID-19 has created numerous challenges for education research, with many studies needing to be modified or put on hold. At the same time, new research questions are arising that focus on the impact of the pandemic on student learning, engagement, and achievement. Here, we highlight two IES-funded projects that are conducting timely and relevant research on the impact of COVID-19 on learning and critical thinking.

Guanglei Hong, Lindsey Richland, and their research team at the University of Chicago and the University of California, Irvine, have received supplemental funds to build on their current grant, Drawing Connections to Close Achievement Gaps in Mathematics. The research team will conduct a study during the 2020-21 school year to explore the relationship between students’ anxiety about the health risks associated with COVID-19 and their math learning experiences. They predict that pressure and anxiety, such as that induced by COVID-19, draw on the same executive function resources that students need to engage in higher order thinking and reasoning during math instruction, negatively affecting the ability to learn. Through this study, the research team will also test whether particular instructional approaches reduce the effects of pressure and anxiety on learning. These findings will be useful in the near term for teachers and students as they navigate the COVID-19 pandemic, and in the longer term for students who experience anxiety for a variety of other reasons.

In addition, IES has funded an unsolicited grant to Clarissa Thompson at Kent State University to investigate whether an education intervention aimed at decreasing whole number bias errors can help college-aged students and adults more accurately interpret health statistics about COVID-19. During the pandemic, the public receives daily updates about the number of people locally, nationally, and globally who are infected with and die from COVID-19. Beliefs about the risk of getting a disease are a key predictor of engagement in prevention behaviors, and understanding the magnitude of one’s risk may require making sense of numerical health information, which is often presented in the form of rational numbers such as fractions, whole number frequencies, and percentages. An intervention that decreases whole number bias errors and improves understanding of rational numbers has the immediate and pressing benefit of helping people reason accurately about the risk of COVID-19 and other health threats. This skill is also critical for success in science, technology, engineering, and mathematics (STEM) fields.
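A worked example of the whole-number-bias trap the intervention targets (with illustrative numbers, not data from the study): raw case counts and underlying rates can point in opposite directions.

```python
# County A has fewer total cases but a higher infection rate.
county_a = {"cases": 900,   "population": 30_000}
county_b = {"cases": 1_500, "population": 150_000}

for name, c in [("A", county_a), ("B", county_b)]:
    rate = c["cases"] / c["population"]
    print(f"County {name}: {c['cases']:>5} cases, rate = {rate:.1%}")

# Whole-number frequencies alone (1,500 > 900) suggest County B is riskier;
# the rational-number comparison (3.0% > 1.0%) shows County A actually is.
```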

Both of these projects offer opportunities to better understand learning and critical thinking in the midst of the pandemic. They will also provide the field with generalizable information about ways to improve learning in STEM fields. Stay tuned for more COVID-19 related education research discussions as we continue this series on our blog.

 


Written by Christina Chhin (christina.chhin@ed.gov) and Erin Higgins (erin.higgins@ed.gov), National Center for Education Research (NCER).

This is the third in a series of blog posts focusing on conducting education research during COVID-19. Other blog posts in this series include Conducting Education Research During COVID-19 and Measuring Attendance during COVID-19: Considerations for Synchronous and Asynchronous Learning Environments.

 

Measuring Attendance during COVID-19: Considerations for Synchronous and Asynchronous Learning Environments

The National Center for Rural Education Research Networks (NCRERN) is an IES-funded R&D Center that has established a continuous improvement network of 50 rural districts in New York and Ohio. The purpose of the Network is to build the capacity of rural school districts and supporting state agencies to use their own data to improve the education of their students. Districts are currently tackling the problem of student absenteeism through piloting, evaluating, and improving various interventions. Below, Katherine Kieninger, David Hersh, and Jennifer Ash describe how the Network is approaching the problem of measuring attendance during COVID-19 across the various learning environments.

 

NCRERN has been working to develop a viable attendance construct, given that districts and schools are currently struggling with how to define and track attendance for remote or blended learning models. When students are not physically present, the typical observe-and-log model of attendance tracking is not an option. However, not tracking attendance is not an option either, given the importance of attendance for identifying at-risk students, predicting key student outcomes, and, during the pandemic, serving as a proxy for the general safety and well-being of students.

We considered several possible attendance constructs and assessed the degree to which each met the following criteria. First, a viable construct should be measurable equitably across all students and learning environments: in-person instruction, synchronous and asynchronous virtual environments with internet access, and asynchronous environments without internet access. The construct should also be simple to understand, easy to capture, and quick to collect. Finally, access to technology and to reliable, low-cost, high-speed internet must be considered, especially in rural areas lacking such infrastructure.

We concluded that tracking student exposure to instructional content best meets these criteria, as seen in the table below. While not without its own challenges, exposure to content is the least complicated option, can be tracked consistently across learning environments, and is the closest in principle to what in-person attendance captures.

 

Each construct was rated against seven criteria: simple; easy to capture; covers all students; high frequency; reliable and valid; consistent across grade levels; and consistent across virtual or in-person environments.

Attendance Construct              | Criteria Met | Not Met | Uncertain
In-Person Attendance              | 6 of 7       | 1       | 0
Exposure to Instructional Content | 7 of 7       | 0       | 0
Participation                     | 4 of 7       | 1       | 2
Assignment Submission             | 6 of 7       | 1       | 0
Engagement                        | 4 of 7       | 3       | 0
Mastery                           | 2 of 7       | 4       | 1

 

In guidance provided to Network districts, we use the table below to define exposure to content in each learning environment, suggest capture options, and provide a non-exhaustive list of considerations for school district stakeholders. Districts should plan for the fact that a student can move between learning environments. For example, an in-person student who is quarantined but healthy enough to continue classwork becomes a virtual learner; depending on the student’s home context, this could place the student in any of the three virtual environments. Creating a plan for seamless attendance tracking across learning environments is key to measuring attendance with fidelity.

 

Attendance Construct: Exposure to Instructional Content

Learning Environment: In-Person

Definition: Student is present.

Capture options: Student Information System (SIS).

Considerations:
  • Will in-person students be able to log in for remote learning if they are not able to come to school, for example, because they must miss school for an extended period (such as a quarantine)?

Learning Environment: Virtual, Synchronous

Definition: Student is present for virtual class.

Capture options: Student Information System (SIS).

Considerations:
  • Can you avoid concurrent classes for students in the same family?
  • If a student loses internet access, do you have an asynchronous back-up option for course content?

Learning Environment: Virtual, Asynchronous with Internet Access

Definition: Student affirmatively accessed content.

Capture options: Learning Management System (LMS) log-in with a minimum time threshold, OR daily form completion (the form asks students what content they worked on).

Considerations:
  • How and when will teachers capture results in the SIS?
  • How do you count daily attendance for different class periods?
  • If using the LMS log-in option, what is the minimum amount of time a student needs to be logged in? (One way to implement this option is sketched after the table.)
  • If using a daily form, what question(s) will you ask? We recommend a low threshold: something a student who was present could answer regardless of their level of engagement.

Learning Environment: Virtual, Asynchronous without Internet Access

Definition: Student affirmatively accessed content; a student is absent only if they have not worked on any instructional content.

Capture options: Contact each student for whom the above guidance does not or cannot apply.

Considerations:
  • How will you know when a student does not have internet access and therefore needs to be called?
  • How do you contact students who may not have consistent cell service or a landline?
  • What time of day will you contact students or caregivers?
  • How many attempts does a teacher or staff member need to make per day before a student is marked absent?
  • How will you address unresponsive caregivers?
  • How will you count daily attendance for different class periods in middle and high school?
  • If students have multiple content teachers, who will reach out to students?
 

In the guidance, we also considered assignment submission as a potentially viable attendance construct. An equitable implementation of an assignment submission construct across all learning environments, however, would pose one unique challenge: Would a school district be willing to mark an in-person student absent for the day if the student failed to submit an assignment? While surmountable, this issue would be difficult to address in the short term.

As school districts finalize their attendance measurement plans, they will need to ensure that the measures they select are feasible and sustainable, for the individuals capturing attendance, over the full school year. This includes considering how much time daily attendance tracking will take for teachers and other staff members. Gathering ongoing feedback from teachers and staff about the attendance data collection process is key to ensuring reliable attendance tracking within a district.

 

We welcome individuals to reach out to NCRERN with additional recommendations or considerations. We are also interested in hearing how attendance is being measured in practice at school districts across the country. Connect with NCRERN via email at ncrern@gse.harvard.edu.


Katherine Kieninger, M.P.A. is the Ohio State Network Manager for the National Center for Rural Education Research Networks (NCRERN) at the Center for Education Policy Research at Harvard University.

David Hersh, J.D., Ph.D. is the Director of Proving Ground at the Center for Education Policy Research at Harvard University.

Jennifer Ash, Ph.D. is the Director of the National Center for Rural Education Research Networks (NCRERN) at the Center for Education Policy Research at Harvard University.

This is the second in a series of blog posts focusing on conducting education research during COVID-19. Other blog posts in this series include Conducting Education Research During COVID-19.

 

What National and International Assessments Can Tell Us About Technology in Students’ Learning: Technology Instruction, Use, and Resources in U.S. Schools

As schools and school districts plan instruction amid the current coronavirus pandemic, the use of technology and digital resources for student instruction is a key consideration.

In this post, the final in a three-part series, we present results from the NAEP TEL and ICILS educator questionnaires (see the first post for information about the results of the two assessments and the second post for the results of the student questionnaires). The questionnaires ask about the focus of technology instruction in schools, school resources to support technology instruction, and the use of technology in teaching practices.

It is important to note that NAEP TEL surveys the principals of U.S. eighth-grade students, while ICILS surveys a nationally representative sample of U.S. eighth-grade teachers.

Emphasis in technology instruction

According to the 2018 NAEP TEL principal questionnaire results, principals[1] of 61 percent of U.S. eighth-grade students reported that, prior to or in eighth grade, a lot of emphasis in information and communication technologies (ICT) instruction was placed on teaching students how to collaborate with others. In addition, principals of 51 percent of eighth-grade students reported that a lot of emphasis was placed on teaching students how to find information or data to solve a problem. In comparison, principals of only 10 percent of eighth-grade students reported that a lot of emphasis was placed on teaching students how to run simulations (figure 1).



According to the 2018 ICILS teacher questionnaire results, 40 percent of U.S. eighth-grade teachers reported a strong emphasis on the use of ICT instruction to develop students’ capacities to use computer software to construct digital work products (e.g., presentations). In addition, 35 percent of eighth-grade teachers reported a strong emphasis on building students’ capacities to access online information efficiently. In comparison, 17 percent reported a strong emphasis on developing students’ capacities to provide digital feedback on the work of others (figure 2).  



Resources at school

NAEP TEL and ICILS used different approaches to collect information about technology-related school resources. NAEP TEL asked about hindrances that limited schools’ capabilities to provide instruction in technology or engineering concepts. According to NAEP TEL, principals of 5 percent of U.S. eighth-grade students indicated that a lack or inadequacy of internet connectivity was a “moderate” or “large” hindrance in their schools. However, principals of 61 percent of eighth-grade students indicated that a lack of time due to curriculum content demands was a “moderate” or “large” hindrance. Principals of 44 percent of eighth-grade students indicated that a lack of qualified teachers was a “moderate” or “large” hindrance (figure 3).



ICILS asked about the adequacy of school resources to support ICT use in teaching. Eighty-six percent of U.S. teachers “agreed” or “strongly agreed” that technology was considered a priority for use in teaching. Nearly three-quarters of teachers “agreed” or “strongly agreed” that their schools had access to sufficient digital learning resources and had good internet connectivity (74 and 73 percent, respectively) (figure 4).



Use of technology in teaching

Teachers of U.S. eighth-grade students reported that they often used technology in their teaching practices. ICILS found that 64 percent of U.S. teachers regularly (i.e., “often” or “always”) used technology to present class instruction. Fifty-four percent of teachers regularly used technology to communicate with parents or guardians about students’ learning. In addition, 45 percent of teachers regularly used technology to provide remedial or enrichment support to individual or small groups of students, and a similar percentage (44 percent) regularly used technology to reinforce skills through repetition of examples (figure 5).



ICILS also reported results from U.S. eighth-grade teachers about how they collaborated on technology use. About three-quarters “agreed” or “strongly agreed” that they talked to other teachers about how to use technology in their teaching. Similarly, about three-quarters “agreed” or “strongly agreed” that they shared technology resources with other teachers in the school. More than half of the teachers “agreed” or “strongly agreed” that they collaborated with colleagues on the development of technology-based lessons.

Overall, the responses of principals and teachers suggested that schools placed varying degrees of emphasis on different aspects of technology instruction for eighth-grade students, that the majority of schools had sufficient digital resources and adequate internet access, and that teachers used technology across a range of teaching practices.

It should be noted that the data presented here were collected in 2018; any changes since then due to the coronavirus pandemic are not reflected in the results reported here. The NAEP TEL and ICILS samples both include public and private schools. The 2018 ICILS also included a principal questionnaire, but the questions are not directly related to the topics included in this blog. Data reported in the text and figures are rounded to the nearest integer.

 


By Yan Wang, AIR, and Taslima Rahman, NCES


[1] The unit of analysis for TEL principal responses is the student; percentages refer to the students whose principals gave a given response.