IES Blog

Institute of Education Sciences

Addressing COVID-19’s Disruption of Student Assessment

Under an IES grant, the RAND Corporation, in collaboration with NWEA, is developing strategies for schools and districts to address the impacts of COVID-19 disruptions on student assessment programs. The goal is to provide empirical evidence of the strengths and limitations of strategies for making decisions in the absence of assessment data. Jonathan Schweig, Andrew McEachin, and Megan Kuhfeld describe early findings from surveys and structured interviews regarding key concerns of districts and schools. 

 

As a first step, we surveyed assessment and research coordinators from 23 school districts (from a sample of 100 districts) and completed follow-up interviews with seven of them on a variety of topics, including their district's re-entry scenario, the planning activities they were unable to perform this year because of coronavirus-related disruptions to spring 2020 assessments, and the strategies they were employing to support instructional planning in the absence of assessment data. While the research is preliminary and the sample of respondents is not nationally representative, the survey and interview responses identified two key concerns arising from the lack of spring 2020 assessment data, which has made it challenging to examine student or school status and change over time, especially as COVID-19 has differential impacts on student subgroups:

 

  • Making course placement decisions. Administrators typically rely on spring assessment scores—often in conjunction with other assessment information, course grades, and teacher recommendations—to make determinations for course placements, such as who should enroll in accelerated or advanced mathematics classes. 
  • Evaluating programs or district-wide initiatives. Many districts monitor the success of these programs internally by looking at year-to-year change or growth for schools or subgroups of interest. 

 

How are school systems responding to these challenges? Not surprisingly, the responses vary depending on local contexts and resources. Where online assessments were not feasible in spring 2020, some school districts used older testing data, either from the winter or from the previous school year, to make course recommendations. Some districts relaxed typical practice and gave individual schools more autonomy, relying on school staff to exercise local judgment about course placements using metrics like grades and teacher recommendations. Other districts reported projecting student scores from students' assessment histories. Some districts were already prepared for these decisions because they had recently experienced difficulties adopting an online assessment system and had addressed similar problems caused by large numbers of missing or invalid tests.
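The post does not say how districts projected scores, and methods surely varied. As a hedged illustration of the general idea only, the sketch below extrapolates a student's next score from a least-squares trend over prior administrations; the scores and the linear model are assumptions for illustration, not any district's actual procedure.

```python
# Hypothetical illustration of projecting a missing spring score from a
# student's assessment history. This is only a minimal least-squares
# trend, not a validated or district-specified approach.

from statistics import mean

def project_next_score(scores):
    """Fit a straight line to (test index, score) pairs and
    extrapolate one test administration ahead."""
    n = len(scores)
    if n == 0:
        raise ValueError("no assessment history")
    if n == 1:
        return scores[0]  # nothing to trend from; carry the score forward
    xs = range(n)
    x_bar, y_bar = mean(xs), mean(scores)
    slope = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, scores)) \
            / sum((x - x_bar) ** 2 for x in xs)
    intercept = y_bar - slope * x_bar
    return intercept + slope * n  # projected score for the missing test

# Made-up scores from four prior administrations, fall 2018-winter 2020:
history = [201, 206, 210, 213]
print(round(project_next_score(history), 1))  # -> 217.5
```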

 

School districts also raised concerns about whether assessments administered during the 2020-21 school year would be valid and comparable so that they could be used in student placement and program evaluation decisions. These concerns included the following:

  • Several respondents raised concerns about the trustworthiness of remote assessment data collected this fall and the extent to which results could be interpreted as valid indicators of student achievement or understanding.
  • Particularly for districts that started the 2020-21 school year remotely, respondents were concerned about student engagement and motivation and the possibility of students rushing assessments, running into technological or internet barriers, or seeking assistance from guardians or other resources. 
  • Respondents raised questions about the extent to which available assessment scores are representative of school or district performance as a whole. Given that vulnerable students (for example, students with disabilities, students experiencing homelessness) may be the least likely to have access to remote instruction and assessments, it is likely that the students who are not assessed this year are different from students who are able to be assessed.
  • Other respondents noted that they encountered resistance to fall assessment from parents, who prioritized student well-being (for example, safety, sense of community, and social and emotional well-being) over academics. This perspective resonates with recent findings from a nationally representative sample of teachers and school leaders drawn from RAND's American Educator Panel (AEP).

 

In the next phase of the work, the research team plans to:

  • Conduct a series of simulation and empirical studies regarding the most common strategies that the district respondents indicated they were using to make course placement decisions and to evaluate programs or district-wide initiatives.
  • Provide a framework to help guide local research on the intended (and unintended) consequences for school and school system decision making when standardized test scores are not available.

 

We welcome individuals to reach out to RAND with additional recommendations or considerations. We are also interested in hearing how districts are approaching course placement, accountability, and program evaluation across the country. Connect with the research team via email at jschweig@rand.org.

 


Jonathan Schweig is a social scientist at the nonprofit, nonpartisan RAND Corporation.

Andrew McEachin is a senior policy researcher at the nonprofit, nonpartisan RAND Corporation.

Megan Kuhfeld is a researcher at NWEA.

Webinar Recap: EdTech Resources for Special Education Practitioners

The COVID-19 pandemic has had, and continues to have, a profound effect on education. Students are adjusting to hybrid or fully remote learning, and educators are continuing to make complex decisions about how best to support students in the new normal.

On October 28, 2020, InnovateEDU and the Educating All Learners Alliance hosted a webinar focused on education technology resources for special education. More than 1,100 practitioners joined the event in real time.

 

The webinar featured video demonstrations of five special education technology tools developed through the IES Small Business Innovation Research Program and ED's Office of Special Education Programs' Educational Technology, Media, and Materials for Individuals with Disabilities Program. The event also included conversations with special education practitioners and researchers, who offered perspectives on the role of special education and technology in meeting the needs of all students, and it shared a variety of resources and opportunities.

 

During the webinar, practitioners participated by adding comments in the chat box with a “wish list” of education technology they would like to have now to support teaching and learning. Participants entered dozens of responses, many calling for increased connectivity and access to hardware and software, especially in rural areas. Other responses focused on education technologies for teachers, for students with or at risk for disabilities, and for parents and caregivers.

Following are just a few of the entries:

 

For Teachers

  • “More coaching tools to use with children who are learning remotely to provide instantaneous feedback”
  • “Descriptions that allow teachers to at-a-glance identify the features a program offers to match to the features that their students need”
  • “Using data to support teachers and students with decisions that move learning forward.”
  • “Resources that I can use to assist with non-compliant behaviors and keeping their attention in person and virtually.”
  • “Making it possible for students to show their work for math so that we can see that rather than just their answers.”
  • “Common share place for all teachers.”
  • “I am looking for a way to deliver instructions to the home distantly”

 

For Students with Disabilities

  • “Teaching students how to be self-determined learners.”
  • “Build this skill set from kindergarten.”
  • “Develop and implement collaborative activities”
  • “My nonverbal students need hands on.”
  • “Engagement and motivation; remote resources.”
  • “Student choice and voice.”

 

For Parents

  • “Make it a family affair / Zoom with family member supporting on other side.”
  • “A resource that we can use to incorporate the parent or group home worker that have to navigate these different learning apps for the student.”
  • “Easy-to-follow videos that we can use to show parents and students how to use these resources when they aren’t in front of us.”

 

Lastly, one teacher commented: “We need more of these events.” From everyone involved in the October 28 webinar, thanks for attending. We are planning more events like this one soon.

 


Edward Metz (Edward.Metz@ed.gov) is a research scientist at the Institute of Education Sciences in the US Department of Education.

Tara Courchaine (Tara.Courchaine@ed.gov) is a program officer at the Office of Special Education Programs in the US Department of Education.

Exploring How COVID-19 Affects Learning and Critical Thinking

Our nation continues to navigate a unique and challenging year due to the COVID-19 pandemic. In our first blog post in this series, we highlighted how educators, students, families, and researchers are adapting while trying to engage in opportunities to support learning. COVID-19 has created numerous challenges for education research, forcing many studies to be modified or put on hold. At the same time, new research questions are arising about the pandemic's impact on student learning, engagement, and achievement. Here, we highlight two IES-funded projects that are conducting timely and relevant research exploring the impact of COVID-19 on learning and critical thinking.

Guanglei Hong, Lindsey Richland, and their research team at the University of Chicago and the University of California, Irvine have received supplemental funds to build on their current grant, Drawing Connections to Close Achievement Gaps in Mathematics. The research team will conduct a study during the 2020-21 school year to explore the relationship between students' anxiety about the health risks associated with COVID-19 and their math learning experiences. They hypothesize that pressure and anxiety, such as those induced by COVID-19, draw on the same executive function resources that students need to engage in higher order thinking and reasoning during math instruction, negatively affecting their ability to learn. Through this study, the research team will also test whether particular instructional approaches reduce the effects of pressure and anxiety on learning. These findings will be useful for teachers and students in the near term as they navigate the COVID-19 pandemic and, in the longer term, for students who experience anxiety for a variety of other reasons.

In addition, IES has funded an unsolicited grant to Clarissa Thompson at Kent State University to investigate whether an education intervention aimed at decreasing whole number bias errors can help college-aged students and adults more accurately interpret health statistics about COVID-19. During the pandemic, the public receives daily updates about the number of people locally, nationally, and globally who are infected with and die from COVID-19. Beliefs about the risk of getting a disease are a key predictor of engagement in prevention behaviors. Understanding the magnitude of one's risk may require making sense of numerical health information, often presented in the form of rational numbers, such as fractions, whole number frequencies, and percentages. An intervention that decreases whole number bias errors and improves understanding of rational numbers offers the immediate and pressing benefit of helping people reason accurately about the risk of COVID-19 and other health threats. This skill is also critical for success in science, technology, engineering, and mathematics (STEM) fields.
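To make the underlying arithmetic concrete (our illustration with made-up numbers; the post does not describe the intervention's materials): whole number bias means attending to raw counts while neglecting denominators, and normalizing each count to a common scale, such as a percentage, removes that trap.

```python
# Illustrative only: made-up case counts showing the denominator-neglect
# arithmetic that whole number bias gets wrong.

def rate_per_100(cases, population):
    """Express a raw frequency as a percentage so that counts with
    different denominators can be compared on one scale."""
    return 100 * cases / population

county_a = (300, 100_000)  # 300 cases among 100,000 residents
county_b = (30, 1_000)     # 30 cases among 1,000 residents

# Focusing on the whole numbers alone (300 > 30) suggests County A is
# riskier; normalizing the rates shows the opposite.
print(f"County A: {rate_per_100(*county_a):.1f}%")  # 0.3%
print(f"County B: {rate_per_100(*county_b):.1f}%")  # 3.0%
```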

Both of these projects offer opportunities to better understand learning and critical thinking in the midst of the pandemic. They will also provide the field with generalizable information about ways to improve learning in STEM fields. Stay tuned for more COVID-19 related education research discussions as we continue this series on our blog.

 


Written by Christina Chhin (christina.chhin@ed.gov) and Erin Higgins (erin.higgins@ed.gov), National Center for Education Research (NCER).

This is the third in a series of blog posts focusing on conducting education research during COVID-19. Other blog posts in this series include Conducting Education Research During COVID-19 and Measuring Attendance during COVID-19: Considerations for Synchronous and Asynchronous Learning Environments.

 

Measuring Attendance during COVID-19: Considerations for Synchronous and Asynchronous Learning Environments

The National Center for Rural Education Research Networks (NCRERN) is an IES-funded R&D Center that has established a continuous improvement network of 50 rural districts in New York and Ohio. The purpose of the Network is to build the capacity of rural school districts and supporting state agencies to use their own data to improve the education of their students. Districts are currently tackling the problem of student absenteeism by piloting, evaluating, and improving various interventions. Katherine Kieninger, David Hersh, and Jennifer Ash describe how the Network is approaching the problem of measuring attendance during COVID-19, taking the various learning environments into consideration.

 

NCRERN has been working to develop a viable attendance construct because districts and schools are currently struggling with how to define and track attendance for remote or blended learning models. When students are not physically present, the typical observe-and-log model of attendance tracking is not an option. However, not tracking attendance is not an option either, given the importance of attendance for identifying at-risk students, predicting key student outcomes, and serving during the pandemic as a proxy for students' general safety and well-being.

We considered several possible attendance constructs and assessed the degree to which each met the following criteria. First, a viable construct should be measurable equitably across all students and learning environments, including in-person instruction, synchronous and asynchronous virtual environments with internet access, and asynchronous environments without internet access. The attendance construct should also be simple to understand, easy to capture, and quick to collect. Finally, access to technology and to reliable, low-cost, high-speed internet must be considered, especially in rural areas lacking such infrastructure.

We concluded that tracking student exposure to instructional content best meets these criteria, as seen in the table below. While not without its own challenges, exposure to content is the least complicated option, can be tracked consistently across learning environments, and is the closest in principle to what in-person attendance captures.

 

[Table: the six candidate attendance constructs (In-Person Attendance, Exposure to Instructional Content, Participation, Assignment Submission, Engagement, and Mastery) rated against seven criteria: Simple, Easy to Capture, All Students, High Frequency, Reliable & Valid, Consistent Across Grade Levels, and Consistent Across Virtual or In-Person. Exposure to Instructional Content satisfied these criteria best.]
 

In guidance provided to Network districts, we use the table below to define exposure to content in each learning environment, suggest options for capturing it, and provide a non-exhaustive list of considerations for school district stakeholders. Districts should recognize that a student can move between learning environments. For example, an in-person student who is quarantined but healthy enough to continue classwork becomes a virtual learner; depending on the student's home context, this could place the student in any of the three virtual environments. Creating a plan for seamless attendance tracking across learning environments is key to measuring attendance with fidelity.

 

Attendance Construct: Exposure to Instructional Content

Learning Environment: In-Person
Definition: Student is present.
Capture options: Student Information System (SIS).
Considerations:
  • Will in-person students be able to log in for remote learning if they are not able to come to school (for example, a student who must miss school for an extended period, such as a quarantine)?

Learning Environment: Virtual, Synchronous
Definition: Student is present for the virtual class.
Capture options: Student Information System (SIS).
Considerations:
  • Can you avoid concurrent classes for students in the same family?
  • If a student loses internet access, do you have an asynchronous back-up option for course content?

Learning Environment: Virtual, Asynchronous with Internet Access
Definition: Student affirmatively accessed content.
Capture options: Learning Management System (LMS) log-in with a minimum time threshold, or completion of a daily form asking students what content they worked on.
Considerations:
  • How and when will teachers capture results in the SIS?
  • How do you count daily attendance for different class periods?
  • If using the LMS log-in option, what is the minimum amount of time a student needs to be logged in?
  • If using a daily form, what question(s) will you ask? We recommend a low threshold, equivalent to something a student who was present could answer regardless of their level of engagement.

Learning Environment: Virtual, Asynchronous without Internet Access
Definition: Student affirmatively accessed content; a student is absent only if they have not worked on any instructional content.
Capture options: Contact each student for whom the above guidance does not or cannot apply.
Considerations:
  • How will you know that a student does not have internet access and therefore needs a call?
  • How do you contact students who may not have consistent cell service or a landline?
  • What time of day will you contact students or caregivers?
  • How many attempts does a teacher or staff member need to make per day before a student is marked absent?
  • How will you address unresponsive caregivers?
  • How will you count daily attendance for different class periods in middle and high school?
  • If students have multiple content teachers, who will reach out to students?
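
To make these capture rules concrete, here is a minimal sketch of a daily presence check that applies a different rule in each learning environment. The record fields, the 30-minute LMS threshold, and the example data are our own illustrative assumptions, not NCRERN's specification.

```python
# Hedged sketch of the exposure-to-content construct across the four
# learning environments above. Field names, the 30-minute LMS threshold,
# and the record layout are illustrative assumptions.

MIN_LMS_MINUTES = 30  # assumed threshold; each district would set its own

def present_today(record):
    """Return True if the student counts as present under the
    exposure-to-content rule for their current learning environment."""
    env = record["environment"]
    if env == "in_person":
        return record["sis_present"]  # usual observe-and-log in the SIS
    if env == "virtual_synchronous":
        return record["attended_virtual_class"]  # present for the live class
    if env == "virtual_async_internet":
        # Enough logged LMS time, or an affirmative daily form, counts.
        return (record.get("lms_minutes", 0) >= MIN_LMS_MINUTES
                or record.get("daily_form_submitted", False))
    if env == "virtual_async_no_internet":
        # Absent only if outreach confirms no work on any content.
        return record["contact_confirmed_worked_on_content"]
    raise ValueError(f"unknown environment: {env}")

# Example: an in-person student quarantined mid-week shifts to the
# asynchronous-with-internet rule without a gap in tracking.
student = {
    "environment": "virtual_async_internet",
    "lms_minutes": 12,
    "daily_form_submitted": True,
}
print(present_today(student))  # True: the daily form counts despite low LMS time
```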

 

In the guidance, we also considered assignment submission as a potentially viable attendance construct. Implementing an assignment submission construct equitably across all learning environments, however, would raise one unique challenge: would a school district be willing to mark an in-person student absent for the day if the student failed to submit an assignment? While surmountable, this issue would be challenging to address in the short term.

As school districts finalize their attendance measurement plans, they will need to ensure that the selected measures are feasible and sustainable for the individuals capturing attendance for the duration of the school year. This includes considering how much time daily attendance tracking will require of teachers and other staff members. Gathering feedback from teachers and staff about the ongoing process of collecting attendance data is key to ensuring reliable attendance tracking within a district.

 

We welcome individuals to reach out to NCRERN with additional recommendations or considerations. We are also interested in hearing how attendance is being measured in practice at school districts across the country. Connect with NCRERN via email at ncrern@gse.harvard.edu.


Katherine Kieninger, M.P.A. is the Ohio State Network Manager for the National Center for Rural Education Research Networks (NCRERN) at the Center for Education Policy Research at Harvard University.

David Hersh, J.D., Ph.D. is the Director of Proving Ground at the Center for Education Policy Research at Harvard University.

Jennifer Ash, Ph.D. is the Director of the National Center for Rural Education Research Networks (NCRERN) at the Center for Education Policy Research at Harvard University.

This is the second in a series of blog posts focusing on conducting education research during COVID-19. Other blog posts in this series include Conducting Education Research During COVID-19.