
Innovative Data Analysis Helps Boston Learn from Teachers’ Early Pandemic Experiences

Joshua Cox
Senior Research Associate, REL Northeast & Islands

Dr. Meg Caven
Senior Research Associate, REL Northeast & Islands

February 25, 2021

This past summer, REL Northeast & Islands had the opportunity to apply an innovative data analysis method—natural language processing—to inform remote teaching at Boston Public Schools in the fall. REL staff used the method to analyze teacher survey responses about the transition to remote learning in the spring. The approach reduced the amount of time needed for data analysis and could prove a promising method for examining open-ended survey data in the field of education when speed is essential.

The Challenge

After the abrupt transition to remote learning this past spring due to the COVID-19 pandemic, Boston Public Schools (BPS) district leaders wanted to learn from the experiences of their 6,200+ teachers and instructional staff. Each teacher, administrator, and staff member was sent a survey that asked them to reflect on their experiences, including effective distance learning strategies, barriers to engaging with students and families, and their professional development needs.

When the BPS research team was ready to analyze the data, however, they hit a roadblock. More than 1,700 respondents had opted to include short, narrative answers to open-ended questions. The BPS team would need to code thousands of these responses to analyze the dataset, and BPS Director of Research Apryl Clarkson and her team wanted to share the survey results with district leaders and administrators before the start of the new school year in the fall.

“We wanted to learn: What did we do well? What could we improve?” Clarkson says. “But when we received the volume of responses, we realized that we didn’t have the capacity to review them in a timely manner.”

A Novel Data Analysis Technique to the Rescue

BPS issued a call for support from partners, including REL Northeast & Islands, and we responded. When we assessed the situation, we were concerned that we could not process the data quickly enough using standard analysis methods. So, we decided to try a fresh approach: natural language processing (NLP), a data analysis technique that can sift through large amounts of text.

An application of machine learning, NLP aims to decipher and understand human language—a difficult undertaking since human language is full of ambiguity and abstraction. NLP decodes words and concepts, as well as the connections between them. It uses algorithms to identify syntax (the structure of a sentence) as well as semantics (the meaning). In this way, NLP can help identify common themes across large numbers of open-ended responses. NLP is commonly used to comb through website content and social media posts and to analyze feedback responses in the corporate world, but it is a novel approach in the education domain.
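
To make this concrete, here is a minimal sketch of what identifying syntax and semantics can look like in Python. It assumes the spaCy library and a hypothetical survey response; the post does not specify which NLP tooling the team actually used.

```python
# A minimal sketch of syntactic and semantic parsing with spaCy.
# spaCy is an assumption for illustration; the post does not name
# the specific NLP library the team used.
import spacy

# Requires the small English model: python -m spacy download en_core_web_sm
nlp = spacy.load("en_core_web_sm")

# A hypothetical open-ended survey response
response = "Short instructional videos helped my students stay engaged during remote lessons."
doc = nlp(response)

# Syntax: part-of-speech tags and dependency relations for each token
for token in doc:
    print(f"{token.text:15} {token.pos_:6} {token.dep_:10} head={token.head.text}")

# Semantics: noun phrases hint at the concepts a response is about
print([chunk.text for chunk in doc.noun_chunks])
```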

Since many of the teachers’ responses were likely to be similar if not identical, we thought this method would be especially helpful in analyzing the BPS survey data. We began analysis by mapping the nine survey questions to five critical questions of interest to BPS. Next, we wrote code in Python, a multi-purpose programming language, to analyze the survey data. This method allowed us to turn the analysis around four weeks after receiving the data, cutting the analysis time in half or more.
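
The post confirms the analysis was written in Python but does not describe the specific technique, so the sketch below shows one plausible approach: TF-IDF vectorization followed by non-negative matrix factorization (a common topic-modeling method) in scikit-learn to surface recurring themes across open-ended responses. The file name, column name, and topic count are illustrative assumptions, not details from the project.

```python
# A hedged sketch of theme discovery over open-ended survey responses,
# assuming a TF-IDF + NMF topic-modeling approach with scikit-learn.
# The CSV path, column name, and topic count are illustrative assumptions.
import pandas as pd
from sklearn.decomposition import NMF
from sklearn.feature_extraction.text import TfidfVectorizer

responses = (
    pd.read_csv("bps_survey_responses.csv")      # hypothetical export
      .dropna(subset=["open_response"])          # keep rows with narrative answers
)

# Convert text to TF-IDF features, dropping very rare and very common terms
vectorizer = TfidfVectorizer(stop_words="english", min_df=5, max_df=0.9)
X = vectorizer.fit_transform(responses["open_response"])

# Factor the matrix into a small number of latent "themes"
n_topics = 8                                     # assumed; tuned in practice
nmf = NMF(n_components=n_topics, random_state=0)
weights = nmf.fit_transform(X)

# Print the top terms for each theme so analysts can label them
terms = vectorizer.get_feature_names_out()
for i, component in enumerate(nmf.components_):
    top = [terms[j] for j in component.argsort()[-10:][::-1]]
    print(f"Theme {i + 1}: {', '.join(top)}")

# Tag each response with its strongest theme for later review
responses["theme"] = weights.argmax(axis=1) + 1
```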

We coupled NLP with a more traditional qualitative analysis to both confirm our findings and investigate peripheral themes. After using NLP to identify themes arising from the data, team members revisited the original survey data for context and to confirm that teachers’ responses reflected those themes. Additionally, we used qualitative analysis to more deeply investigate topics of interest that weren’t directly asked about in the district surveys. For example, BPS was interested in how teachers thought about equity in relation to remote learning and reopening. Responses that touched on equity were fewer in number, scattered throughout the dataset, and used a wide variety of language to address the issue, making analysis with NLP more difficult. Instead, we used a refined list of keywords to identify relevant open-ended responses and employed an inductive approach to distill findings related to equity.
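
For the equity analysis specifically, the keyword-filtering step might look like the sketch below, which uses pandas with an illustrative keyword list rather than the team's actual refined list, and assumes the same hypothetical file and column names as above.

```python
# A minimal sketch of keyword-based filtering to pull equity-related
# responses for hand coding. The keyword list is illustrative; the
# team's refined list is not published in the post.
import re
import pandas as pd

equity_keywords = ["equity", "equitable", "access", "inequity", "disparity", "fair"]
pattern = r"\b(" + "|".join(re.escape(k) for k in equity_keywords) + r")"

responses = pd.read_csv("bps_survey_responses.csv").dropna(subset=["open_response"])

# Flag responses that mention any equity-related term (case-insensitive)
mask = responses["open_response"].str.contains(pattern, case=False, regex=True)
equity_responses = responses[mask]

# Export the subset for the inductive qualitative coding described above
equity_responses.to_csv("equity_responses_for_coding.csv", index=False)
print(f"{len(equity_responses)} of {len(responses)} responses flagged for review")
```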

Critical Lessons in Times of Crisis

The data revealed valuable information about the unique spring semester. Teachers reported which practices were effective, such as blending synchronous learning opportunities to foster positive connections with students and teacher-created instructional videos to deliver content. They also described significant barriers to instruction, such as a lack of engagement with some students, disparities in students’ access to technology, and difficulties communicating with parents.

In addition, the survey uncovered some unintended consequences of district policies. To promote equity across the district, BPS had switched to a no-grading policy when students moved to a remote learning model in the spring. But some teachers felt that there were downsides to this approach.

The goal of the policy was to guard against unfairly punishing students by lowering their grades or holding them back a year when structural constraints, rather than effort or merit, influenced students’ ability to learn remotely. However, some teachers felt that students understood the policy to mean that virtual learning did not matter and that their participation and engagement were inconsequential. Some teachers also believed they had no way to incentivize students to do their classwork.

Other policies intended to promote equity also came with tradeoffs. In an effort to preserve students’ privacy in their homes, the district did not require students to turn their cameras on. Without seeing students’ faces, many teachers struggled to assess whether students were engaged in the lessons. According to Clarkson, results of these analyses have urged BPS to rethink “how [...we] measure student engagement and participation through this technology. For example, if a student’s camera is off, does that mean that the student is not engaged?”

Clarkson and her research team shared the survey findings with district leaders in a series of seminars. One of the topics they explored was expanding the definition of attendance in a virtual environment to reflect whether students are submitting assignments and logging into class portals during asynchronous learning times. Clarkson also worked with the Offices of Instructional Technology and Legal Advisors to understand how to ensure and protect student privacy while engaging in remote learning.

“This survey [analysis] is showing that research isn’t just used for academic purposes,” Clarkson says. “So, how do we leverage that idea, and get research used more frequently—quite literally—in times of crisis?”

Our experience working with BPS reinforces what we already know: educational survey analysis can be a vital tool in helping districts adapt to rapidly changing conditions. There are likely many districts that face a similar dilemma of how to analyze their open-ended response data. NLP, especially when paired with traditional approaches to qualitative analysis, can potentially deliver relevant and nuanced findings in a short amount of time.