
IES Grant

Title: Assessing Teacher Applicants: What Can be Learned About Inservice Teacher Quality and Retention by Applying AI to Applicant Short-Answer Questions
Center: NCER Year: 2022
Principal Investigator: Sojourner, Aaron Awardee: W.E. Upjohn Institute for Employment Research
Program: Teaching, Teachers, and the Education Workforce      [Program Details]
Award Period: 4 years (07/01/2022 – 06/30/2026) Award Amount: $1,378,946
Type: Exploration Award Number: R305A220479
Description:

Previous Award Number: R305A220228
Previous Awardee: University of Minnesota

Co-Principal Investigators: Goldhaber, Dan; Mykerezi, Elton; Dachille, Lauren; Grout, Cyrus

Purpose: The importance of teachers to student success is hard to overstate. Improving schools' ability to select more effective teachers is an obvious avenue for improving their workforce. This exploration study will contribute to a small but promising body of recent research on teacher selection, which shows that districts have a choice among applicants and can obtain information during the application process that is predictive of both teacher effectiveness and retention. One widely used selection practice that has not been examined empirically is eliciting information from applicants via written responses to short-answer questions about various aspects of their teaching style, classroom management, and views on student learning and family engagement. In this project, researchers will use unique applicant data from multiple school districts and Charter Management Organizations (CMOs) that use the same applicant tracking system provided by Nimble (a data collection firm specializing in hiring data). The applicant data will be linked with administrative data on teacher and student outcomes to better understand the connections between information derived from applicants' written responses to short-answer questions, district hiring decisions, and the outcomes of teachers and their students.

Project Activities: The data for this analysis will come from teacher applicants to 3 urban school districts and 4 CMOs partnering with Nimble. The team will use advances in Artificial Intelligence (AI) to generate quantitative applicant measures from the short-answer responses that applicants submit to school districts and will combine these measures with in-service teacher and student outcomes. Specifically, researchers will measure response length, clarity/readability of text, and a grammatical score based on spelling and grammar rules. Additionally, the content will be interpreted through Natural Language Processing (NLP) techniques that can identify the topics of discussion in unstructured text, classify objects of discussion into categories, and detect affect (positive vs. negative feelings expressed) and the concreteness of the language used. These indexes will then be related to data on student success, teacher performance ratings, teacher turnover, and applicant job offers. Subsample analysis will examine whether information from these questions predicts success among students of color differently and whether it shows significant differences across teacher applicant demographic groups. The analysis will show the degree to which the information in the short-answer responses influences the hiring process and whether it is predictive of success (in terms of improving student outcomes and/or improving equity).
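The abstract does not name specific tools, but a minimal sketch of this kind of text-feature extraction might look like the following, using the textstat and vaderSentiment Python packages for readability and affect. The function name, the feature set, and the sample response are illustrative assumptions, not the project's actual pipeline.

```python
# Illustrative sketch: extract simple language-use features from one applicant
# response. Library choices (textstat, VADER) and all names here are
# assumptions; the abstract does not specify the project's NLP tooling.
import textstat
from vaderSentiment.vaderSentiment import SentimentIntensityAnalyzer

_analyzer = SentimentIntensityAnalyzer()

def applicant_text_features(response: str) -> dict:
    """Return simple length, readability, and affect scores for one response."""
    return {
        "length_words": len(response.split()),                   # response length
        "readability": textstat.flesch_reading_ease(response),   # clarity proxy
        "affect": _analyzer.polarity_scores(response)["compound"],  # pos. vs. neg.
    }

# Example: score a single hypothetical short-answer response.
print(applicant_text_features(
    "I set clear routines early in the year so students know what to expect."
))
```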

Products: The research team will produce research reports, conference presentations, and peer-reviewed journal articles that (1) describe short-answer questions and their use and demonstrate how to use advances in AI to extract meaningful quantitative information from large numbers of applicant text responses, and (2) describe the relationship between information extracted from these questions and student success, teacher performance ratings, teacher turnover, and applicant job offers. Additionally, the NLP programs and algorithms used will be made available for future research.

Structured Abstract

Setting: This study will use teacher applicant, teacher, and student data from 7 partnering organizations: the School District of Palm Beach County, FL (serving approximately 200,000 students); Indianapolis Public Schools (nearly 32,000 students); Saint Louis Public Schools (nearly 24,000 students); and 4 CMOs (Breakthrough Charter Schools, OH; Compass Rose Public Schools, TX; ICEF Public Schools, CA; Ace Charter Schools, CA) jointly serving about 10,000 students.

Sample: The applicant-outcome linked data will come from 7 partnering organizations that annually collect information from about 17,000 teacher applicants to fill approximately 2,700 job vacancies. Over the course of the study, researchers will observe 3-5 applicant cohorts per partner (the number depending on how long each partner has used Nimble), corresponding to the outcomes of about 8,500 teachers. Approximately 2,000 of these teachers will be responsible for student progress in math or reading.

Factors: The primary focus of analysis is the unstructured text data from applicants' written responses to short-answer questions at the interview stage. There are 20 such questions, addressing issues such as the applicant's classroom management style, philosophy on teaching and learning, and how they engage students and create an inclusive environment.

Research Design and Methods: First, the research team will analyze each question answered by teacher applicants for language use and content. Language-use measures will be based on response length, readability, and grammatical scoring. Content-analysis measures will be created using various Artificial Intelligence (AI) techniques that can identify topics in unstructured text, classify objects or topics of discussion into categories, detect positive and negative affect in written language, and score language for concreteness (vs. abstractness). Researchers will use ordinary least squares (OLS) and logit regressions to link these measures to teacher and student outcomes.
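As a sketch of how these regressions might be specified, the snippet below uses the statsmodels formula API: OLS for a continuous outcome and logit for a binary one. The data file and every variable name are hypothetical stand-ins for the linked applicant-outcome data.

```python
# Illustrative regression step: OLS for a continuous outcome (a teacher
# performance rating) and logit for a binary one (receiving a job offer),
# each regressed on text-derived measures. File and column names are
# hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("applicant_outcomes.csv")  # hypothetical linked analysis file

# OLS: performance rating on language-use and content measures.
ols = smf.ols(
    "performance_rating ~ length_words + readability + affect + topic_index",
    data=df,
).fit()

# Logit: probability of a job offer, same predictors.
logit = smf.logit(
    "offer ~ length_words + readability + affect + topic_index",
    data=df,
).fit()

print(ols.summary())
print(logit.summary())
```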

Control Condition: Due to the nature of the study, there is no control condition.

Key Measures: Researchers will use the various measures of teacher responses as primary variables of interest and will relate these to outcome measures including student test achievement, year-to-year change on non-test outcomes, teacher performance ratings, teacher retention, and applicant job offers.

Data Analytic Strategy: Student outcome regressions will follow a specification in which students' current values of test and non-test outcomes are regressed on prior values and student characteristics, as well as on the information from the short-answer questions and other teacher observables. Job offer, teacher performance, and teacher retention models will control for applicant response information and other teacher applicant/teacher, school, and district/CMO characteristics. Finally, subgroup analysis will interact the indexes describing the short-answer responses with student and teacher demographics to test for heterogeneous effects.
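One way to read this specification is as a standard value-added model with subgroup interactions, sketched below in statsmodels formula notation. Every variable name is a hypothetical stand-in, and clustering standard errors by teacher is an assumption on our part, not a detail stated in the abstract.

```python
# Sketch of the student-outcome specification: current score on prior score,
# student characteristics, a short-answer index, and a teacher observable.
# The text_index:C(student_group) interaction implements the subgroup
# heterogeneity test. All names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("student_teacher_linked.csv")  # hypothetical linked file

model = smf.ols(
    "score ~ lag_score + C(student_group) + text_index"
    " + text_index:C(student_group) + teacher_experience",
    data=df,
).fit(cov_type="cluster", cov_kwds={"groups": df["teacher_id"]})

print(model.summary())
```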

Related IES Projects: Applicants at the Doorstep: Improving Hiring Practices through a Better Understanding of the Link Between Applicant Information and Teacher Quality (R305H130030); Research Partnership to Improve Human Resource Management in Minneapolis Public Schools (R305H160036)

