IES Conferences, Workshop/Training & Technical Assistance
Federal Committee on Statistical Methodology (FCSM)


Wednesday, October 25, 2023 | 8:30 AM

Session D-5: Approaches for Improving Survey Response: Perspectives on Incentives, Contact Strategies, and Refusal Conversion

Location: Vessey 2

You Didn’t Answer Our Survey, but What About this Text? Converting Hard-to-reach Respondents Through Text Messaging
Maura Spiegelman

As response rates decrease, new data collection methods may help persuade particularly hard-to-reach sample members to respond to surveys. In this experiment, we attempt to convert reluctant respondents by introducing text messages alongside other contact methods and by offering a short, easy-to-answer survey. This experiment was conducted during the 2022 Teacher Follow-up Survey to the 2020-21 National Teacher and Principal Survey, sponsored by the National Center for Education Statistics. Sample members who had not completed either a web survey or paper questionnaire after receiving 6 e-mails and 4 mailed packages were randomly assigned to one of three conditions: an additional (5th) mailed package; a text message with a web survey URL and user ID; or a text message inviting them to answer a short two-way SMS text survey, in lieu of the full questionnaire, by responding to texted yes/no questions. We examine whether introducing a new mode of contacting and surveying respondents late in the data collection process can persuade reluctant respondents to complete the questionnaire. In addition, we examine whether reluctant respondents who engage with the short two-way SMS survey can then be persuaded to complete the full survey.

Calling all Early Birds: Testing a Deadline-Limited Incentive in a Sequential Mixed-Mode Survey 
Michelle McNamara

This presentation will report the results of an “early bird incentive” experiment conducted as part of the 2023 National Household Education Survey (NHES). Sampled households assigned to this condition were offered an additional promised incentive ($20 cash) for responding before a specific date. This deadline was selected not only to incentivize early response but also to encourage response before the web-push survey’s eventual switch from web-only mailings to ones that included paper questionnaires. In addition to the typical cost savings associated with web response, the two-phase design of the NHES makes web response especially desirable. Web response to the first phase of the survey (a household screener) allows sampled respondents to move directly into the second phase of the survey (a detailed topical survey about the care and education of one of the household’s children). By contrast, paper response to the first phase requires a break in the response process to allow time for the data collector to conduct within-household sampling and mail the second-phase questionnaire. We will compare the early bird condition to a baseline web-push condition (identical except for the exclusion of the promised incentive offer) in terms of response rate, response timing, mode of response, and nonresponse bias.

Session D-6: Application of Respondent-Centered Establishment Survey Design Principles

Location: Room 0105

Respondent-Centered Establishment Survey Design Principles: An Overview 
Maura Spiegelman

This presentation lays out the four pillars of respondent-centered establishment survey design principles, including best practices for design decisions. These best practices have been developed by an interagency team of survey methodologists specializing in establishment survey testing. At the conclusion, we will present the Annual Integrated Economic Survey, which brings together seven annual Census Bureau surveys into one instrument, and the respondent-centered research agenda that has informed its design decisions.

Wednesday, October 25, 2023 | 1:45 PM

Session F-3: Advancements in Sexual Orientation and Gender Identity Measurement

Just to Confirm: Evaluating the Reliability and Validity of Survey Questions on Sex and Gender
Elise Christopher, David Richard, and Maura Spiegelman

Surveys may ask respondents to confirm their responses to sex assigned at birth and gender in order to minimize response errors; such errors can lead cisgender respondents to be falsely categorized as gender minorities, inflate population estimates, and yield inaccurate conclusions about transgender individuals. However, probing only those individuals who report a gender that is different from their sex assigned at birth to confirm this pair of answers can make transgender respondents feel singled out or “othered.” Asking the same confirmation of all respondents, including those who report the same answer for their sex and gender, does not necessarily reduce that stigma for transgender individuals, since they would not know that all respondents were asked to confirm their answers. We tested two separate confirmation questions: respondents were asked about their sex assigned at birth and to confirm that answer, then about their gender and to confirm that answer as well. Respondents were re-interviewed 3 weeks later to confirm their responses. We assess how frequently respondents changed an answer, whether their responses indicate they are cisgender or transgender, and whether they provided the same responses during re-interview. These findings will inform whether asking two confirmation questions of all respondents can increase data quality without causing respondents to feel singled out.

Session F-4: Democratizing Data: A Search and Discovery Platform for Public Data Assets

Location: Vessey 1

Forging New Partnerships: A Vision for Education Statistics
Josh DeLaRosa

Thursday, October 26, 2023 | 8:30 AM

Session H-5: Advancing the Federal Statistical Ecosystem TODAY! A ‘How-to’ Session for Supporting the FSS

Organizer: Jennifer Nielsen
Location: Vessey 2

FCSM: A Brief Overview of FCSM and Ways You Can Get Involved
Jennifer Nielsen

Thursday, October 26, 2023 | 3:30 PM

Session K-1: Advances on Federal SOGISC Data: The Federal Evidence Agenda, OMB Guidance, and the NASEM Report

Chair: Elise Christopher
Location: Chesapeake A

Session K-2: Collecting Race and Ethnicity in Establishment Surveys: Agency Methods and Results

Location: Chesapeake B

Public Schools’ Student and Teacher Race/Ethnicity Data: Findings from the School Pulse Panel
Rebecca Bielamowicz, Josue Delarosa, and Ryan Iaconelli

In response to the proposed revisions to SPD 15, the National Center for Education Statistics (NCES) surveyed school principals about how race and ethnicity data are collected on their students and teachers to assess schools’ ability to collect data in line with the proposed changes. Questions were fielded in February 2023 as a part of the School Pulse Panel (SPP) and asked principals about when student and teacher race/ethnicity data are collected, how it is reported, whether their information systems already include information about whether students and teachers are Middle Eastern or North African, and whether their information systems have detailed ethnic categories. This presentation will share these results from the SPP and consider the implications the proposed revisions may have on schools’ ability to collect and report these data.

Type: Conferences
Location: University of Maryland
College Park, MD
Dates: October 24-26, 2023
Organization: National Center for Education Statistics