
REL Appalachia Ask A REL Response

Data Use, Family and Community Engagement, Research Tools

June 2019

Question

What are best practices in the development and administration of customer surveys, including those that can be used by state education agencies to survey families, school and district leaders, and teachers?

Response

Thank you for your request to our REL Reference Desk regarding evidence-based information about the development and administration of customer surveys by state education agencies. Ask A REL is a collaborative reference desk service provided by the 10 Regional Educational Laboratories (RELs) that, by design, functions in much the same way as a technical reference library. Ask A REL provides references, referrals, and brief responses in the form of citations in response to questions about available education research.

Following an established REL Appalachia research protocol, we searched for peer-reviewed articles and other research reports on survey development and administration. We focused on identifying resources that specifically address customer surveys that state education agencies could use, as well as examples of states that are currently administering large-scale surveys. The sources included ERIC and other federally funded databases and organizations, research institutions, academic research databases, and general Internet search engines. For more details, please see the methods section at the end of this document.

The research team did not evaluate the quality of the resources provided in this response; we offer them only for your reference. Also, the search included the most commonly used research databases and search engines to produce the references presented here, but the references are not necessarily comprehensive, and other relevant references and resources may exist. References are listed in alphabetical order, not necessarily in order of relevance.

Research References

Gross, B., & Jochim, A. (Eds.). (2015). The SEA of the future: Building agency capacity for evidence-based policymaking. San Antonio, TX: Building State Capacity and Productivity Center at Edvance Research, Inc. Retrieved from https://eric.ed.gov/?id=ED562509.

From the abstract:
‘The SEA of the Future’ is an education publication series examining how state education agencies can shift from a compliance to a performance-oriented organization through strategic planning and performance management tools to meet growing demands to support education reform while improving productivity. This volume, the fifth in the series, draws on the experiences of state agency staff from Massachusetts, Michigan, and Tennessee, as well as the work of the Regional Comprehensive Centers, to explore how state education agencies can bolster their ability to use research and data to drive key spending, policy, and program decisions. Volume 5 includes practical tools state agencies can use to assist with research and data. Following an introduction by Ashley Jochim, this report contains the following essays: (1) Better Policy through Research: Pursuing High-Impact Research in State Education Agencies (Carrie Conaway)—features a sample policy analyst job description and a research office organizational chart; (2) Making Research Matter for the SEA (Nathaniel Schwartz)—describes how to build a research team within the SEA, and provides a framework and practical guidance on how research fits in the policy cycle; (3) Building Productive Research Partnerships (Venessa Keesler)—includes a case study of a research partnership: the Michigan Consortium for Educational Research; and (4) Technical Assistance to Support Evidence-Based Policymaking: A Conversation with the Regional Comprehensive Centers (facilitated by Bethany Gross)—provides a sample blueprint for states looking to create a research office, and Comprehensive Center leaders Kathleen Dempsey, Caitlin Howley, and Paul Kohler discuss their experiences in building evidence-based policy.

Irwin, C. W., & Stafford, E. T. (2016). Survey methods for educators: Collaborative survey development (part 1 of 3) (REL 2016–163). Washington, DC: U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance, Regional Educational Laboratory Northeast & Islands. Retrieved from https://eric.ed.gov/?id=ED567751.

From the abstract:
This guide describes a five-step collaborative process that educators can use with other educators, researchers, and content experts to write or adapt questions and develop surveys for education contexts. This process allows educators to leverage the expertise of individuals within and outside of their organization to ensure a high-quality survey instrument that meets the policy or practice goals of the organization. Examples from collaborative survey development projects are highlighted for each step. The five-step collaborative survey development process is: (1) Step 1: Identify topics of interest; (2) Step 2: Identify relevant, existing survey items; (3) Step 3: Draft new survey items and adapt existing survey items; (4) Step 4: Review draft survey items with stakeholders and content experts; and (5) Step 5: Refine the draft survey with pretesting using cognitive interviewing. This guide is the first in a three-part series of survey method guides for educators. This guide covers survey development. The second guide in the series covers sample selection and survey administration, and the third guide in the series covers data analysis and reporting. The following are appended: (1) Additional survey development resources; (2) Sample table of specifications: Excerpt from the Early Childhood Education Research Alliance collaborative survey development project; (3) Sample analysis plan: Excerpt from the Northeast Rural Districts Research Alliance collaborative survey development project; (4) Sample feedback form: Excerpt from the Early Childhood Education Research Alliance collaborative survey development project; (5) Sample cognitive interview protocol from the Northeast Rural Districts Research Alliance collaborative survey development project; and (6) Sample cognitive interview analysis codebook from the Northeast Rural Districts Research Alliance collaborative survey development project.

Merrill, L., Lafayette, C., & Goldenberg, S. (2018). Redesigning the annual NYC school survey: Lessons from a research-practice partnership. New York, NY: The Research Alliance for New York City Schools. Retrieved from https://steinhardt.nyu.edu/scmsAdmin/media/users/ks191/Compendium/Lessons_from_a_Research-Practice_Partnership.pdf.

From the abstract:
The process of redesigning the NYC School Survey has been highly collaborative. The work has exhibited the shared learning and ongoing exchange of knowledge that is characteristic of a strong research-practice partnership. Throughout the multi-year (and still ongoing) effort, researchers and practitioners have worked together to identify and build useful measures of important school capacities. Researchers have had to be responsive to the district's priorities and timelines to build a useful instrument. In turn, district leaders have given heed to the researchers' guidance about building a coherent set of instruments that are grounded in sound theory and prior evidence. The result has been a stronger set of measures to identify schools' strengths and weaknesses. Just as important, the partnership has established a process for continuing to assess the quality of the measures and modify them as needed over time.
While the redesign effort has been successful in many ways, it has also encountered challenges, especially in balancing the theoretical and methodological ambitions of the researchers against the practical and operational realities of a large, complex and heterogeneous school system. This document draws on our experience to help other school districts anticipate some of these challenges. We share lessons learned to date and highlight questions that remain about developing school surveys and using their results to inform research, policy and practice.

Pazzaglia, A. M., Stafford, E. T., & Rodriguez, S. M. (2016). Survey methods for educators: Selecting samples and administering surveys (part 2 of 3) (REL 2016–160). Washington, DC: U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance, Regional Educational Laboratory Northeast & Islands. Retrieved from https://eric.ed.gov/?id=ED567752.

From the abstract:
This guide describes a five-step collaborative process that educators can use with other educators, researchers, and content experts to write or adapt questions and develop surveys for education contexts. This process allows educators to leverage the expertise of individuals within and outside of their organization to ensure a high-quality survey instrument that meets the policy or practice goals of the organization. Examples from collaborative survey development projects are highlighted for each step. The five-step collaborative survey development process is: (1) Step 1: Identify topics of interest; (2) Step 2: Identify relevant, existing survey items; (3) Step 3: Draft new survey items and adapt existing survey items; (4) Step 4: Review draft survey items with stakeholders and content experts; and (5) Step 5: Refine the draft survey with pretesting using cognitive interviewing. This guide is the second in a three-part series of survey method guides for educators. This guide covers sample selection and survey administration. The first guide in the series covers survey development, and the third guide in the series covers data analysis and reporting. The following are appended: (1) Additional survey sampling and administration resources; (2) Using Microsoft Excel to obtain a random sample; (3) Sample survey invitation; and (4) Sample survey reminder.
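The appendix item on using Microsoft Excel to obtain a random sample points at a common practical need. For readers working outside Excel, the following is a minimal, illustrative Python sketch of the same idea: simple random sampling without replacement from a survey roster. The file name, column name, and sample size are hypothetical, and this is a sketch rather than the guide's own procedure.

    import csv
    import random

    def simple_random_sample(roster_path, n, seed=2016):
        """Draw a simple random sample of n rows from a roster CSV.
        Fixing the seed makes the draw reproducible, so the sampled
        list can be regenerated and audited later."""
        with open(roster_path, newline="") as f:
            roster = list(csv.DictReader(f))
        if n > len(roster):
            raise ValueError("Requested sample exceeds roster size.")
        return random.Random(seed).sample(roster, n)

    # Hypothetical usage: sample 200 teachers from a district roster.
    # sampled = simple_random_sample("teacher_roster.csv", n=200)
    # print([row["teacher_id"] for row in sampled])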

Rudo, Z. H., & Partridge, M. A. (2016). Leadership characteristics and practices in South Carolina charter schools (REL 2017–188). Washington, DC: U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance, Regional Educational Laboratory Southeast. Retrieved from https://eric.ed.gov/?id=ED570168.

From the abstract:
Charter school stakeholders in South Carolina, including officials at the South Carolina Department of Education, personnel at the Public Charter School Alliance of South Carolina, and leaders of South Carolina charter schools, expressed interest in understanding the leadership characteristics and practices of charter school leaders across the state. Stakeholders were especially interested in how charter school leaders spend their work hours, what challenges the leaders face, and who influences policies in the charter schools. Regional Educational Laboratory Southeast helped develop an online survey of characteristics and practices that was administered by the South Carolina Department of Education to leaders of all charter schools in South Carolina. Leaders at 40 of the state's 66 charter schools—1 per school—responded to the survey. This report describes the process for developing the leadership survey and provides descriptive results of the survey.

Schilpzand, E. J., Sciberras, E., Efron, D., Anderson, V., & Nicholson, J. M. (2015). Improving survey response rates from parents in school-based research using a multi-level approach. PLoS ONE, 10(5), 1–11. Retrieved from http://dro.deakin.edu.au/eserv/DU:30079143/sciberras-improvingsurvey-2015.pdf.

From the abstract:
Background: While schools can provide a comprehensive sampling frame for community-based studies of children and their families, recruitment is challenging. Multi-level approaches which engage multiple school stakeholders have been recommended but few studies have documented their effects. This paper compares the impact of a standard versus enhanced engagement approach on multiple indicators of recruitment: parent response rates, response times, reminders required and sample characteristics.
Methods: Parents and teachers were distributed a brief screening questionnaire as a first step for recruitment to a longitudinal study, with two cohorts recruited in consecutive years (cohort 1 2011, cohort 2 2012). For cohort 2, additional engagement strategies included the use of pre-notification postcards, improved study materials, and recruitment progress graphs provided to school staff. Chi-square and t-tests were used to examine cohort differences.
Results: Compared to cohort 1, a higher proportion of cohort 2 parents responded to the survey (76% versus 69%; p < 0.001), consented to participate (71% versus 56%; p < 0.001), agreed to teacher participation (90% versus 82%; p < 0.001) and agreed to follow-up contact (91% versus 80%; p < 0.001). Fewer cohort 2 parents required reminders (52% versus 63%; p < 0.001), and cohort 2 parents responded more promptly than cohort 1 parents (mean difference: 19.4 days, 95% CI: 18.0 to 20.9, p < 0.001).
Conclusion: These results illustrate the value of investing in a relatively simple multi-level strategy to maximise parent response rates, and potentially reduce recruitment time and costs.
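To make the cohort comparison concrete, the sketch below runs the kind of chi-square test Schilpzand et al. describe, using the response rates reported in the abstract (76 percent versus 69 percent). The cohort sizes are hypothetical placeholders, since the abstract reports percentages rather than denominators, so the printed statistic is illustrative only.

    from scipy.stats import chi2_contingency

    # Hypothetical denominators; the abstract reports only percentages.
    n1, n2 = 1000, 1000
    responded1 = round(0.69 * n1)  # cohort 1: 69% responded
    responded2 = round(0.76 * n2)  # cohort 2: 76% responded

    # 2x2 table: rows = cohorts, columns = responded / did not respond.
    table = [
        [responded1, n1 - responded1],
        [responded2, n2 - responded2],
    ]

    chi2, p, dof, expected = chi2_contingency(table)
    print(f"chi-square = {chi2:.2f}, p = {p:.4f}")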

Walston, J., Redford, J., & Bhatt, M. P. (2017). Workshop on survey methods in education research: Facilitator's guide and resources (REL 2017–214). Washington, DC: U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance, Regional Educational Laboratory Midwest. Retrieved from https://eric.ed.gov/?id=ED573681.

From the abstract:
This Workshop on Survey Methods in Education Research tool consists of a facilitator guide and workshop handouts. The toolkit is intended for use by state or district education leaders and others who want to conduct training on developing and administering surveys. The facilitator guide provides materials related to various phases of the survey development process, including planning a survey, borrowing from existing surveys, writing survey items, pretesting surveys, sampling, survey administration, maximizing response rates, and measuring nonresponse bias. It also contains a section on focus groups (as part of the survey development process or as a supplementary or alternative data collection method). The materials include a sample workshop agenda, presentation handouts, activities, additional resources, and suggestions for adapting these materials to different contexts. The guide and materials were created for workshops conducted by the Regional Educational Laboratory (REL) Midwest. These workshops were developed in response to district and state education leaders in the REL Midwest Region who were interested in building agency capacity to design and conduct high-quality surveys. The following are appended: (1) Resources for the workshop on survey methods in education research; and (2) Glossary of terms and additional resources for the workshop on survey methods in education research.
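Among the workshop topics is measuring nonresponse bias. As a rough illustration of the underlying idea (not the workshop's own materials), the sketch below compares respondents with the full sampling frame on a characteristic known for every sampled unit, such as school enrollment; all names and values here are hypothetical.

    def nonresponse_bias(frame_values, responded_flags):
        """Respondent mean minus full-frame mean on a characteristic
        known for the whole sampling frame. A large gap suggests
        respondents differ systematically from nonrespondents."""
        respondents = [v for v, r in zip(frame_values, responded_flags) if r]
        if not respondents:
            raise ValueError("No respondents.")
        frame_mean = sum(frame_values) / len(frame_values)
        return sum(respondents) / len(respondents) - frame_mean

    # Hypothetical example: enrollments for six sampled schools and
    # flags for which schools returned the survey.
    enrollments = [310, 450, 120, 980, 640, 275]
    responded = [True, True, False, False, True, True]
    print(f"Bias on mean enrollment: "
          f"{nonresponse_bias(enrollments, responded):+.1f}")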

Additional State Education Agency Survey Resources

Additional Ask A REL Responses to Consult

Ask A REL Mid-Atlantic at Mathematica. (2017). What student privacy issues/rules (i.e., FERPA, N size, etc.) does an SEA, LEA and school need to abide by when reporting on survey results of students, staff and parents for accountability purposes across the state? Retrieved from https://ies.ed.gov/ncee/edlabs/regions/midatlantic/askarel_86.asp.

Ask A REL Mid-Atlantic at Mathematica. (2017). What are the experiences of districts and states that have administered school climate surveys to parents? Specifically, what level of response rate has been found in parent surveys, and what methods have been effective in securing higher response rates from parents? Retrieved from https://ies.ed.gov/ncee/edlabs/regions/midatlantic/askarel_18.asp.

Additional Organizations to Consult

UChicago Consortium for School Research: https://consortium.uchicago.edu/

From the website:
The UChicago Consortium conducts research of high technical quality that can inform and assess policy and practice in the Chicago Public Schools (CPS). We seek to expand communication among researchers, policymakers, and practitioners as we support the search for solutions to the problems of school reform. The UChicago Consortium encourages the use of research in policy action and improvement of practice, but does not argue for particular policies or programs. Rather, we help to build capacity for school reform by identifying what matters for student success and school improvement, creating critical indicators to chart progress, and conducting theory-driven evaluation to identify how programs and policies are working.

The Research Alliance for New York City Schools: https://steinhardt.nyu.edu/research_alliance/

From the website:
The Research Alliance conducts rigorous studies on topics that matter to the City's public schools. We strive to advance equity and excellence in education by providing nonpartisan evidence about policies and practices that promote students' development and academic success.

Central Comprehensive Center (C3): http://www.c3ta.org/index.php

From the website:
The Central Comprehensive Center (C3) at the University of Oklahoma is one of a national network of 22 federally funded centers. The C3 mission is to provide high quality/high impact technical assistance that helps build or expand the capacity of the state education agency (SEA), intermediary agencies, and other educational systems in Colorado, Kansas, and Missouri to implement, support, scale-up, and sustain reform efforts to improve teaching and learning.

Methods

Keywords and Search Strings

The following keywords and search strings were used to search the reference databases and other sources:

  • “State education agency” AND “customer survey”
  • “State education agency” AND survey AND (parent OR teacher OR “school leader”)
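For readers who want to reproduce searches like these programmatically rather than through the ERIC website, the sketch below issues one of the Boolean strings above against ERIC's public search API. The endpoint, parameter names, and response structure shown are our assumptions based on ERIC's Solr-style API; verify them against the current ERIC API documentation before relying on this.

    import json
    import urllib.parse
    import urllib.request

    # Second search string from the list above, in ERIC query syntax.
    query = ('"state education agency" AND survey '
             'AND (parent OR teacher OR "school leader")')

    # Assumed endpoint and parameters; check the ERIC API docs.
    url = "https://api.ies.ed.gov/eric/?" + urllib.parse.urlencode(
        {"search": query, "format": "json", "rows": 20}
    )

    with urllib.request.urlopen(url) as resp:
        data = json.load(resp)

    # Assumed Solr-style response envelope.
    for doc in data["response"]["docs"]:
        print(doc.get("id"), "-", doc.get("title"))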

Databases and Resources

We searched ERIC, a free online library of more than 1.6 million citations of education research sponsored by the Institute of Education Sciences (IES), for relevant resources. Additionally, we searched the academic database ProQuest, the academic search engine Google Scholar, and the commercial search engine Google.

Reference Search and Selection Criteria

In reviewing resources, Reference Desk researchers consider—among other things—these four factors:

  • Date of the publication: Searches cover information available within the last 10 years, except in the case of nationally known seminal resources.
  • Reference sources: IES, nationally funded, and certain other vetted sources known for strict attention to research protocols receive highest priority. Applicable resources must be publicly available online and in English.
  • Methodology: The following methodological priorities/considerations guide the review and selection of the references: (a) study types—randomized controlled trials, quasi-experiments, surveys, descriptive data analyses, literature reviews, policy briefs, etc., generally in this order; (b) target population, samples (representativeness of the target population, sample size, volunteered or randomly selected), study duration, etc.; (c) limitations, generalizability of the findings and conclusions, etc.
  • Existing knowledge base: Vetted resources (e.g., peer-reviewed research journals) are the primary focus, but the research base is occasionally slim or nonexistent. In those cases, the best resources available may include, for example, reports, white papers, guides, reviews in non-peer-reviewed journals, newspaper articles, interviews with content specialists, and organization websites.

Resources included in this document were last accessed on May 13, 2019. URLs, descriptions, and content included here were current at that time.


This memorandum is one in a series of quick-turnaround responses to specific questions posed by educational stakeholders in the Appalachian Region (Kentucky, Tennessee, Virginia, and West Virginia), which is served by the Regional Educational Laboratory Appalachia (REL AP) at SRI International. This Ask A REL response was developed by REL AP under Contract ED-IES-17-C-0004 from the U.S. Department of Education, Institute of Education Sciences, administered by SRI International. The content does not necessarily reflect the views or policies of IES or the U.S. Department of Education, nor does mention of trade names, commercial products, or organizations imply endorsement by the U.S. government.