IES Blog

Institute of Education Sciences

Guiding Principles for Successful Data Sharing Agreements

Data sharing agreements are critical to conducting research in education. They allow researchers to access data collected by state or local education agencies to examine trends, determine the effectiveness of interventions, and support agencies in their efforts to use research-based evidence in decision-making.

Yet the process for obtaining data sharing agreements with state or local agencies can be challenging and often depends on the type of data involved, state and federal laws and regulations regarding data privacy, and specific agency policies. Some agencies have a research application process and review timeline available on their websites. Others may have a more informal process for establishing such agreements. In all instances, these agreements determine how a researcher can access, use, and analyze education agency data.

What are some guiding principles for successfully obtaining data sharing agreements? 

Over several years of managing projects that require data sharing agreements, I have learned a few key principles for success. While they may seem obvious, I have witnessed data sharing agreements fall apart because one or more of these principles were not met:

  • Conduct research on a topic that is a priority for the state or local education agency. Given the time and effort agencies invest in executing a data sharing agreement and preparing data, researchers should design studies that provide essential information to the agency on a significant topic. It can be helpful to communicate exactly how and when the findings will be shared with the agency and possible actions that may result from the study findings.
  • Identify a champion within the agency. Data sharing agreements are often reviewed by some combination of program staff, legal counsel, Institutional Review Board staff, and research or data office staff. An agency staff member who champions the study can help navigate the system for a timely review and address any internal questions about the study. That champion can also help the researcher work with the agency staff who will prepare the data.
  • Be flexible and responsive. Agencies have different requirements for reviewing data sharing agreements, preparing and transferring data, securely handling data, and destroying data upon study completion. A data sharing agreement often requires some back-and-forth to finalize the terms. Researchers need to be prepared to work with their own offices and staff to meet the needs of the agency.
  • Work closely with the data office to finalize data elements and preparation. Researchers should be able to specify the sample, timeframe, data elements, and whether they require unique identifiers to merge data from multiple files. I have found it beneficial to meet with the office(s) responsible for preparing the data files in order to confirm any assumptions about the format and definitions of data elements. If the study requires data from more than one office, I recommend having a joint call to ensure that the process for pulling the data is clear and feasible to all staff involved. For example, to link student and teacher data, it might be necessary to have a joint call with the office that manages assessment data and the office that manages employment data.
  • Strive to reduce the burden on the agency. Researchers should make the process of sharing data as simple and efficient as possible for agency staff. Strategies include providing a template for the data sharing agreement, determining how data will be de-identified before they are transferred, and offering to have the agency send separate files that the researchers link themselves rather than asking agency staff to prepare a single merged file (see the sketch after this list).
  • Start early. Data sharing agreements take time, usually more than expected, so begin the process as soon as possible. I have seen some agreements executed within a month, while others have taken up to a year. A clear, jointly developed timeline can help ensure that the work starts on time.
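
To make the data-preparation and burden-reduction strategies above more concrete, here is a minimal sketch, in Python with pandas, of how a research team might link two separate agency extracts on a shared identifier and then replace that identifier with a study-specific code before analysis. The file names, column names, and hashing approach are illustrative assumptions, not any particular agency's requirements, and a real de-identification plan would follow the terms of the data sharing agreement.

    # Hypothetical sketch: link two agency extracts on a shared student ID,
    # then replace the ID with a study-specific code before analysis.
    # File and column names are assumptions for illustration only.
    import hashlib

    import pandas as pd

    # Assume the agency sends two separate extracts rather than one merged file.
    assessments = pd.read_csv("assessment_extract.csv")   # includes "student_id"
    enrollment = pd.read_csv("enrollment_extract.csv")    # includes "student_id"

    # Link the files on the shared identifier.
    linked = assessments.merge(enrollment, on="student_id", how="inner")

    # Replace the agency identifier with a salted one-way hash so the analysis
    # file no longer carries the original ID. The salt would be stored securely
    # and destroyed per the terms of the data sharing agreement.
    SALT = "study-specific-secret"

    def pseudonymize(student_id: str) -> str:
        return hashlib.sha256(f"{SALT}:{student_id}".encode()).hexdigest()[:16]

    linked["study_id"] = linked["student_id"].astype(str).map(pseudonymize)
    linked = linked.drop(columns=["student_id"])

    linked.to_csv("analysis_file_deidentified.csv", index=False)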

What resources are available on data sharing agreements?

If you are new to data sharing agreements or want to learn more about them, here are some helpful resources:

Written by Jacqueline Zweig, Ph.D., Research Scientist, Education Development Center. Dr. Zweig is the Principal Investigator on an IES-funded research grant, Impact of an Orientation Course on Online Students' Completion Rates, and this project relies on data sharing. 

IES Research Centers are Hiring

IES is seeking professionals in education-related fields to apply for open positions in our two Research Centers: the National Center for Education Research (NCER) and the National Center for Special Education Research (NCSER). The Research Centers support research focused on practices and policies that improve education outcomes and access to education opportunities. Learn more about our work here: https://ies.ed.gov/ncer/ and here: https://ies.ed.gov/ncser/

If you are even potentially interested in this sort of position, we strongly encourage you to set up a profile in USAJobs (https://www.usajobs.gov/) and to upload your information now. As you build your profile, include all relevant research experience on your resume whether acquired in a paid or unpaid position. The positions will open in USAJobs on June 24, 2019 and will close as soon as 50 applications are received, or on July 8, 2019, whichever is earlier. Getting everything in can take longer than you might expect, so please apply as soon as the positions open in USAJobs (look for vacancy numbers IES-2019-0010 and IES-2019-2011).

Revenues and Expenditures for Public Schools Rebound for Third Consecutive Year in School Year 2015–16

Revenues and expenditures per pupil for public elementary and secondary education increased in school year 2015–16 (fiscal year [FY] 2016), continuing a recent upward trend in spending on public preK–12 education. This is the third consecutive year that per pupil revenues and expenditures have increased, after adjusting for inflation, reversing three consecutive years of declines between FY 10 and FY 13. The findings come from the recently released Revenues and Expenditures for Public Elementary and Secondary School Districts: School Year 2015–16 (Fiscal Year 2016).

The national median of total revenues across all school districts was $12,953 per pupil in FY 16, reflecting an increase of 3.2 percent from FY 15, after adjusting for inflation.[1] This increase in revenues per pupil follows an increase of 2.0 percent for FY 15 and 1.6 percent for FY 14. These increases in revenues per pupil between FY 14 and FY 16 contrast with the decreases from FY 10 to FY 13. The national median of current expenditures per pupil was $10,881 in FY 16, reflecting an increase of 2.4 percent from FY 15. Current expenditures per pupil also increased in FY 15 (1.7 percent) and FY 14 (1.0 percent). These increases in median revenues and current expenditures per pupil between FY 14 and FY 16 represent a full recovery in education spending following the decreases from FY 10 to FY 13.
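
To make the constant-dollar adjustment concrete, the short sketch below illustrates the kind of calculation involved: a prior-year figure is restated in current-year dollars using a fiscal-year CPI before the percentage change is computed. The CPI values and the FY 15 revenue figure are placeholders for illustration, not the published numbers behind the estimates above.

    # Illustrative constant-dollar comparison between two fiscal years.
    # CPI values and the FY 15 revenue are placeholders, not published figures;
    # NCES uses a CPI averaged over the school fiscal year (July through June).
    cpi_fy15 = 236.0                 # hypothetical fiscal-year CPI, FY 15
    cpi_fy16 = 238.5                 # hypothetical fiscal-year CPI, FY 16

    revenue_fy15_nominal = 12400.0   # hypothetical per-pupil revenue in FY 15 dollars
    revenue_fy16 = 12953.0           # per-pupil revenue reported for FY 16

    # Express the FY 15 figure in FY 16 (constant) dollars before comparing.
    revenue_fy15_constant = revenue_fy15_nominal * (cpi_fy16 / cpi_fy15)

    # Real (inflation-adjusted) percentage change from FY 15 to FY 16.
    real_change_pct = (revenue_fy16 / revenue_fy15_constant - 1) * 100
    print(f"Real change, FY 15 to FY 16: {real_change_pct:.1f}%")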

The school district finance data can help us understand differences in funding levels across types of districts. For example, median current expenditures per pupil in independent charter school districts were lower than in noncharter and mixed charter/noncharter school districts in 21 of the 25 states able to report finance data for independent charter school districts. Three of the four states where median current expenditures were higher for independent charter school districts had policies that affected charter school spending. The new School District Finance Survey (F-33) data offer researchers extensive opportunities to investigate local patterns of revenues and expenditures and how they compare with those of other districts across the country.

By Stephen Q. Cornman, NCES; Malia Howell, Stephen Wheeler, and Osei Ampadu, U.S. Census Bureau; and Lei Zhou, Activate Research


[1] To compare figures from one year to the next, revenues are converted to constant dollars, which adjusts for inflation. Inflation adjustments use the Consumer Price Index (CPI) published by the U.S. Department of Labor, Bureau of Labor Statistics. For comparability to fiscal education data, NCES adjusts the CPI from a calendar year basis to a school fiscal year basis (July through June). See Digest of Education Statistics 2016, table 106.70, https://nces.ed.gov/programs/digest/d16/tables/dt16_106.70.asp.

Seeking your feedback on the Regional Educational Laboratory program

IES is seeking feedback about what is working well in the current Regional Educational Laboratories (REL) program, what can be improved, and the kinds of resources and services related to evidence-based practice and data use that are most needed by educators and policymakers to improve student outcomes. We are seeking comments that are practical, specific, and actionable, and that demonstrate a familiarity with the mission and work of the RELs.

We are particularly interested in responses to these questions:

  • What types of materials or tools would be helpful to educators implementing What Works Clearinghouse Practice Guide Recommendations or other evidence-based practices? Are there other ways the RELs could make research evidence more accessible for educators and administrators?
  • What types of data and research support are most needed by educators and policymakers to improve student outcomes?
  • IES believes that robust partnerships, composed of a diverse set of stakeholders, are critical to the translation and mobilization of evidence-based practices. Currently, research partnerships are a centerpiece of the REL program. What working models have you observed to be particularly effective in improving student outcomes?
  • In what ways can RELs best serve the country as well as their designated regions?

Please send feedback to NCEE.Feedback@ed.gov by September 6, 2019. 

Leading experts provide evidence-based recommendations on using technology to support postsecondary student learning

By Michael Frye and Sarah Costelloe. Both are part of the Abt Associates team working on the What Works Clearinghouse.

Technology is part of almost every aspect of college life. Colleges use technology to improve student retention, offer active and engaging learning, and help students become more successful learners. The What Works Clearinghouse’s latest practice guide, Using Technology to Support Postsecondary Student Learning, offers several evidence-based recommendations to help higher education instructors, instructional designers, and administrators use technology to improve student learning outcomes.

IES practice guides incorporate research, practitioner experience, and expert opinions from a panel of nationally recognized experts. The panel that developed Using Technology to Support Postsecondary Student Learning included five experts with many years of experience leading the adoption, use, and research of technology in postsecondary classrooms. Guided by Abt Associates’ review of the rigorous research on the topic, the guide offers five evidence-based recommendations:

Practice Recommendations:
  1. Use communication and collaboration tools to increase interaction among students and between students and instructors. (Minimal evidence)
  2. Use varied, personalized, and readily available digital resources to design and deliver instructional content. (Moderate evidence)
  3. Incorporate technology that models and fosters self-regulated learning strategies. (Moderate evidence)
  4. Use technology to provide timely and targeted feedback on student performance. (Moderate evidence)
  5. Use simulation technologies that help students engage in complex problem-solving. (Minimal evidence)

Each recommendation is assigned an evidence level of minimal, moderate, or strong. The level of evidence reflects how well the research demonstrates the effectiveness of the recommended practices. For an explanation of how levels of evidence are determined, see the Practice Guide Level of Evidence Video. The evidence-based recommendations also include research-based strategies and examples for implementation in postsecondary settings. Together, the recommendations highlight five interconnected themes that the practice guide’s authors suggest readers consider:

  • Focus on how technology is used, not on the technology itself.

“The basic act of teaching has actually changed very little by the introduction of technology into the classroom,” said panelist MJ Bishop, “and that’s because simply introducing a new technology changes nothing unless we first understand the need it is intended to fill and how to capitalize on its unique capabilities to address that need.” Because technology evolves rapidly, understanding specific technologies is less important than understanding how technology can be used effectively in college settings. “By understanding how a learning outcome can be enhanced and supported by technologies,” said panelist Jennifer Sparrow, “the focus stays on the learner and their learning.”

  • Technology should be aligned to specific learning goals.

Every recommendation in this guide is based on one idea: finding ways to use technology to engage students and enhance their learning experiences. Technology can engage students more deeply in learning content, activate their learning processes, and provide the social connections that are key to succeeding in college and beyond. To do this effectively, any use of technology suggested in this guide must be aligned with learning goals or objectives. “Technology is not just a tool,” said Panel Chair Nada Dabbagh. “Rather, technology has specific affordances that must be recognized to use it effectively for designing learning interactions. Aligning technology affordances with learning outcomes and instructional goals is paramount to successful learning designs.”

  • Pay attention to potential issues of accessibility.

The Internet is ubiquitous, but many households—particularly low-income households and those of recent immigrants and in rural communities—may not be able to afford or otherwise access digital communications. Course materials that rely heavily on Internet access may put these students at a disadvantage. “Colleges and universities making greater use of online education need to know who their students are and what access they have to technology,” said panelist Anthony Picciano. “This practice guide makes abundantly clear that colleges and universities should be careful not to be creating digital divides.”

Instructional designers must also ensure that learning materials on course websites and course/learning management systems can accommodate students who are visually and/or hearing impaired. “Technology can greatly enhance access to education both in terms of reaching a wide student population and overcoming location barriers and in terms of accommodating students with special needs,” said Dabbagh. “Any learning design should take into consideration the capabilities and limitations of technology in supporting a diverse and inclusive audience.”

  • Technology deployments may require significant investment and coordination.

Implementing any new intervention takes training and support from administrators and teaching and learning centers. That is especially true in an environment where resources are scarce. “In reviewing the studies for this practice guide,” said Picciano, “it became abundantly clear that the deployment of technology in our colleges and universities has evolved into a major administrative undertaking. Careful planning that is comprehensive, collaborative, and continuous is needed.”

“Hardware and software infrastructure, professional development, academic and student support services, and ongoing financial investment are testing the wherewithal of even the most seasoned administrators,” said Picciano. “Yet the dynamic and changing nature of technology demands that new strategies be constantly evaluated and modifications made as needed.”

These decisions are never easy. “Decisions need to be made,” said Sparrow, “about investment cost versus opportunity cost. Additionally, when a large investment in a technology has been made, it should not be without investment in faculty development, training, and support resources to ensure that faculty, staff, and students can take full advantage of it.”

  • Rigorous research is limited and more is needed.

Despite technology’s ubiquity in college settings, rigorous research on the effects of technological interventions on student outcomes is rather limited. “It’s problematic,” said Bishop, “that research in the instructional design/educational technology field has been so focused on things, such as technologies, theories, and processes, rather than on the problems we’re trying to solve with those things, such as developing critical thinking, enhancing knowledge transfer, and addressing individual differences. It turns out to be very difficult to cross-reference the instructional design/educational technology literature with the questions the broader field of educational research is trying to answer.”

More rigorous research is needed on new technologies and how best to support instructors and administrators in using them. “For experienced researchers as well as newcomers,” said Picciano, “technology in postsecondary teaching and learning is a fertile ground for further inquiry and investigation.”

Readers of this practice guide are encouraged to adapt the advice provided to the varied contexts in which they work. The five themes discussed above serve as a lens to help readers approach the guide and decide whether and how to implement some or all of the recommendations.

Download Using Technology to Support Postsecondary Student Learning from the What Works Clearinghouse website at https://ies.ed.gov/ncee/wwc/PracticeGuide/25.