REL Central Ask A REL Response

Data Use

January 2016


What does the research say about data-driven instruction?


Following an established REL Central research protocol, we conducted a search for research reports as well as descriptive study articles to help answer the question. The sources included ERIC and other federally funded databases and organizations, research institutions, academic research databases, and general Internet search engines. (For details, please see the methods section at the end of this memo.)

We have not evaluated the quality of the references and resources provided in this response; we offer them for your information only. In addition, although we searched the most commonly used research resources, our search was not comprehensive, and other relevant references and resources may exist.

Research References

Carlson, D., Borman, G. D., & Robinson, M. (2011). A multistate district-level cluster randomized trial of the impact of data-driven reform on reading and mathematics achievement. Educational Evaluation and Policy Analysis, 33(3), 378–398. Retrieved from
Full text available

From the abstract:

“Analyzing mathematics and reading achievement outcomes from a district-level random assignment study fielded in over 500 schools within 59 school districts and seven states, the authors estimate the 1-year impacts of a data-driven reform initiative implemented by the Johns Hopkins Center for Data-Driven Reform in Education (CDDRE). CDDRE consultants work with districts to implement quarterly student benchmark assessments and provide district and school leaders with extensive training on interpreting and using the data to guide reform. Relative to a control condition, in which districts operated as usual without CDDRE services, the data-driven reform initiative caused statistically significant districtwide improvements in student mathematics achievement. The CDDRE intervention also had a positive effect on reading achievement, but the estimates fell short of conventional levels of statistical significance.”

Cavalluzzo, L., Geraghty, T. M., Steele, J. L., Holian, L., Jenkins, F., Alexander, J. M., & Yamasaki, K. Y. (2014). Using data to inform decisions: How teachers use data to inform practice and improve student performance in mathematics. Results from a randomized experiment of program efficacy. Arlington, VA: CNA Corporation. Retrieved from

From the abstract:

“The purpose of this study is to evaluate, using a randomized experimental design, the efficacy of TERC’s “Using Data” program to change teacher behavior and improve student learning outcomes. The “Using Data” intervention provides professional development and technical assistance to teachers to help them use data collaboratively to identify and solve systemic student learning problems. The intervention was implemented by school-based data teams composed of a designated data coach and four grade 4 and 5 math teachers in Duval County Public Schools, a large, urban school district serving Jacksonville, Florida, during school years 2011-12 and 2012-13. In the first year of the study, teachers in the treatment group participated in professional development events and technical assistance sessions that exposed them to “Using Data” and helped them implement its processes. In the second year the teachers received additional assistance with implementation. Sixty (60) schools were recruited to participate in the study. The authors conclude that the weight of the evidence presented in this report indicates that “Using Data” improves teachers’ outcomes after one year, and improves the outcomes of their students in high-needs schools after two years.”

Faria, A., Heppen, J., Li, Y., Stachel, S., Jones, W., Sawyer, K., Thomsen, K., Kutner, M., Miser, D., Lewis, S., Casserly, M., Simon, C., Uzzell, R., Corcoran, A., & Palacios, M. (2012). Charting success: Data use and student achievement in urban schools (Abstract). Washington, DC: Council of the Great City Schools. Retrieved from

From the introduction:

“In October 2008, the Council of the Great City Schools and American Institutes for Research (AIR) launched a project funded by The Bill & Melinda Gates Foundation that focused on understanding the use of interim assessment data as a lever for instructional improvement. The goals of this project were to (1) document and understand current interim assessment data-use practices in urban school districts and (2) to test the links between data-use practices and perceptions and student achievement. This abstract is a summary of the report that focused on the second objective: examining the empirical relationships between teacher- and school-level data use and student achievement in mathematics and reading in a study conducted in four geographically varied urban school districts. By examining the extent to which certain data-use practices are related to student achievement, this study expands on the existing body of literature on the use of interim assessments to drive instructional improvement. The full report, Charting Success: Data Use and Student Achievement in Urban Schools, can be found at, along with its companion pieces, Using Data to Improve Instruction in the Great City Schools: Documenting Current Practices and Using Data to Improve Instruction in the Great City Schools: Key Dimensions of Practice.”

Hamilton, L., Halverson, R., Jackson, S., Mandinach, E., Supovitz, J., & Wayman, J. (2009). Using student achievement data to support instructional decision making (NCEE 2009-4067). Washington, DC: National Center for Education Evaluation and Regional Assistance, Institute of Education Sciences, U.S. Department of Education. Retrieved from

From the abstract:

“This guide offers five recommendations to help educators effectively use data to monitor students’ academic progress and evaluate instructional practices. The guide recommends that schools set a clear vision for schoolwide data use, develop a data-driven culture, and make data part of an ongoing cycle of instructional improvement. The guide also recommends teaching students how to use their own data to set learning goals.”

Kirkup, C., Sizmur, J., Sturman, L., & Lewis, K. (2005). Schools’ use of data in teaching and learning (Research Report RR 671). Berkshire, UK: National Foundation for Educational Research. Retrieved from

From the introduction:

“The National Foundation for Educational Research (NFER) was commissioned by the Department for Education and Skills (DfES) to conduct a study of primary, secondary and special maintained schools in England to assess the use of data in teaching and learning. The aims of the study were: to identify how data is used to promote learning in primary, middle, secondary and special maintained schools in England; to identify good practice in the effective use of data to promote learning; to investigate possible challenges to using data of this nature; to provide recommendations to support school staff in making effective use of data to promote learning, including the future development of the Pupil Achievement Tracker (PAT). The ethos or assumption underlying the provision of data to schools is that such information leads to improvements in performance. This study gathered evidence as to the impact of the use of data in schools and the perceptions of users as to how successful this has been in raising attainment.”

Ryan, M. M. (2014). The impact collaborative data analysis has on student achievement and teacher practice in high school mathematics classrooms in suburban school districts in the Mid-West Region of New York (Doctoral dissertation). Pittsford, NY: St. John Fisher College. Retrieved from

From the abstract:

“This research study is an examination of ongoing collaborative data analysis among educators and the potential impact it has on instructional improvement as well as student achievement. Collaborative data driven decision making has been identified in theory and research as a promising model for continuous school improvement yet districts, schools and teachers are hesitant to change traditional practices (DuFour, Eaker & DuFour, 2005; Gruenert, 2005; Steele & Boudett, 2008). The purpose of this study was to reveal how integrating formative and summative assessments, collecting and analyzing data, and collaborating as teams expands teacher understanding of data driven decision making and leads to improved teaching practices. A mixed methods research design was chosen for this study to better understand the research problem by triangulating numeric trends from quantitative data and the detail of qualitative data. A quasi-experimental approach was used to measure the relationship between collaborative data analysis and student achievement, as well as the progress a school is making with the implementation of data-driven instruction and assessment. At the same time, interviews were conducted to explore teachers’ views on the implementation and effectiveness of collaborative data analysis with respect to their instructional practices and student learning. The findings suggest that when teachers are provided structured time within the school day, meaningful collaborative data analysis that leads to instructional adjustments and targeted student interventions can occur. The need for additional research studies that investigate grade level or content area collaborative inquiry teams’ impact on student performance based on both formative and summative assessments was identified.”

Slavin, R. E., Cheung, A., Holmes, G., Madden, N. A., & Chamberlain, A. (2011). Effects of a data-driven district reform model on state assessments. Baltimore, MD: Success for All Foundation. Retrieved from

From the abstract:

“A district-level reform model created by the Center for Data-Driven Reform in Education (CDDRE) provided consultation with district leaders on strategic use of data and selection of proven programs. 59 districts in seven states were randomly assigned to start CDDRE services either immediately or one year later. In addition, individual schools in each participating district were matched with control schools. Few important differences on state tests were found 1 and 2 years after CDDRE services began. The randomized design found positive effects on reading and math in fifth and eighth grade by Year 4. In the matched evaluation, positive, significant effects were seen on reading scores for fifth and eighth graders in Years 3 and 4. Effects were much larger for schools that selected proven programs than for those that did not.”

Sun, J., Przybylski, R., & Johnson, B. J. (2016). A review of research on teachers’ use of student data: from the perspective of school leadership. Educational Assessment, Evaluation and Accountability, 28(1), 5–33. Retrieved from
Full text available

From the abstract:

“Despite the increased worldwide acknowledgment of the importance of teachers’ use of formative and/or summative assessment data to improve teaching and learning, empirical research on its impacts on student learning is sparse. Even more so is the lack of studies on the best ways for school leaders to develop teachers’ capacity. Teachers generally have low efficacy in using student data to inform their day-to-day instructions. Teachers lack the basic skills to understand, interpret, and analyze data, develop instructional strategies based on data, and implement research-based instructional strategies in classrooms to address the weaknesses reflected from data analysis results. Any gap in this chain of instructional actions would lead to ineffective teaching and learning in classrooms. This study synthesizes research located from on-line databases on teachers’ data use conducted in the last 14 years and examines the nature, impacts, and shapers of teachers’ use of student formative and/or summative assessment data to improve teaching and learning. This review provides a much-needed guide to school leaders and policy makers in the USA, as well to other jurisdictions that want to make evidence-based decisions in the hopes of improving student learning and teachers’ capacity in data use.”

Additional Organizations to Consult

WestEd, Data for Decisions Research:

From the website:

“The Data for Decisions Initiative (DDI) seeks to help education stakeholders–including educators, policymakers, and researchers–access solution-driven tools, resources, and research to inform their practice and develop a better understanding of how high-quality data use can successfully inform teaching and learning.

This page offers foundational research in the field–that is, research that focuses on the critical components of data for decision making, such as data systems, data teams, data coaches, creating a data culture, vision, and the need for leadership. It seeks to understand data for decision making and data use in education.”


Keywords and Strings

The following keywords and search strings were used to search the reference databases and other sources:

  • Impact data driven instruction
  • Student data math achievement
  • Use of student data math

Databases and Resources

We searched ERIC for relevant resources. ERIC is a free online library of over 1.6 million citations of education research sponsored by the Institute of Education Sciences. Additionally, we searched Google Scholar and Google.

Reference Search and Selection Criteria

When searching and reviewing resources, we considered the following criteria:

  • Date of the Publication: References and resources published in the last 7 years, from 2010 to present, were included in the search and review.
  • Search Priorities of Reference Sources: Search priority is given to study reports, briefs, and other documents that are published and/or reviewed by IES and other federal or federally funded organizations. ERIC is the next priority, followed by academic databases, including EBSCO, JSTOR, PsycINFO, PsycARTICLES, and Google Scholar.
  • Methodology: The following methodological priorities/considerations were used in the review and selection of the references: (a) currency of available data; (b) study types (randomized controlled trials, quasi-experiments, surveys, descriptive data analyses, literature reviews, policy briefs, etc.), generally in this order; and (c) target population and samples (representativeness of the target population, sample size, volunteered or randomly selected, etc.), study duration, etc.

This memorandum is one in a series of quick-turnaround responses to specific questions posed by educational stakeholders in the Central Region (Colorado, Kansas, Missouri, Nebraska, North Dakota, South Dakota, Wyoming), which is served by the Regional Educational Laboratory Central at Marzano Research. This memorandum was prepared by REL Central under a contract with the U.S. Department of Education’s Institute of Education Sciences (IES), Contract ED-IES-17-C-0005, administered by Marzano Research. Its content does not necessarily reflect the views or policies of IES or the U.S. Department of Education nor does mention of trade names, commercial products, or organizations imply endorsement by the U.S. Government.