Ask A REL Response

September 2018

Question

What research has been conducted on data-driven differentiated instruction?

Response

Following an established REL Southeast research protocol, we conducted a search for research reports and descriptive study articles on data-driven differentiated instruction. We focused on identifying resources that specifically addressed data-driven differentiated instruction. The sources included ERIC and other federally funded databases and organizations, research institutions, academic research databases, and general Internet search engines. (For details, please see the Methods section at the end of this memo.)

We have not evaluated the quality of the references and resources provided in this response; we offer them only for your reference. The references are listed in alphabetical order, not necessarily in order of relevance. In addition, although the references in this response were drawn from the most commonly used research sources, the list is not comprehensive, and other relevant references and resources may exist.

Research References

  1. Cordray, D. S., Pion, G. M., Brandt, C., & Molefe, A. (2013). The impact of the Measures of Academic Progress (MAP) program on student reading achievement. Society for Research on Educational Effectiveness. http://eric.ed.gov/?id=ED564093
    From the abstract: "One of the most widely used commercially available systems incorporating benchmark assessment and training in differentiated instruction is the Northwest Evaluation Association's (NWEA) Measures of Academic Progress (MAP) program. The MAP program involves two components: (1) computer-adaptive assessments administered to students three to four times per year, and (2) teacher training and access to MAP resources on how to use data from these assessments to differentiate instruction. This report focuses on the program's impact after the second year of implementation, and seeks to answer the following questions on implementation fidelity and student outcomes: (1) Were MAP resources (training, consultation, web-based materials) delivered by NWEA and received and used by teachers as planned?; (2) Did MAP teachers apply differentiated instructional practices in their classes to a greater extent than their control counterparts?; (3) Did the MAP program (that is, training plus benchmark testing feedback) affect the reading achievement of grades 4 and 5 students after the second year of implementation, as measured by the Illinois Standards Achievement Test (ISAT) reading scale scores or the MAP composite test scores in reading and language use?; and (4) Were there variations in the impacts of the MAP intervention on grades 4 or 5 ISAT reading and MAP composite scores across subgroups of students after the second year of implementation? The study focused on grade 4 and 5 students in 32 public elementary schools across five districts in Illinois. The differential impact among low and high ability students at grade 4 suggests that the MAP program may have the greatest impact on low and high ability students."
  2. Faber, J. M., Glas, C. A. W., & Visscher, A. J. (2018). Differentiated instruction in a data-based decision-making context. School Effectiveness and School Improvement, 29(1), 43-63. http://eric.ed.gov/?id=EJ1170495
    From the abstract: "In this study, the relationship between differentiated instruction, as an element of data-based decision making, and student achievement was examined. Classroom observations (n = 144) were used to measure teachers' differentiated instruction practices and to predict the mathematical achievement of 2nd- and 5th-grade students (n = 953). The analysis of classroom observation data was based on a combination of generalizability theory and item response theory, and student achievement effects were determined by means of multilevel analysis. No significant positive effects were found for differentiated instruction practices. Furthermore, findings showed that students in low-ability groups profited less from differentiated instruction than students in average or high-ability groups. Nevertheless, the findings, data collection, and data-analysis procedures of this study contribute to the study of classroom observation and the measurement of differentiated instruction."
  3. Maeng, J. L. (2017). Using technology to facilitate differentiated high school science instruction. Research in Science Education, 47(5), 1075-1099. http://eric.ed.gov/?id=EJ1152920
    From the abstract: "This qualitative investigation explored the beliefs and practices of one secondary science teacher, Diane, who differentiated instruction and studied how technology facilitated her differentiation. Diane was selected based on the results of a previous study, in which data indicated that Diane understood how to design and implement proactively planned, flexible, engaging instructional activities in response to students' learning needs better than the other study participants. Data for the present study included 3 h of semi-structured interview responses, 37.5 h of observations of science instruction, and other artifacts such as instructional materials. This variety of data allowed for triangulation of the evidence. Data were analyzed using a constant comparative approach. Results indicated that technology played an integral role in Diane's planning and implementation of differentiated science lessons. The technology-enhanced differentiated lessons employed by Diane typically attended to students' different learning profiles or interest through modification of process or product. This study provides practical strategies for science teachers beginning to differentiate instruction, and recommendations for science teacher educators and school and district administrators. Future research should explore student outcomes, supports for effective formative assessment, and technology-enhanced readiness differentiation among secondary science teachers."
  4. Park, V., & Datnow, A. (2017). Ability grouping and differentiated instruction in an era of data-driven decision making. American Journal of Education, 123(2), 281-306. http://eric.ed.gov/?id=EJ1129300
    From the abstract: "Despite data-driven decision making being a ubiquitous part of policy and school reform efforts, little is known about how teachers use data for instructional decision making. Drawing on data from a qualitative case study of four elementary schools, we examine the logic and patterns of teacher decision making about differentiation and ability grouping. We find that district and school policies conditioned teachers' decision making through mandated time for instructional differentiation, curricular tools, and online program adoption. Educators used various strategies reflecting different logics, types of data used, and sources of decision making. Implications for theory and research are discussed."
  5. Schifter, C. C., Natarajan, U., Ketelhut, D. J., & Kirchgessner, A. (2014). Data-driven decision making: Facilitating teacher use of student data to inform classroom instruction. Contemporary Issues in Technology and Teacher Education (CITE Journal), 14(4), 419-432. http://eric.ed.gov/?id=EJ1058372
    From the abstract: "Data-driven decision making is essential in K-12 education today, but teachers often do not know how to make use of extensive data sets. Research shows that teachers are not taught how to use extensive data (i.e., multiple data sets) to reflect on student progress or to differentiate instruction. This paper presents a process used in an National Science Foundation (NSF) funded project to help middle-grade science teachers use elaborate and diverse data from virtual environment game modules designed for assessment of science inquiry. The NSF-funded project dashboard is presented, along with results showing promise for a model of training teachers to use data from the dashboard and data-driven decision making principles, to identify science misunderstandings, and to use the data to design lesson options to address those misunderstandings."
  6. Tomlinson, C. A., Brighton, C., Hertberg, H., Callahan, C. M., Moon, T. R., Brimijoin, K., Conover, L. A., & Reynolds, T. (2003). Differentiating instruction in response to student readiness, interest, and learning profile in academically diverse classrooms: A review of literature. Journal for the Education of the Gifted, 27(2-3), 119–145. http://eric.ed.gov/?id=EJ787917
    From the abstract: "Both the current school reform and standards movements call for enhanced quality of instruction for all learners. Recent emphases on heterogeneity, special education inclusion, and reduction in out-of-class services for gifted learners, combined with escalations in cultural diversity in classrooms, make the challenge of serving academically diverse learners in regular classrooms seem an inevitable part of a teacher's role. Nonetheless, indications are that most teachers make few proactive modifications based on learner variance. This review of literature examines a need for 'differentiated' or academically responsive instruction. It provides support in theory and research for differentiating instruction based on a model of addressing student readiness, interest, and learning profile for a broad range of learners in mixed-ability classroom settings."
  7. Wachen, J., Harrison, C., & Cohen-Vogel, L. (2018). Data use as instructional reform: Exploring educators' reports of classroom practice. Leadership and Policy in Schools, 17(2), 296-325. http://eric.ed.gov/?id=EJ1179446
    From the abstract: "Through policies like No Child Left Behind, the federal government incentivized the use of student performance data as a core strategy for improving student achievement. The assumption behind these efforts is that data will be used to guide teacher practice and promote high-quality instruction. This study examined how teachers describe using data in their instructional practices. Findings reveal that few teachers were able to articulate an ability to bridge the divide between using data to identify students in need of help and using data to modify instruction. We discuss factors that supported or impeded educators' use of data."

Methods

Keywords and Search Strings
The following keywords and search strings were used to search the reference databases and other sources:

  • data driven differentiated instruction
  • data analysis, instructional strategies

Databases and Resources
We searched ERIC for relevant resources. ERIC is a free online library of over 1.6 million citations of education research, sponsored by the Institute of Education Sciences. Additionally, we searched Google Scholar and PsycINFO.
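For readers who wish to reproduce or extend this search programmatically, the sketch below shows how one of the keyword strings listed above could be submitted to ERIC's public search API. It is a minimal illustration, assuming the publicly documented endpoint at https://api.ies.ed.gov/eric/ and a Solr-style JSON response; the requested field names are illustrative and should be checked against the current API documentation. This is not part of the protocol described in this memo.

```python
# Minimal sketch: querying ERIC for one of the keyword strings used in this memo.
# Assumptions (not from the memo): the public ERIC API at https://api.ies.ed.gov/eric/
# returns Solr-style JSON with a "response" -> "docs" list; field names are illustrative.
import requests

ERIC_API = "https://api.ies.ed.gov/eric/"

def search_eric(query, rows=20):
    """Return a list of ERIC records matching a keyword string."""
    params = {
        "search": query,      # keyword string, e.g. "data driven differentiated instruction"
        "format": "json",     # response format
        "rows": rows,         # number of records to return
        "fields": "id,title,author,publicationdateyear",  # illustrative field list
    }
    resp = requests.get(ERIC_API, params=params, timeout=30)
    resp.raise_for_status()
    return resp.json().get("response", {}).get("docs", [])

if __name__ == "__main__":
    for doc in search_eric("data driven differentiated instruction"):
        print(doc.get("publicationdateyear"), doc.get("id"), "-", doc.get("title"))
```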

Reference Search and Selection Criteria

In searching and reviewing resources, we considered the following criteria:

  • References and resources published in the last 15 years, from 2003 to the present, were included in the search and review.
  • Search priorities of reference sources: Priority was given to study reports, briefs, and other documents published and/or reviewed by IES and other federal or federally funded organizations, and to academic databases, including ERIC, the EBSCO databases, JSTOR, PsycINFO, PsycARTICLES, and Google Scholar.
  • Methodology: The following methodological priorities/considerations were applied in the review and selection of the references: (a) study types (randomized controlled trials, quasi-experiments, surveys, descriptive data analyses, literature reviews, policy briefs, etc.), generally in this order; (b) target population and samples (representativeness of the target population, sample size, volunteered or randomly selected, etc.) and study duration; and (c) limitations and generalizability of the findings and conclusions.

This memorandum is one in a series of quick-turnaround responses to specific questions posed by educational stakeholders in the Southeast Region (Alabama, Florida, Georgia, Mississippi, North Carolina, and South Carolina), which is served by the Regional Educational Laboratory Southeast at Florida State University. This memorandum was prepared by REL Southeast under a contract with the U.S. Department of Education’s Institute of Education Sciences (IES), Contract ED-IES-17-C-0011, administered by Florida State University. Its content does not necessarily reflect the views or policies of IES or the U.S. Department of Education nor does mention of trade names, commercial products, or organizations imply endorsement by the U.S. Government.