Project Activities
The researchers conducted a systematic review of experimental and quasi-experimental evaluations of educational technology programs from 2004 to 2024 that measured impacts on students’ mathematics outcomes. The meta-analysis examined (a) the overall effect of educational technology on mathematics learning, (b) the study and research features that moderated effects, and (c) the program features that moderated effects.
Structured Abstract
Setting
The systematic review included evaluations of educational technology programs conducted in K–12 education settings across the U.S. and in countries with education systems comparable to the U.S., such as Canada, countries in Europe, Israel, Australia, and New Zealand.
Sample
The review included evaluations of students in grades K–12 in all types of schools in the U.S. and in countries with education systems comparable to the U.S.
The review included studies that tested a broad range of educational technology programs, including products, applications, and instructional strategies supported by technology to improve mathematics learning. The educational technologies included intelligent tutoring systems; virtual reality; media-infused instruction; computer-assisted instruction; and technology-supported cooperative learning, games, simulations, inquiry/discovery learning, and project-based learning. The researchers also examined study conditions, including characteristics of students (such as age and socioeconomic status), instructional focus (arithmetic, geometry, problem solving, etc.), and type of community (urban, suburban, or rural; disadvantaged or advantaged). They also examined study design and methodological features, along with specific features of the programs evaluated.
Research design and methods
The researchers developed a database of studies from 2004 through February 2024 using four strategies to locate as many studies as possible that might meet the inclusion criteria. First, they conducted electronic searches of education databases using subject-related keywords focused on educational technology and mathematics and methodology-related keywords for experimental and quasi-experimental designs. Second, they searched beyond the electronic databases, including Google Scholar and other Internet search engines, education publisher websites, and third-party evaluator websites. Third, they examined citations from identified studies and from previous reviews of educational technology interventions, as well as studies mentioned in IES reports, in the IES What Works Clearinghouse, or in Education Innovation and Research (EIR) program reports. Fourth, they contacted the authors of identified articles and publishers of educational technology interventions to help identify any remaining studies that may have been missed.
To be included in the meta-analysis, studies must have been experimental or quasi-experimental evaluations of educational technology programs with impacts on mathematics outcomes. Studies also must have had a duration of at least 8 weeks. Qualifying studies were accepted from the U.S., Canada, countries in Europe, Israel, Australia, and New Zealand. Studies must have used intent-to-treat designs, meaning that all students in the original sample were included in the final analysis. All coding was done by independent pairs of reviewers, who discussed disagreements and came to consensus, involving other authors if needed.
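As an illustration only, the inclusion screen described above can be sketched as a simple predicate. The field names below are hypothetical and are not taken from the project's actual codebook; this is a sketch of the stated criteria, not the review team's software.

```python
# Countries stated as qualifying; European countries are handled via a region flag.
QUALIFYING_COUNTRIES = {"US", "Canada", "Israel", "Australia", "New Zealand"}


def meets_inclusion_criteria(study: dict) -> bool:
    """Screen one candidate study against the review's pre-set criteria.

    `study` is a dict with hypothetical codebook fields:
    design, measures_math_outcomes, duration_weeks, country, region,
    intent_to_treat.
    """
    return (
        study["design"] in {"experimental", "quasi-experimental"}
        and study["measures_math_outcomes"]          # must report math outcomes
        and study["duration_weeks"] >= 8             # at least 8 weeks long
        and (study["country"] in QUALIFYING_COUNTRIES
             or study["region"] == "Europe")         # qualifying location
        and study["intent_to_treat"]                 # full original sample analyzed
    )
```

In practice each criterion was applied by pairs of human reviewers rather than automatically; the predicate simply makes the conjunction of criteria explicit.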
The researchers conducted the meta-analysis through five key steps: (a) retrieve all potential studies, (b) screen and review studies by pre-set criteria, (c) code data and features of qualified studies, (d) compute effect sizes, and (e) implement statistical analyses.
Control condition
Included studies must have included a business-as-usual or alternative treatment condition.
Key measures
The key outcomes in this project were quantitative measures of mathematics learning. A single study could contribute multiple outcomes across multiple domains of mathematics. The researchers coded all outcomes from each study for inclusion but excluded measures of mathematics learning that were created by the study’s own researchers.
Data analytic strategy
The researchers extracted effect sizes from individual studies. If a study did not report an effect size, the researchers calculated Cohen’s d and converted it to Hedges’ g to account for small-sample bias. The researchers also verified the effect sizes that studies reported. Several studies included multiple acceptable outcome measures (such as formative and summative assessments) and therefore yielded multiple effect sizes. In such cases, the researchers calculated the weighted average effect size across measures of the same outcome type, because effect sizes within a single study are likely to be correlated rather than independent. The researchers used Comprehensive Meta-Analysis (CMA) software version 4.0 to calculate effect sizes and conduct meta-analytic tests.
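The two computations described above can be sketched in a few lines. This is a minimal illustration, not the review team's actual code: the function names are my own, and the sample-size weighting in the within-study average is an assumption (the abstract does not specify the weighting scheme, and CMA handles this internally).

```python
import math


def hedges_g(m1: float, sd1: float, n1: int,
             m2: float, sd2: float, n2: int) -> float:
    """Cohen's d from two-group summary statistics, corrected to Hedges' g."""
    # Pooled standard deviation across treatment and control groups
    sd_pooled = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2)
                          / (n1 + n2 - 2))
    d = (m1 - m2) / sd_pooled
    # Hedges' small-sample correction factor J
    j = 1 - 3 / (4 * (n1 + n2 - 2) - 1)
    return d * j


def within_study_average(gs: list[float], ns: list[int]) -> float:
    """Collapse correlated effect sizes from one study into a single
    estimate, here weighted by sample size (an assumed weighting)."""
    return sum(g * n for g, n in zip(gs, ns)) / sum(ns)


# Example: a treatment group scoring 105 (SD 15, n 30) vs. a control
# group scoring 100 (SD 15, n 30) yields g of about 0.33.
g = hedges_g(105, 15, 30, 100, 15, 30)
```

Collapsing to one effect size per study per outcome type keeps each study from being counted multiple times in the pooled estimate, which is the independence concern the passage describes.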
Key outcomes
The main findings of this project are as follows (Morrison, Borokhovski, Bernard, & Slavin, 2024):
- The use of educational technology programs had a small but significant positive effect on students’ mathematics learning.
- Educational technology programs were less effective for students from low socio-economic status backgrounds.
- Study and research design features significantly moderated the impact of educational technology programs on students’ mathematics learning. Specifically, studies that used a quasi-experimental design, were funded by the developer of the intervention, or had smaller sample sizes showed stronger effects.
- Specific program features also significantly moderated the impact of educational technology programs on students’ mathematics learning. Programs implemented at moderate intensity (30 to 75 minutes per week), blended programs that combined a technology component with a print-based component, and programs implemented with high fidelity had stronger effects on students’ mathematics learning. In addition, educational technology programs that employed personalization by adapting to learners’ responses showed stronger effects.
Products and publications
ERIC Citations: Available citations for this award can be found in ERIC.
Select Publications:
Morrison, J. R., Borokhovski, E., Bernard, R. M., & Slavin, R. E. (2024). A quantitative synthesis of outcomes of educational technology approaches in K-12 mathematics. Center for Research and Reform in Education, The Johns Hopkins University.
Available data:
Data from this project are available at https://osf.io/qhyvz/.
Supplemental information
Co-Principal Investigators: Inns, Amanda; Pape, Stephen
Questions about this project?
To answer additional questions about this project or provide feedback, please contact the program officer.