Information on IES-Funded Research
Grant Closed

A Quantitative Synthesis of Outcomes of Educational Technology Approaches to K-12 Mathematics

NCER
Program: Education Research Grants
Program topic(s): Science, Technology, Engineering, and Mathematics (STEM) Education
Award amount: $599,966
Principal investigator: Jennifer R. Morrison
Awardee: Johns Hopkins University
Year: 2021
Award period: 3 years (07/01/2021 - 06/30/2024)
Project type: Exploration
Award number: R305A210186

Purpose

In this project, the researchers conducted a systematic review of evaluations of technology-based programs, including products, applications, and instructional strategies supported by technology, intended to improve student mathematics learning in kindergarten through grade 12 (K-12). Using rigorous meta-analytic techniques, they identified conditions under which various types of educational technology programs were more effective in teaching mathematics in grades K-12. The results from the meta-analysis provide researchers and education leaders with up-to-date information on effective uses of technology to improve learning in mathematics.

Project Activities

The researchers conducted a systematic review of experimental and quasi-experimental evaluations of educational technology programs from 2004 to 2024 that measured impacts on students’ mathematics outcomes. The meta-analysis examined (a) the overall effect of educational technology on mathematics learning, (b) the study and research features that moderate effects, and (c) the features of the programs that moderate effects.

Structured Abstract

Setting

The systematic review included evaluations of educational technology programs conducted in K-12 education settings across the U.S. and in countries whose education systems are comparable to that of the U.S., such as Canada, countries in Europe, Israel, Australia, and New Zealand.

Sample

The review included evaluations of students in grades K-12 in all types of schools in the U.S. and in countries whose education systems are comparable to that of the U.S.

Factors

The review included studies that tested a broad range of educational technology programs, including products, applications, and instructional strategies supported by technology to improve mathematics learning. The educational technologies included intelligent tutoring systems, virtual reality, media-infused instruction, computer-assisted instruction, and technology-supported cooperative learning, games, simulations, inquiry/discovery learning, and project-based learning. The researchers also examined study conditions that included characteristics of students (such as age and socioeconomic status), instructional focus (such as arithmetic, geometry, or problem solving), and type of community (urban, suburban, or rural; disadvantaged or advantaged). They also examined study design and methodological features, along with specific features of the programs evaluated.

Research design and methods

The researchers developed a database of studies from 2004 through February 2024 using four strategies to locate as many studies as possible that might meet the inclusion criteria. First, they conducted electronic searches of education databases using subject-related keywords focused on educational technology and mathematics and study-methodology keywords for experimental and quasi-experimental designs. Second, they searched beyond the electronic databases, including Google Scholar and other Internet search engines, education publisher websites, and third-party evaluator websites. Third, they examined citations from identified studies and from previous reviews of educational technology interventions, as well as studies mentioned in IES reports, in the IES What Works Clearinghouse, or in Education Innovation and Research (EIR) program reports. Fourth, the researchers contacted the authors of identified articles and publishers of educational technology interventions to help identify any remaining studies that may have been missed.
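As a rough illustration of the first search strategy, the Python sketch below assembles a Boolean query that pairs educational technology subject keywords with study-design keywords. The specific terms and the query format are illustrative assumptions, not the review’s actual search strings.

```python
# Hypothetical keyword sets; the review's actual search terms are not reproduced here.
TECH_TERMS = ["educational technology", "computer-assisted instruction",
              "intelligent tutoring system"]
DESIGN_TERMS = ["randomized", "quasi-experimental", "control group"]

def boolean_query(subject_terms, design_terms, topic="mathematics"):
    """Combine subject, topic, and study-design keywords into one Boolean search string."""
    subject = " OR ".join(f'"{t}"' for t in subject_terms)
    design = " OR ".join(f'"{t}"' for t in design_terms)
    return f'({subject}) AND {topic} AND ({design})'

print(boolean_query(TECH_TERMS, DESIGN_TERMS))
```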

To be included in the meta-analysis, studies must have been experimental or quasi-experimental evaluations of educational technology programs with impacts on mathematics outcomes. Studies also must have had a duration of at least 8 weeks. Qualifying studies were accepted from the U.S., Canada, countries in Europe, Israel, Australia, and New Zealand. Studies must have used intent-to-treat designs, meaning that all students in the original sample were included in the final analysis. All coding was done by independent pairs of reviewers, who discussed disagreements and came to consensus, involving other authors if needed.
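As a minimal sketch of how these inclusion criteria could be expressed as a screening checklist, the Python example below encodes the design, duration, location, and intent-to-treat requirements. The data structure and field names are hypothetical; in the project itself, screening and coding were done by pairs of human reviewers.

```python
from dataclasses import dataclass

# Locations listed in the inclusion criteria; "Europe" stands in for individual European countries.
ELIGIBLE_LOCATIONS = {"United States", "Canada", "Europe", "Israel", "Australia", "New Zealand"}

@dataclass
class CandidateStudy:            # hypothetical record produced during screening
    design: str                  # "RCT" (experimental) or "QED" (quasi-experimental)
    duration_weeks: int          # length of the program implementation
    location: str
    intent_to_treat: bool        # all students in the original sample analyzed
    has_math_outcome: bool       # reports a quantitative mathematics outcome

def meets_inclusion_criteria(study: CandidateStudy) -> bool:
    """Apply the review's stated inclusion criteria to one candidate study."""
    return (
        study.design in {"RCT", "QED"}
        and study.duration_weeks >= 8
        and study.location in ELIGIBLE_LOCATIONS
        and study.intent_to_treat
        and study.has_math_outcome
    )

print(meets_inclusion_criteria(CandidateStudy("RCT", 12, "Canada", True, True)))  # True
```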

The researchers conducted the meta-analysis through five key steps: (a) retrieve all potential studies, (b) screen and review studies by pre-set criteria, (c) code data and features of qualified studies, (d) compute effect sizes, and (e) implement statistical analyses.

Control condition

Included studies must have compared the educational technology program to a business-as-usual or alternative-treatment control condition.

Key measures

The key outcomes in this project were quantitative measures of mathematics learning. A single study could contribute multiple outcomes across multiple domains of mathematics. The researchers coded all eligible outcomes from each study but excluded measures of mathematics learning that were created by the study’s own researchers.

Data analytic strategy

The researchers extracted effect sizes from individual studies. If a study did not report effect sizes, the researchers calculated Cohen’s d effect sizes and converted them to Hedges’ g to account for small-sample bias. The researchers also verified studies’ reported effect sizes. Several studies included multiple acceptable outcome measures (such as formative and summative assessments) and therefore had multiple effect sizes. In such cases, the researchers calculated the weighted average effect size across measures of the same outcome type because effect sizes within a single study are likely to be correlated rather than independent. The researchers used Comprehensive Meta-Analysis (CMA) software version 4.0 to calculate effect sizes and conduct meta-analytic tests.
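To make the effect-size step concrete, here is a minimal Python sketch of the standard formulas: Cohen’s d from group means and the pooled standard deviation, the small-sample correction that yields Hedges’ g, and a within-study average over multiple outcome measures. The inverse-variance weighting and the example numbers are illustrative assumptions; the project performed its computations in the CMA software.

```python
import math

def cohens_d(mean_t, mean_c, sd_t, sd_c, n_t, n_c):
    """Standardized mean difference (Cohen's d) using the pooled standard deviation."""
    sd_pooled = math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2) / (n_t + n_c - 2))
    return (mean_t - mean_c) / sd_pooled

def hedges_g(d, n_t, n_c):
    """Apply the small-sample correction J = 1 - 3/(4*df - 1) to obtain Hedges' g."""
    df = n_t + n_c - 2
    return (1 - 3 / (4 * df - 1)) * d

def variance_g(g, n_t, n_c):
    """Approximate sampling variance of Hedges' g."""
    return (n_t + n_c) / (n_t * n_c) + g**2 / (2 * (n_t + n_c))

def within_study_average(effects):
    """Inverse-variance weighted mean of several effect sizes from one study
    (a simplified way to collapse correlated outcomes to a single value)."""
    weights = [1 / var for _, var in effects]
    return sum(w * g for (g, _), w in zip(effects, weights)) / sum(weights)

# Example: one hypothetical study reporting two mathematics outcomes
g1 = hedges_g(cohens_d(82.0, 78.0, 10.0, 11.0, 60, 58), 60, 58)
g2 = hedges_g(cohens_d(75.0, 73.5, 9.0, 9.5, 60, 58), 60, 58)
effects = [(g1, variance_g(g1, 60, 58)), (g2, variance_g(g2, 60, 58))]
print(round(within_study_average(effects), 3))  # single effect size for this study
```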

Key outcomes

The main findings of this project are as follows (Morrison, Borokhovski, Bernard, & Slavin, 2024):

  • The use of educational technology programs had a small but significant positive effect on students’ mathematics learning.
  • Educational technology programs were less effective for students from low socio-economic status backgrounds.
  • Study and research design features significantly moderated the impact of educational technology programs on students’ mathematics learning. Specifically, studies that used a quasi-experimental design, were funded by the developer of the intervention, or had smaller sample sizes showed stronger effects. 
  • Specific features of the program also significantly moderated the impact of the education technology program on students’ mathematics learning. Education technology programs that were implemented at moderate levels of intensity (30 to 75 minutes per week), blended programs that included a technology component and a print-based component, or programs that were implemented with high levels of fidelity had stronger effects on students’ mathematics learning. In addition, education technology programs that employed personalization by adapting to learners’ responses showed stronger effects on students’ mathematics learning.

People and institutions involved

Project contributors

Amanda Inns

Co-principal investigator

Stephen Pape

Co-principal investigator

Products and publications

Study registration:

https://osf.io/qhyvz/

Publications:

ERIC Citations: Find available citations in ERIC for this award here.

Select Publications:

Morrison, J. R., Borokhovski, E., Bernard, R. M., & Slavin, R. E. (2024). A quantitative synthesis of outcomes of educational technology approaches in K-12 mathematics. Center for Research and Reform in Education, The Johns Hopkins University. https://jscholarship.library.jhu.edu/handle/1774.2/70118

Available data:

Data from this project are available here: https://osf.io/qhyvz/.

Questions about this project?

To answer additional questions about this project or provide feedback, please contact the program officer.

 

Tags

Education Technology, Mathematics, Teaching


