
IES Grant

Title: Efficacy of ASSISTments Online Homework Support for Middle School Mathematics Learning: A Replication Study
Center: NCER Year: 2017
Principal Investigator: Feng, Mingyu Awardee: WestEd
Program: Teaching, Teachers, and the Education Workforce
Award Period: 4 years (09/01/2017 - 08/31/2021) Award Amount: $3,298,853
Type: Efficacy and Replication Award Number: R305A170641

Previous Award Number: R305A170137
Previous Awardee: SRI International

Co-Principal Investigators: Heffernan, Neil; Roschelle, Jeremy; Murphy, Robert

Purpose: The purpose of the project was to conduct a replication study of the impact of a fully developed, widely adopted intervention called ASSISTments on student mathematics outcomes. Findings from a previous IES-funded efficacy study, conducted in Maine, indicated that this intervention led to beneficial impacts on student learning outcomes. The current study examined the impacts of this intervention with a more diverse sample and relied on mathematics coaches (instead of the intervention developers) to provide professional development and support to teachers. These differences were important to policymakers and educators who wanted to know whether interventions work, whether they contribute to closing achievement gaps, and whether they can be supported sufficiently through local teacher professional development and local implementation conditions and practices.

Project Activities: This intervention included (1) a web-based platform that provides support to students as they solve mathematics problems from the district-assigned textbooks and provides detailed student-level and class-level formative assessment data to teachers to help inform adjustments in classroom instruction and pacing and (2) professional development to support teacher use of the reports as a formative assessment tool. Schools were randomly assigned to either the treatment or control condition and participated for 2 full school years. Data on implementation fidelity and contrast between conditions were collected in both years.

Key Outcomes: Key outcomes will be included once findings from the replication study are published.

Structured Abstract

Setting: Participating schools were located in rural, suburban, and urban districts across North Carolina. The sample included low-performing schools and students from lower-socioeconomic status backgrounds.

Sample: The project originally recruited 63 schools from 41 different districts and randomly assigned them to condition (32 treatment schools and 31 control schools). The schools were distributed across rural, town, suburban, and city communities (33 rural, 11 town, 8 suburban, and 11 city). Of the 63 schools, 18 were charter schools, 45 were public schools, and 48 received Title I funding. The original student sample included 4,958 students (2,964 treatment and 2,264 control).

Intervention: ASSISTments is a web-based platform that provides support to students as they solve mathematics problems pulled from district-assigned textbooks and teacher-created worksheets and that provides detailed student-level and class-level formative assessment data to teachers to help inform adjustments in classroom instruction and pacing. ASSISTments also provides teachers with access to pre-organized problem sets, called Skill Builders, which teachers can assign to students. Skill Builders provide students with additional practice, hints, and guidance as they work through problems and immediate feedback to answers.

Research Design and Methods: The study used a school-level, delayed-treatment, cluster randomized experimental design. Schools were randomly assigned to either the treatment or control condition, and all grade 7 teachers in a school participated in the same condition. Schools were asked to participate for 2 school years (2018-19 and 2019-20). Data on implementation fidelity and contrast between conditions were collected in both years. In the treatment schools, grade 7 teachers and students used ASSISTments to support their homework assignment and review practices. Teachers assigned homework online and received diagnostic reports to facilitate their review of homework and adapt their instruction accordingly. Students completed their homework on computers and received (a) immediate feedback on their answers to textbook problems, (b) hint messages to help them solve difficult problems, (c) Skill Builders that track and adjust to their mastery of skills, and (d) automatic reassessment to help them retain previously mastered skills. Teachers received professional development and ongoing coaching to support their use of the reports as a formative assessment tool; a local coach conducted on average three classroom visits per school year.

Control Condition: In the control condition, teachers continued with their existing instructional practices and technologies (other than ASSISTments). Teachers who did not have a formal homework policy were recommended to assign a minimum of 20 minutes of homework per night for at least 3 nights a week.

Key Measures: The North Carolina End-of-Grade (EoG) assessment was originally the primary measure of student math achievement. Because the EoG test was canceled in spring 2020 due to COVID-19, the team administered an online grade 8 math readiness test as a supplemental measure. Teacher logs were used to measure teacher homework review practices. Other measures included eligibility for free or reduced-price lunch as a measure of socioeconomic status (SES), school size, rural or non-rural school status, individualized education plan status, and English proficiency status.

Data Analytic Strategy: The team used a three-level hierarchical linear model (HLM) to examine mean differences in grade 8 math readiness test scores between students in the two conditions, controlling for prior achievement and other covariates. Researchers also conducted moderator analyses, examining whether ASSISTments had a differential impact on students with different characteristics; the characteristics investigated included gender, grade 6 math achievement, disability status, and economically disadvantaged status. Two-level HLM analyses were conducted to examine the effect of the program on teacher practice in performing targeted homework review. Finally, researchers conducted exploratory analyses examining the link between student learning outcomes and student and teacher use of ASSISTments, using a three-level HLM with student, classroom/teacher, and school levels.
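The primary three-level impact model described above can be sketched in conventional HLM notation. This is a schematic only: the covariate name "Prior" is illustrative, and the research team's actual specification likely includes additional covariates at each level.

```latex
% Level 1 (student i in classroom j in school k)
Y_{ijk} = \pi_{0jk} + \pi_{1jk}\,\text{Prior}_{ijk} + e_{ijk}

% Level 2 (classroom/teacher)
\pi_{0jk} = \beta_{00k} + r_{0jk}

% Level 3 (school; treatment is assigned at this level)
\beta_{00k} = \gamma_{000} + \gamma_{001}\,\text{Treatment}_{k} + u_{00k}
```

Here \(\gamma_{001}\) is the school-level treatment effect of interest, and \(e_{ijk}\), \(r_{0jk}\), and \(u_{00k}\) are the student-, classroom-, and school-level residuals. A moderator analysis of the kind described would add a student characteristic at Level 1 and its cross-level interaction with Treatment.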

Related IES Projects: Using Web-based Cognitive Assessment Systems for Predicting Student Performance on State Exams (R305K030140), Making Longitudinal Web-Based Assessments Give Cognitively Diagnostic Reports to Teachers, Parents, and Students While Employing Mastery Learning (R305A070440), An Efficacy Study of Online Mathematics Homework Support: An Evaluation of the ASSISTments Formative Assessment and Tutoring Platform (R305A120125), Evaluating the Effectiveness of ASSISTments for Improving Math Achievement (R305A170243)


ERIC Citations: Find available citations in ERIC for this award.

Publicly Available Data: Study data have been deposited in Open ICPSR, including participant research IDs, randomization condition, ASSISTments usage data, teacher log/survey data, and student MDTP assessment data. Data-sharing agreements between the researchers and the state did not allow state administrative data, including school enrollment, student demographics, and student state test data, to be shared with third parties.

Additional Online Resources and Information: WPI and The ASSISTments Foundation (TAF) hosted webinars for teachers and created a library of recorded webinars. The project produced blogs, and new intervention materials (e.g., ASSISTments Advantage) were posted on the developer's website.

Select Publications:


Baral, S., Botelho, A., Erickson, J.A., Benachamardi, P., & Heffernan, N. (2021). Improving Automated Scoring of Student Open Responses in Mathematics. In Hsiao, Sahebi, Bouchet & Vie (eds). Proceedings of the 14th International Conference on Educational Data Mining (EDM2021). pp 130-138.

Gillespie, J., Winn, K., Faber, M., & Hunt, J. (2022). Implementation of a Mathematics Formative Assessment Online Tool Before and During Remote Learning. In Proceedings of the 23rd International Conference on Artificial Intelligence in Education (AIED). pp 168–173. Durham, UK. July 2022.

Patikorn, T. & Heffernan, N. T. (2020, August 12) Effectiveness of Crowd-Sourcing On-Demand Tutoring from Teachers in Online Learning Platforms. Proceedings of the Seventh ACM Conference on Learning @ Scale (L@S). pp 115–124.

Prihar, E., Haim, A., Sales, A., & Heffernan, N. (2022). Automatic Interpretable Personalized Learning. Proceedings of the Ninth ACM Conference on Learning @ Scale (L@S ’22), June 1–3, 2022, New York City, NY, USA.

Prihar, E., Patikorn, T., Botelho, A., Sales, A., & Heffernan, N. (2021). Toward Personalizing Students' Education with Crowdsourced Tutoring. Learning@Scale 2021. pp 37–45

Prihar, E., Syed, M., Ostrow, K., Shaw, S., Sales, A., & Heffernan, N. (2022). Exploring Common Trends in Online Educational Experiments. In Proceedings of the 15th International Educational Data Mining Conference, Durham, England.

Shen, J.T., Yamashita, M., Prihar, E., Heffernan, N., Wu, X., McGrew, S., & Lee, D. (2021). Classifying Math Knowledge Components via Task-Adaptive Pre-Trained BERT. 22nd International Conference on Artificial Intelligence in Education. pp 408–419.

Zhang, M., Baral, S., Heffernan, N. & Lan, A. (2022). Automatic Short Math Answer Grading via In-context Meta-learning. In Proceedings of the 15th International Conference on Educational Data Mining (EDM2022).