
Cognition and Student Learning


Bridging the Gap: Applying Algebra Cognition Research to Develop and Validate Diagnostic Classroom Algebra Testlet

Year: 2004
Name of Institution:
Boston College
Goal: Measurement
Principal Investigator:
Russell, Michael
Award Amount: $1,042,561
Award Period: 3 years
Award Number: R305H040099


Co-Principal Investigator(s): Lucariello, Joan

Purpose: In this project, the researchers proposed to develop a set of short computer-based algebra tests, called testlets, that provide teachers with information they can use to guide instruction for individual students and for groups of students who share similar difficulties with algebraic concepts. The researchers proposed that this tool would make classroom assessment a more instructionally useful component of teaching and learning. During the early 2000s, mathematics achievement tests indicated whether students solved a problem correctly but rarely revealed how students solved problems or what misconceptions might have led them to choose a wrong answer. Consequently, the tests provided little guidance to teachers about how to focus their teaching to help students overcome their misunderstandings. The proposed testlets were to provide information about students' understanding of, or misconceptions about, specific algebraic concepts, enabling teachers to use that information to guide their instruction of those students.

Structured Abstract


In the first phase of this project, the researchers are developing a series of short testlets designed to identify students' abilities to solve particular types of algebra problems and to determine whether students hold specific algebraic misconceptions that previous research has identified as fairly common. Phase 2 studies will examine whether the testlets identify the misconceptions they are designed to identify. Multiple methods will be used in a series of studies involving eighth- and ninth-grade algebra students, including studies comparing testlet performance with teachers' predictions and with students' classroom work, and pretest-posttest studies examining the effects of targeted feedback or instruction on testlet performance. Among these will be a series of two-group studies (one for each targeted algebraic misconception) in which 20 students identified with the misconception based on testlet performance will be randomly assigned to one of two groups (Group A or Group B). Students in Group A will work with an individual tutor who will provide instruction over a short period of time; students in Group B will receive no instruction during this time. After the instructional intervention for Group A is complete, all students in Groups A and B will retake the testlet. Students in Group B will then work individually with a tutor, and all students will be tested a third time. Changes in students' testlet performance will then be compared between the two groups.

The final phase of the project will examine whether teachers can use the testlets to identify misconceptions held by their eighth- and ninth-grade students and respond appropriately. Researchers will instruct teachers in how to use the testlets, after which pretest-posttest gains in testlet performance will be compared across four groups of students: 1) those who do not demonstrate a specific misconception, 2) those who have the misconception and receive no targeted instruction, 3) those who have the misconception and receive inappropriately targeted instruction (i.e., instruction aimed at the wrong misconception), and 4) those who have the misconception and receive appropriately targeted instruction. The prediction is that students in the fourth group will show greater pretest-to-posttest improvement in testlet performance than students in the other three groups.

Related IES Projects: The Diagnostic Geometry Assessment Project (R305A080698), Diagnostic Inventories of Cognition in Education (DICE) (R305A170441)

Products and Publications

ERIC Citations: Available citations for this award can be found in ERIC.

Select Publications:

Journal articles

Higgins, J., Patterson, M.B., Bozman, M., and Katz, M. (2010). Examining the Feasibility and Effect of Transitioning GED Tests to Computer. Journal of Technology, Learning, and Assessment, 10(2): 1–33.

Lucariello, J., Tine, M.T., and Ganley, C.M. (2014). A Formative Assessment of Students' Algebraic Variable Misconceptions. The Journal of Mathematical Behavior, 33: 30–41.

Masters, J. (2010). Automated Scoring of an Interactive Geometry Item: A Proof-of-Concept. Journal of Technology, Learning, and Assessment, 8(7): 1–39.

Russell, M., Higgins, J., and Hoffmann, T. (2009). Meeting the Needs of all Students: A Universal Design Approach to Computer-Based Testing. Innovate: Journal of Online Education, 5(4): 1–6.

Russell, M., Hoffmann, T., and Higgins, J. (2009). NimbleTools: A Universally Designed Test Delivery System. Teaching Exceptional Children, 42(2): 6–12.

Russell, M., O'Dwyer, L.M., and Miranda, H. (2009). Diagnosing Students' Misconceptions in Algebra: Results From an Experimental Pilot Study. Behavior Research Methods, 41(2): 414–424.