Effectiveness of Reading and Mathematics Software Products

NCEE 2009-4041
February 2009

Collecting Achievement and Product Usage Data

The study’s analyses rely on student test scores drawn from two sources. The data collection strategy was to use district test scores where they were available and consistent with the study’s analytic approach, and to have the study administer its own tests where districts could not provide a fall or spring score (when districts could provide them, scores from the previous spring served in place of fall scores). In first, fourth, and sixth grades, if a district did not administer a standardized test with national norms at a grade level, the study administered a student test in the fall and spring of the 2005–2006 school year: the Stanford Achievement Test (version 9) reading battery for first graders, the Stanford Achievement Test (SAT-10) reading battery for fourth graders, and the SAT-10 math battery for sixth graders. For algebra I, which no district in the study tested, the study used the Educational Testing Service’s (ETS) End-of-Course Algebra Assessment (1997).

Among district tests, in first grade one district provided fall scores on the Iowa Tests of Basic Skills, and another provided spring scores on the Stanford Achievement Test. For fourth grade, one district provided fall scores on the Iowa Tests of Basic Skills. For sixth grade, one district provided fall scores on the Iowa Tests of Basic Skills and another provided fall and spring scores on the New Mexico Standards Based Assessment. For algebra I, one district provided fall scores on the Iowa Tests of Basic Skills. With the exception of scores on the ETS algebra test, scores were converted to normal curve equivalent (NCE) units to standardize the measures across tests and cohorts. Algebra I scores on the ETS test are reported as percent correct.
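NCE units rescale national percentile ranks onto an equal-interval scale (mean 50, standard deviation about 21.06), so that scores from different tests can be averaged and compared. The sketch below illustrates the standard percentile-to-NCE conversion; it assumes national percentile ranks are available for each score, and the study's actual conversion may have relied on publisher-supplied tables rather than this formula.

```python
from statistics import NormalDist

_STD_NORMAL = NormalDist()  # standard normal, mean 0, SD 1

def percentile_to_nce(percentile_rank: float) -> float:
    """Convert a national percentile rank (0 < PR < 100) to an NCE score.

    NCE = 50 + 21.06 * z, where z is the standard-normal quantile of the
    percentile rank. The constant 21.06 is chosen so that percentile ranks
    1, 50, and 99 map to NCEs of (approximately) 1, 50, and 99.
    """
    z = _STD_NORMAL.inv_cdf(percentile_rank / 100.0)
    return 50.0 + 21.06 * z

# The 50th percentile maps to an NCE of 50; the 1st and 99th percentiles
# map to NCEs of roughly 1 and 99.
print(round(percentile_to_nce(50), 1))
print(round(percentile_to_nce(99), 1))
```

Unlike percentile ranks themselves, NCEs are equal-interval, which is what makes averaging across students, tests, and cohorts defensible.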

Product records provided data on how much students used the products. Eight of the 10 products included in the study used databases to track when each student was logged on. The usage measure reported in the study is actual student logged-on time over a school year, as reported by the product database. Use by more than one student at a time, such as in a group activity, is counted only for the logged-on student. Time spent on activities that are related to product use but occur while students are not logged on, such as reading materials related to a computer lesson, is not counted as usage.
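The usage measure described above amounts to summing session durations per student across a school year. The following sketch is purely illustrative: the record fields (student ID, login time, logout time) are hypothetical stand-ins, since the actual product databases varied across the eight products.

```python
from collections import defaultdict
from datetime import datetime

def total_usage_hours(sessions):
    """Sum logged-on hours per student from (student_id, login, logout) records.

    Only logged-on time counts: a classmate sharing the screen during a
    group activity, or off-computer work tied to a lesson, contributes
    nothing because it generates no session record for that student.
    """
    totals = defaultdict(float)
    for student_id, login, logout in sessions:
        totals[student_id] += (logout - login).total_seconds() / 3600.0
    return dict(totals)

# Hypothetical session records from a product database.
sessions = [
    ("s1", datetime(2005, 10, 3, 9, 0), datetime(2005, 10, 3, 9, 30)),
    ("s1", datetime(2005, 10, 5, 9, 0), datetime(2005, 10, 5, 9, 45)),
    ("s2", datetime(2005, 10, 3, 9, 0), datetime(2005, 10, 3, 9, 30)),
]
print(total_usage_hours(sessions))  # s1 logged 1.25 hours; s2 logged 0.5
```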