
Effect of Linguistic Modification of Math Assessment Items on English Language Learner Students

Analysis plan

Five sets of analyses are planned: item-level descriptive statistics, analysis of variance, differential item functioning, factor structure of the item sets, and test correlations.

Item-level descriptive analyses. Item-level statistics—including frequency distributions for item choices, p-values, standard deviations, point biserial correlations, and omission rates—are generated from pilot and operational item sets.
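These item-level statistics can be sketched in a few lines of Python. The response matrix below is hypothetical (it is not study data), with 1 = correct, 0 = incorrect, and None = omitted; the point-biserial correlation is computed against the rest-of-test score.

```python
from statistics import mean, pstdev

# Illustrative response matrix: rows = students, columns = items.
# 1 = correct, 0 = incorrect, None = omitted. Data are hypothetical.
responses = [
    [1, 1, 0, None],
    [1, 0, 1, 0],
    [0, 1, 1, 1],
    [1, 1, 1, 0],
    [0, 0, None, 1],
]

n_items = len(responses[0])

def item_stats(item_index):
    """p-value (proportion correct), omission rate, and point-biserial
    correlation of the item with the rest-of-test score."""
    scored = [(row[item_index], row) for row in responses]
    omit_rate = sum(1 for s, _ in scored if s is None) / len(scored)
    answered = [(s, row) for s, row in scored if s is not None]
    p_value = mean(s for s, _ in answered)
    # Rest score: total over the other items, treating omissions as 0.
    rests = [sum(x or 0 for j, x in enumerate(row) if j != item_index)
             for _, row in answered]
    items = [s for s, _ in answered]
    sx, sy = pstdev(items), pstdev(rests)
    if sx == 0 or sy == 0:
        return p_value, omit_rate, 0.0
    cov = mean(x * y for x, y in zip(items, rests)) - mean(items) * mean(rests)
    return p_value, omit_rate, cov / (sx * sy)

for i in range(n_items):
    p, omit, rpb = item_stats(i)
    print(f"item {i}: p={p:.2f} omit={omit:.2f} r_pb={rpb:.2f}")
```

Frequency distributions for item choices would require the raw option selected rather than a scored 0/1 response, but follow the same per-column tabulation.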

Analysis of variance. Scores on the item sets, disaggregated by group (English language learner or non–English language learner students) and by item set (linguistically modified or unmodified), provide information on how each group performed on each item set. A three-factor analysis of variance tests mean differences in scores for the two student groups, to examine whether English language learner students can better demonstrate their mathematics ability on the linguistically modified item set. The three factors are item set (linguistically modified or unmodified), student population (English language learner or non–English language learner students), and grade (7 or 8). This analysis should help answer the primary research question.

If linguistic modification provides English language learner students greater access to the mathematics content, then the score difference between the linguistically modified and unmodified item sets should be greater for the English language learner population than for the non–English language learner population. The interaction between student population and item set is of particular interest in this analysis because it addresses this hypothesis.
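The full three-factor model would ordinarily be fit in a statistical package; as a minimal sketch, the population × item-set interaction just described can be expressed as a difference of gain scores over cell means. All scores below are hypothetical, not study data.

```python
from statistics import mean

# Hypothetical item-set scores by (population, item set).
scores = {
    ("ELL", "modified"):       [14, 15, 13, 16],
    ("ELL", "unmodified"):     [11, 12, 10, 13],
    ("non-ELL", "modified"):   [17, 18, 16, 17],
    ("non-ELL", "unmodified"): [16, 17, 16, 17],
}

cell_means = {k: mean(v) for k, v in scores.items()}

# Interaction contrast: does the modified-minus-unmodified gain differ
# between ELL and non-ELL students? A positive value is consistent with
# the hypothesis that modification helps ELL students more.
ell_gain = cell_means[("ELL", "modified")] - cell_means[("ELL", "unmodified")]
non_ell_gain = (cell_means[("non-ELL", "modified")]
                - cell_means[("non-ELL", "unmodified")])
interaction = ell_gain - non_ell_gain
print(f"ELL gain={ell_gain:.2f}, non-ELL gain={non_ell_gain:.2f}, "
      f"interaction={interaction:.2f}")
```

The ANOVA additionally supplies a significance test for this contrast, which the raw difference of cell means does not.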

Analysis of variance is also used to examine whether there are score differences between non–English language learner students at low reading levels (below proficient in English language arts) and non–English language learner students at high reading levels (proficient and above in English language arts). The expectation is that if linguistic modification reduces the language burden, the score difference between item sets will be greater for the low-level readers than for the high-level readers. This analysis should help answer research question 2. A performance difference between item sets that does not vary by reading group among non–English language learner students may instead indicate that the modification has increased student access to the mathematics content itself, not just to the language.

Differential item functioning analysis. An analysis of differential item functioning addresses whether the chance of a student answering an unmodified item correctly is greater in the non–English language learner population than in the English language learner population, even after controlling for total item set score. In general, exhibiting differential item functioning may indicate the multidimensionality of the item. That is, there could be another construct—other than the target achievement construct assessed by the items in the analysis—associated with group membership that is contributing to performance on the item. Items showing differential item functioning are examined closely, along with information about the item obtained from the factor analyses, to explain the differential item functioning. Such analyses should help answer research question 3.
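One widely used index of differential item functioning that conditions on total score, as described above, is the Mantel-Haenszel common odds ratio (the report does not name the specific DIF method, so this is offered only as an illustration). The counts below are hypothetical.

```python
# Mantel-Haenszel common odds ratio for one item, a standard DIF index.
# Students are stratified by total item-set score; counts are hypothetical.
# Per stratum: (ref_correct, ref_incorrect, focal_correct, focal_incorrect),
# with non-ELL students as the reference group and ELL students as the
# focal group.
strata = [
    (30, 10, 12, 18),   # low total-score stratum
    (45,  5, 20, 10),   # middle stratum
    (50,  2, 28,  4),   # high stratum
]

num = den = 0.0
for a, b, c, d in strata:
    n = a + b + c + d
    num += a * d / n   # reference-correct x focal-incorrect
    den += b * c / n   # reference-incorrect x focal-correct
odds_ratio = num / den

# odds_ratio > 1 means the item favors the reference (non-ELL) group even
# after conditioning on total score; a value near 1 means little or no DIF.
print(f"Mantel-Haenszel odds ratio: {odds_ratio:.2f}")
```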

Factor structure of the item sets. For each operational item set, exploratory factor analyses estimate the number of constructs assessed by the item sets and their underlying measurement structure (correlations). The results from these analyses are the foundation for further nested confirmatory factor analyses. Testing for differences in measurement structure across student groups and item sets, these confirmatory factor analyses should reveal the effects of linguistic modification, and thus the degree of access to test content, on the dimensionality of the item set.

For each item set the researchers examine the correlation of item parcels with latent factors as well as the correlations between latent factors (defined through the exploratory factor and item content analyses) and English language learner and non–English language learner students. Researchers anticipate that the item loadings for the non–English language learner students on both item sets will be higher than for the English language learner students, the correlations between latent factors will be higher for the non–English language learner students than for English language learner students, and the gap between these student groups will narrow on the linguistically modified item set. These analyses are intended to help answer research question 4.
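As an illustration of the parcel-correlation comparison, the sketch below computes inter-parcel Pearson correlations separately by student group; the parcel scores are hypothetical and constructed so the non-ELL correlation is higher, matching the stated expectation.

```python
from statistics import mean, pstdev

def pearson(xs, ys):
    """Pearson correlation between two equal-length score lists."""
    mx, my = mean(xs), mean(ys)
    cov = mean((x - mx) * (y - my) for x, y in zip(xs, ys))
    return cov / (pstdev(xs) * pstdev(ys))

# Hypothetical parcel scores (sums over small bundles of items) for two
# parcels on the same item set, by student group.
parcels = {
    "non-ELL": ([8, 9, 7, 10, 9, 8], [7, 9, 6, 10, 8, 8]),
    "ELL":     ([5, 7, 4,  8, 6, 5], [5, 6, 5,  7, 4, 6]),
}

for group, (p1, p2) in parcels.items():
    print(f"{group}: inter-parcel r = {pearson(p1, p2):.2f}")
```

The confirmatory factor analyses themselves estimate loadings and factor correlations within a full measurement model, which requires structural equation modeling software rather than raw correlations.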

To explore the factors accounting for these differences, another construct-irrelevant latent factor is incorporated in the model. This latent factor, which may be labeled student verbal ability, may also affect students' performance on a math test, especially for English language learner students (Abedi, Leon, and Mirocha 2003). If this hypothesis holds, the correlations of item parcels with the linguistic latent factor should be higher for English language learner students than for non–English language learner students, regardless of item set. However, these differences should be less pronounced on the linguistically modified item set.

Test correlations. Analyses of data from school records or district databases will provide information on the relationship between performance on statewide tests of mathematics and on the two item sets (linguistically modified and unmodified). The researchers hypothesize that linguistic modification does not alter the mathematics construct assessed, as would be shown by a strong correlation between performance on the study item sets and performance on a standardized test of mathematics achievement.
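The hypothesized relationship reduces to a simple correlation between item-set scores and state test scores. A minimal sketch, with hypothetical paired scores for the same students:

```python
from statistics import mean, pstdev

# Hypothetical paired scores: study item-set score vs. statewide math
# scale score for the same students; values are illustrative only.
item_set =   [12,  15,  9,  18,  14,  11,  16,  13]
state_math = [310, 340, 295, 360, 330, 305, 350, 320]

mx, my = mean(item_set), mean(state_math)
cov = mean((x - mx) * (y - my) for x, y in zip(item_set, state_math))
r = cov / (pstdev(item_set) * pstdev(state_math))

# A high r would support the claim that linguistic modification leaves
# the mathematics construct intact.
print(f"r = {r:.2f}")
```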
