Project Activities
To build the assessment system, the research team (a) developed assessment content and items; (b) built the interface, the speech recognition system, and the Automated Scoring System; (c) developed a psychometric model that accurately estimates vocabulary item parameters and student vocabulary abilities; (d) carried out three validation studies; and (e) completed a cost analysis.
Structured Abstract
Setting
The assessment was tested in second- and third-grade classrooms in Texas in urban and semi-urban settings.
Sample
In total, 2,224 Hispanic (Latinx) English Learners from six Texas school districts participated in the study. The validation year sample (i.e., Year 5) included 272 students from 49 classrooms enrolled in either English-only or bilingual programs across two districts, one of which was a charter school system. Of these 272 students, 83 attended two English-only charter schools, and 189 attended eight public schools implementing a one-way bilingual program in which Spanish native speakers received mathematics and social studies instruction in English and reading, writing, and science instruction in Spanish.
In the MELVA-S assessment, the system asks students to define a target word and to use the word in a sentence that matches a picture in a prompt. Students respond orally and receive one score for their definition of the word and another for their use of the word in a sentence.
Research design and methods
Words included in the assessment were selected and categorized following specific criteria and taking into account the Common Core State Standards, the Texas Essential Knowledge and Skills standards, and the Next Generation Science Standards. To assemble equivalent forms, the team conducted an equating study in Year 3, during which participating students were randomly assigned to complete four forms at each of three time points, for a total of 12 forms.
Key measures
Alternate forms of the MELVA-S assessment, the IOWA Science Assessment, the easyCBM vocabulary subtest, and the Texas English Language Proficiency Assessment.
Data analytic strategy
To develop the Automated Scoring System, transcribed student responses were coded and analyzed using regression techniques, rule-based classification, and supervised classification. To create the equated alternate forms, methodologists examined the difficulty of the science words based on student responses. Once word difficulty was determined, 15 alternate forms of MELVA-S were created with similar levels of difficulty. Each form included six unique items and six items shared with two other forms, creating a fully linked network. All items were estimated simultaneously under a unidimensional Partial Credit Model. To gather validity evidence (namely, construct, criterion, and predictive validity), the methodology team used Pearson product-moment correlations, regressions, growth models, and growth mixture models.
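The linking design described above (each form containing six unique items plus six items shared with two other forms, so that all forms connect into one network and can be calibrated on a common scale) can be sketched as follows. The 3/3 split of the six shared items between two neighboring forms in a ring layout is an assumption for illustration, not the project's documented design:

```python
from itertools import count

N_FORMS = 15
UNIQUE_PER_FORM = 6
SHARED_PER_LINK = 3  # assumption: the 6 shared items split 3/3 across two neighbors

item_id = count()

# Anchor blocks: block b is shared between form b and form (b + 1) % N_FORMS,
# so each form shares items with exactly two other forms (a ring design).
anchor_blocks = [[next(item_id) for _ in range(SHARED_PER_LINK)]
                 for _ in range(N_FORMS)]

forms = []
for f in range(N_FORMS):
    unique = [next(item_id) for _ in range(UNIQUE_PER_FORM)]
    shared = anchor_blocks[f] + anchor_blocks[(f - 1) % N_FORMS]
    forms.append(unique + shared)

# Verify the linkage network is fully connected: starting from form 0 and
# following shared items, every form should be reachable.
reached, frontier = {0}, [0]
while frontier:
    f = frontier.pop()
    for g in range(N_FORMS):
        if g not in reached and set(forms[f]) & set(forms[g]):
            reached.add(g)
            frontier.append(g)

assert all(len(form) == 12 for form in forms)  # 6 unique + 6 shared per form
assert len(reached) == N_FORMS  # fully linked: one simultaneous calibration works
```

Because every form is connected to the rest through shared anchor items, all item parameters can be placed on a single scale in one simultaneous Partial Credit Model run, as the paragraph above describes.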
Cost analysis strategy
We completed a cost analysis that considered all potential expenditures for MELVA-S administration and scoring, including personnel, facilities, equipment, materials, and training. This work demonstrated that MELVA-S is a cost-efficient measure for tracking students' vocabulary growth.
Key outcomes
The main findings of this project, as reported by the principal investigator in peer-reviewed publications, are as follows:
- MELVA-S has a moderate, statistically significant relation with the IOWA Science Assessment, providing teachers with timely information on students' science knowledge.
- MELVA-S is sensitive enough to capture differences in language development by grade and proficiency level.
- Results from the MELVA-S assessments can be used to monitor student progress, help teachers differentiate language and vocabulary instruction around science topics, and provide additional science vocabulary supports within a Response to Intervention approach.
- Student responses were scored automatically by an algorithm designed and trained by the project team, which achieved 80% agreement with human scoring.
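A minimal sketch of how an agreement figure like the one above can be computed, assuming "accuracy to human scoring" means the exact-agreement rate between automated and human ratings; the scores and the 0-2 partial-credit scale below are hypothetical, not the project's data:

```python
# Hypothetical human and automated scores for ten responses on a 0-2 scale.
human     = [2, 1, 0, 2, 1, 1, 0, 2, 2, 1]
automated = [2, 1, 0, 1, 1, 1, 0, 2, 2, 0]

# Exact agreement: the fraction of responses where both raters gave the
# same score.
agreement = sum(h == a for h, a in zip(human, automated)) / len(human)
print(f"exact agreement: {agreement:.0%}")  # 8 of 10 match -> prints 80%
```

In practice, evaluations of automated scorers often supplement exact agreement with a chance-corrected statistic such as Cohen's kappa, since agreement alone can be inflated when one score category dominates.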
People and institutions involved
IES program contact(s)
Project contributors
Products and publications
This project produced an online formative assessment system, Measuring the English Language Vocabulary Acquisition of Latinx Bilingual Students (MELVA-S). The team produced 24 equivalent alternate forms of MELVA-S that teachers can use to assess their students' initial status and growth in vocabulary knowledge, along with a preliminary report teachers can use to differentiate instruction and provide additional vocabulary and language development support around science topics. The researchers also produced peer-reviewed publications and will disseminate their findings via conference presentations.
Project website:
Publications:
ERIC Citations: Available citations for this award can be found in ERIC.
Arizmendi, G. D., Palma, J., & Baker, D. (2025). Predicting science and social studies vocabulary learning in Spanish English bilingual children. Language, Speech, and Hearing Services in Schools. https://doi.org/10.1044/2025_LSHSS-24-00045
Baker, D. L., Moradibavi, S., Liu, Y., Huang, Y., & Sha, H. (2025). Effects of Interventions on Science Vocabulary and Content Knowledge: A Meta-analysis. Research in Science Education, 1-19.
Kowalkowski, H. P., Palma, J., Herrera, C., Baker, D. L., Wu, Z., & Larson, E. (2025). Advancing Formative Assessment: Using Natural Language Processing Within a Sociocultural Context to Measure Multilingual Student Science Word Knowledge. Education Sciences, 15(12), 1668.
Kowalkowski, H., Sha, H. S., Moradibavi, S. S., & Baker, D. L. (2025). Navigating the Science Education Landscape: Teacher Beliefs About Supporting Multilingual Students. Journal of Multilingual and Multicultural Development, 1-19.
Wu, Z., Larson, E., Sano, M., Baker, D., Gage, N., & Kamata, A. (2023, July). Towards Scalable Vocabulary Acquisition Assessment with BERT. In Proceedings of the Tenth ACM Conference on Learning @ Scale (pp. 272-276).
Additional project information
Additional online resources and information:
Baker, D. L., Kowalkowski, H. P., Telleache, A., Moradibavi, S., Sha, H., de Souza, S., Russell, S., & Conde-Holman, G. (2025). The MELVA-S science dictionary. The Meadows Center for Preventing Educational Risk. Retrieved from https://meadowscenter.org/resource/melva-s-science-dictionary/
de Souza, S., Kowalkowski, H. P., & Baker, D. L. (2025). MELVA-S Frayer model guide for science. The Meadows Center for Preventing Educational Risk. Retrieved from https://meadowscenter.org/resource/frayer-model-resources/
Previous award details:
Related projects
Questions about this project?
To answer additional questions about this project or provide feedback, please contact the program officer.