Products and publications
ERIC Citations: Available citations for this award are indexed in ERIC.
Publicly Available Data: Publicly available data can be found on the ETS GitHub site.
Select Publications:
Book Chapters
Burstein, J., McCaffrey, D. F., Holtzman, S., & Beigman Klebanov, B. (2023). Making sense of college students’ writing achievement and retention with automated writing evaluation. In V. Yaneva & M. von Davier (Eds.), Advancing Natural Language Processing in Educational Assessment (pp. 217-234). Taylor & Francis.
Burstein, J., Riordan, B., & McCaffrey, D. (2020). Expanding automated writing evaluation. In D. Yan, A. A. Rupp, & P. W. Foltz (Eds.), Handbook of Automated Scoring: Theory into Practice (pp. 329-346). CRC Press.
Journal Articles
Hazelton, L., Nastal, J., Elliot, N., Burstein, J., & McCaffrey, D. (2021). Formative automated writing evaluation: A standpoint theory of action. Journal of Response to Writing, 7(1), 37-91.
Ling, G., Elliot, N., Burstein, J. C., McCaffrey, D. F., MacArthur, C. A., & Holtzman, S. (2021). Writing motivation: A validation study of self-judgment and performance. Assessing Writing, 48, 100509.
McCaffrey, D. F., Zhang, M., & Burstein, J. (2022). Across performance contexts: Using automated writing evaluation to explore student writing. The Journal of Writing Analytics, 6, 167-199. DOI: 10.37514/JWA-J.2022.6.1.07
Oddis, K., Burstein, J., Holtzman, S., & McCaffrey, D. F. (2022). A framework for analyzing features of writing curriculum in studies of student writing achievement. The Journal of Writing Analytics, 6, 95-144. DOI: 10.37514/JWA-J.2022.6.1.05
Proceedings
Burstein, J., McCaffrey, D., Elliot, N., Beigman Klebanov, B., Molloy, H., Houghton, P., & Mladineo, Z. (2020). Exploring writing achievement and genre in postsecondary writing. In Companion Proceedings of the 10th International Conference on Learning Analytics & Knowledge (LAK20) (pp. 53-55).
Burstein, J., McCaffrey, D., Beigman Klebanov, B., & Ling, G. (2017). Exploring relationships between writing and broader outcomes with automated writing evaluation. In Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications (pp. 101-108).
Burstein, J., McCaffrey, D., Beigman Klebanov, B., Ling, G., & Holtzman, S. (2019). Exploring writing analytics and postsecondary success indicators. In Companion Proceedings of the 9th International Conference on Learning Analytics & Knowledge (LAK19) (pp. 213-214).
McCaffrey, D., Holtzman, S., Burstein, J., & Beigman Klebanov, B. (2021). What can we learn about college retention from student writing? In Companion Proceedings of the 11th International Conference on Learning Analytics & Knowledge (LAK21).
Supplemental information
Co-Principal Investigator: McCaffrey, Daniel
- Different aspects of the writing construct are captured in different contexts (assessment vs. coursework writing), which informs how student writing ability should be evaluated (McCaffrey et al., 2022).
- Coursework writing in different genres gives students opportunities to practice different aspects of the writing construct; limiting those opportunities hinders students' development of writing domain knowledge (Burstein et al., 2020).
- Writing domain knowledge predicts general knowledge (critical thinking outcomes) and academic outcomes (assessment scores, college grade point average), and correlates with success predictors such as SAT/ACT scores (Burstein et al., 2019).
- Intrapersonal factors (such as attitudes) are associated with writing ability and outcomes, suggesting that writing curricula should take intrapersonal factors into account (Ling et al., 2021).
- Survival analyses revealed relationships between vocabulary usage and student retention: greater use of personally reflective language in coursework and assessment writing increased the likelihood of dropout, while more sophisticated and varied use of vocabulary reduced it (Burstein et al., 2023; McCaffrey et al., 2021). A sketch of this kind of survival model follows this list.
- Coursework writing was shown to be a potential source for assessing malleable factors with automated writing evaluation (AWE) algorithms, although the data may be sensitive to the nature of the course assignments. The project team developed a rubric for evaluating assignment quality and pilot tested it on the limited assignment materials collected; assignment scores were generally low. The rubric could support future studies of coursework writing (Oddis et al., 2022).
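As a concrete illustration of the survival analyses mentioned above, the sketch below fits a Cox proportional hazards model of dropout on two AWE-style features using the Python lifelines library. The feature names, data, and model specification are hypothetical stand-ins, not the project's actual variables or results.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Toy student-level data, one row per student. Values are illustrative only.
df = pd.DataFrame({
    "semesters_enrolled":   [2, 6, 8, 3, 8, 5, 8, 4],    # follow-up time
    "dropped_out":          [1, 0, 0, 1, 0, 1, 0, 1],    # 1 = dropout observed
    "reflective_language":  [0.9, 0.2, 0.1, 0.7, 0.3, 0.8, 0.6, 0.3],  # hypothetical AWE feature
    "vocab_sophistication": [0.3, 0.8, 0.9, 0.4, 0.7, 0.6, 0.2, 0.5],  # hypothetical AWE feature
})

# Fit a Cox proportional hazards model: a positive coefficient means the
# feature is associated with a higher hazard (likelihood) of dropping out.
cph = CoxPHFitter()
cph.fit(df, duration_col="semesters_enrolled", event_col="dropped_out")
cph.print_summary()
```

In this setup, a positive coefficient on a reflective-language feature and a negative one on a vocabulary-sophistication feature would point in the same direction as the reported findings.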
The project's automated writing evaluation measured the following writing achievement constructs (a toy feature-extraction sketch follows the list):
- Conventions (grammar, spelling, citations, etc.)
- Coherence (topic development, topical cohesion, etc.)
- Organization (thesis sentences; use of rhetorical and discourse markers such as "therefore" and "however")
- Source use and integration (appropriate use of citations, etc.)
- Topicality (relevant vocabulary, etc.)
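The following is a minimal sketch of what feature extraction for a few of these constructs could look like. It is a toy illustration only: the project used ETS's AWE technology, and the proxies below (type-token ratio, mean word length, discourse-marker rate) are simplistic assumptions, not the project's actual features.

```python
import re

# Hypothetical marker list; real AWE systems use far richer inventories.
DISCOURSE_MARKERS = {"therefore", "however", "moreover", "furthermore", "consequently"}

def toy_awe_features(text: str) -> dict:
    """Compute crude proxies for a few writing constructs."""
    words = re.findall(r"[a-z']+", text.lower())
    n = len(words)
    if n == 0:
        return {"type_token_ratio": 0.0, "mean_word_length": 0.0, "discourse_marker_rate": 0.0}
    return {
        # Topicality / vocabulary variety proxy: type-token ratio.
        "type_token_ratio": len(set(words)) / n,
        # Vocabulary sophistication proxy: mean word length.
        "mean_word_length": sum(len(w) for w in words) / n,
        # Organization proxy: rate of explicit discourse markers.
        "discourse_marker_rate": sum(w in DISCOURSE_MARKERS for w in words) / n,
    }

print(toy_awe_features("The evidence is mixed; however, the trend is therefore clear."))
```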
Students' skill in these constructs may vary with their background characteristics: some students may have deficits in some, but not all, constructs, while others are uniformly weak or strong. In terms of domain-general knowledge, the researchers explored general critical thinking skills that may correlate with writing ability. In terms of intrapersonal factors, they explored whether a student's engagement, goal setting, interest, motivation, and self-efficacy in writing correlated with writing ability (a minimal correlational sketch follows). They did not explore relationships between the writing achievement constructs and student demographic backgrounds.
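A minimal sketch of such a correlational analysis, assuming hypothetical survey and essay-score data (the variable names and values are illustrative, not the project's data):

```python
import numpy as np
from scipy import stats

# Hypothetical per-student data: writing self-efficacy survey scores (1-5 scale)
# and essay scores. Illustrative values only.
self_efficacy = np.array([3.2, 4.1, 2.8, 3.9, 4.5, 3.0, 4.2, 3.6])
writing_score = np.array([68, 81, 60, 75, 88, 65, 79, 72])

# Pearson correlation between the intrapersonal factor and writing performance.
r, p = stats.pearsonr(self_efficacy, writing_score)
print(f"r = {r:.2f}, p = {p:.3f}")
```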
Questions about this project?
For additional questions about this project or to provide feedback, please contact the program officer.